How to know 'scene size' including textures? (For Iray)
vex3d_22560
Is there a way to know whether your scene is too big to fit on your video card? I'm looking to buy a machine for Iray rendering, and I've read that if your scene is too big for the card, Iray will drop the card and fall back to the CPU. So how can I tell how big my scene is?
Comments
I don't think there is a way, apart from trying it and waiting to see if the render blows past the card's VRAM total.
Don't forget, if you only have one graphics card, it must also drive your monitor display. This isn't always a static memory size, so there won't always be the same amount of VRAM available for your render. If you have two graphics cards, you can set up one to be dedicated for the monitor display, leaving the entire memory of the other card for rendering.
I'm hoping I can use the CPU's integrated graphics for the display, and a GTX 960 4GB (starting low, but it's 3x better than what I have now) for GPU rendering. It's quite a jump in price to get more than 4GB on a video card, and that's where I'm concerned. How can I know my limitations?
How much memory does a naked V7 take? At SubD 3? These are the details I'd like to know, so I can actually use what I'm spending all this money on, instead of still relying only on the CPU because the scene won't fit on the card.
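For a rough sense of how the geometry side scales: each SubD level roughly quadruples the quad count, so SubD 3 means about 64x the base faces. Below is a minimal sketch of that arithmetic; the base quad count and the bytes-per-triangle figure are assumed placeholders for illustration, not measured Daz Studio or Iray numbers.

```python
# Rough geometry-memory estimate for a subdivided figure.
# Assumptions (illustrative placeholders, not measured Daz/Iray figures):
#   - each Catmull-Clark SubD level roughly quadruples the quad count
#   - a renderer needs on the order of ~100 bytes per triangle once
#     vertex positions, normals, UVs and acceleration data are counted

def estimate_mesh_mb(base_quads, subd_level, bytes_per_triangle=100):
    quads = base_quads * (4 ** subd_level)   # 4x faces per SubD level
    triangles = quads * 2                    # each quad splits into two triangles
    return triangles * bytes_per_triangle / (1024 ** 2)

# Example: an assumed ~17,000-quad base mesh at SubD 3
print(f"~{estimate_mesh_mb(17_000, 3):.0f} MB for geometry alone")
```

Under those assumptions the geometry lands in the low hundreds of MB at SubD 3; as the replies below point out, it's usually the textures that fill the card first.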
I use a GeForce GTX 880M (in an i7 at 2.4GHz with 8GB RAM) and it has enough power to run an Iray render with multiple characters, an environment and so on, and I can use the internet or an office program while rendering... so a GTX 960 should have enough power.
If you can, get more than 4GB.
I have 4GB and it's not enough for a lot of what I would like to do. New characters have lots of textures, and new clothing and sets have lots of normal maps... that 4GB goes really fast.
This is my concern, because I tend to use every 'extra' that I can: specific SSS maps, normal maps, etc. There's a several-hundred-megabyte difference between the textures I use on my custom G3F and the default skin (Jeanne?).
Is there a way to access this data in the Daz scripting SDK?
Edit: the price difference between a 4GB and a 6GB card is about $400. I'm only spending $200 on a GTX 960 4GB, which is a significant price difference from $600 for a 980 Ti 6GB.
Edit 2: my current build is http://pcpartpicker.com/p/yxNPjX - I got most of this build as a suggestion from this forum, but I changed out the mobo and PSU to support future upgrades (an additional video card).
Attached is an extract from the log file after rendering a single G3F, with Toulouse hair, in the Dark Storm outfit (no cape) at 800x600.
Try the link to a Google Sheet in my signature (I'd like to know if it works).
You can plug the numbers in to get a rough idea of the memory requirement.
(Though if the GPU is doing other things it may still reject a scene that should theoretically fit.)
You don't need the log file to tell you the triangle count; just use Window > Panes > Scene Info. Unfortunately it won't tell you the texture info, though.
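If you want a rough texture estimate yourself, the usual rule of thumb is width x height x bytes-per-texel for each map, plus roughly a third extra for mipmaps, since image files are generally decompressed to raw texel data before rendering. Here's a minimal sketch of that estimate; the 4-bytes-per-texel and 1.33x mip figures are rule-of-thumb assumptions, and the file names are hypothetical.

```python
# Rough texture-memory estimate: sum width * height * bytes for each map.
# Assumptions: maps end up as uncompressed ~4-byte texels on the GPU,
# and mipmaps add roughly one third on top. Both are rules of thumb.
from PIL import Image  # pip install Pillow

def estimate_texture_mb(paths, bytes_per_texel=4, mip_factor=1.33):
    total = 0
    for path in paths:
        with Image.open(path) as img:   # reads the header, not the pixels
            w, h = img.size
        total += w * h * bytes_per_texel * mip_factor
    return total / (1024 ** 2)

# Hypothetical file names, for illustration only:
maps = ["G3F_diffuse.jpg", "G3F_normal.png", "G3F_sss.jpg"]
print(f"~{estimate_texture_mb(maps):.0f} MB of texture data")
```

As a sanity check on why "that 4GB goes really fast": a single 4096x4096 map at 4 bytes per texel is 64 MB before mipmaps, so a character carrying a dozen such maps adds up quickly.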
What's fun is seeing what I can shove into GPU memory on a 2GB card... :) (It's integrated in a system with no expansion ports - an Alienware Alpha. Great for games; I can get away with ultra settings in a lot of them. Not so much for Iray, but I've had some success in getting GPU rendering to happen. At least I'd hope my monitor display doesn't require all of the VRAM and GPU processing power...) I may not be able to use G3, but I don't have much for her anyway. And amazingly enough, if you're not that close up on a figure, a lot of the improvements aren't really noticeable.
That is exactly the kind of information I was looking for, prixat.
Thank you!!!
Did anyone try the spreadsheet link?
GPU-Z will show how much memory the card is using once it's rendering - though that's only accurate if the card is used specifically for rendering.
You could turn down settings, turn off Windows features that take video RAM, or even lower the screen resolution; this would help to some extent.
I did! I can't open my log right now to test it but I see what I'm supposed to do. This is exactly what I wanted. If I can figure out how to script a panel in Daz that shows this info easily, I will be the happiest.
Found this in my log:
Iray INFO - module:category(IRAY:RENDER): 1.2 IRAY rend info : CUDA device 0 (GeForce GTX 960): Scene processed in 21.163s
Iray INFO - module:category(IRAY:RENDER): 1.2 IRAY rend info : CUDA device 0 (GeForce GTX 960): Allocated 53 MB for frame buffer
Iray INFO - module:category(IRAY:RENDER): 1.2 IRAY rend info : CUDA device 0 (GeForce GTX 960): Allocated 804 MB of work space (1024k active samples in 0.058s)
Iray INFO - module:category(IRAY:RENDER): 1.2 IRAY rend info : CUDA device 0 (GeForce GTX 960): Used for display, optimizing for interactive usage (performance could be sacrificed)
I have 2 questions:
1. Does this mean my scene is using 804 + 53 MB of video RAM? (See the sketch below for totalling those 'Allocated' lines.)
2. Is there a way to run my display off my CPU's integrated graphics so my Nvidia GTX 960 is 100% available for Iray?
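On totalling what the log reports: those 'Allocated ... MB' lines can be summed with a few lines of script. A quick sketch, assuming the log lines follow the format quoted above; the log path is a placeholder you point at your own Daz Studio log.

```python
# Sum the "CUDA device N (...): Allocated NNN MB ..." lines that Iray
# writes to the Daz Studio log. "log.txt" is a placeholder path.
import re

def allocated_mb(log_path):
    pattern = re.compile(r"CUDA device \d+ \(.*?\): Allocated (\d+) MB")
    with open(log_path, encoding="utf-8", errors="ignore") as f:
        return sum(int(m.group(1)) for line in f if (m := pattern.search(line)))

print(allocated_mb("log.txt"), "MB allocated on the GPU in this log")
```

For the lines quoted above this reports 857 MB (804 + 53); whether that is the full scene footprint or just the frame buffer and work space is exactly what question 1 is asking, and the log excerpt alone doesn't say.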
In the Nvidia Control Panel you can set which applications use the integrated GPU and which use the Nvidia card.
I tried using the link, Prix, but I don't have edit access.
I'm in a similar boat. I hadn't discovered Daz before I bought my R9 390, and now I'm kicking myself! So I'm on the fence between building a CPU-only, Daz-specific machine (Core i7 4790), working on a scene, then hitting the KVM switch to jump back to my "normal" computer while that scene renders for a couple of hours, or buying a really nice Nvidia card (like a 6GB 980 Ti). Price, amusingly enough, works out to be about the same. I should mention, too, that if I bought the new card I could install the R9 390 in the family computer, then maybe install both the GTX 770 (2GB) card from that computer and the new card in my box. I've never tried installing multiple GPUs before, but I think as long as I'm not trying to run them in SLI I should be OK using one for rendering (the 6GB card) while I do other stuff with the lesser card.
I figure the second-computer option is safer - I already know how an i7 performs rendering on pure CPU - but I'd rather just be able to render faster on my main machine. However, I do some decently large renders, and have grand plans for some larger ones, so I don't want to end up with a 6GB brick for rendering purposes. Any thoughts?
It's 2023. Have the Daz Studio developers figured out how to show the total texture size of a scene to the user?