CUDA support for 3Delight?
westernnomad
Posts: 90
I finally got educated enough to ask this question, thanks to Blender tutorials.
Does Daz and 3Delight have CUDA support?
If not, are there plans to add it to future versions?
Comments
3Delight is a CPU renderer, it doesn't use the GPU (and so isn't limited by video card memory, which would be an issue with many DS scenes). As far as I know this is currently true of the stand-alone 3Delight as well as the version included in DS.
Most GPU render engines provide their users with internal tools and workflows to reduce the VRAM footprint of a complete scene.
With some effort it is possible to fit standard DAZ Studio scenes into the VRAM of most cards.
Some ideas of a basic GPU rendering workflow are:
- Do not load textures of any objects that are not visible in the viewport into VRAM.
- Reduce the size of the textures of items in the background.
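To get a feel for why reducing background textures helps, here is a rough back-of-the-envelope sketch (my own illustration, not part of any render engine's API) of how much VRAM an uncompressed texture occupies. The formula and the one-third mipmap overhead are standard approximations; the function name is hypothetical.

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Estimate the VRAM an uncompressed RGBA texture occupies.

    A full mipmap chain adds roughly one third on top of the base level.
    """
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# A typical 4096x4096 character texture weighs in at roughly 85 MB with mipmaps:
mb_full = texture_vram_bytes(4096, 4096) / (1024 * 1024)

# Halving the resolution to 2048x2048 cuts the VRAM cost to a quarter:
mb_half = texture_vram_bytes(2048, 2048) / (1024 * 1024)
```

Since a DS scene can easily carry dozens of 4K texture maps, downscaling the ones on background items is usually the quickest way to make a scene fit on the card.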
Update / Edit:
I put together more information in another forum post so as not to go further off-topic here:
http://www.daz3d.com/forums/discussion/45153/
- - -
@ CUDA and CPU rendering
Currently we live in a world in which both CPU and GPU rendering have their advantages and their drawbacks.
CPU rendering had a huge head start and now offers a lot of features we have come to rely on.
But as things look right now, we will probably not see any more significant speed increases in CPU rendering in the near future.
GPU rendering, on the other hand, is at a very early stage, but it is already a lot faster, even though many features available in CPU rendering have yet to be included.
How the switch from HD to UHD / 4K will make render speed an even more important factor
- In 2015 / 2016 many users will start to switch to Ultra HD / 4K technology. This means anyone creating images will need to produce at a resolution of at least 3840 × 2160.
- Most users do NOT have a render farm at home.
- Most users have a workstation that can easily fit two GPUs.
So basically the moment they start to produce at 4K, many users will conclude that the current CPU render engines are just too slow, because any speed differences that are already obvious at HD resolution will increase by another factor of 4.
This means that if an image at the current HD resolution of 1920x1080 took you one day to render, it will take you four days to render the same image at 3840x2160 in Ultra HD / 4K.
To simplify the calculation, rough estimates are used:
GPU rendering is about 8 times faster than CPU rendering.
Rendering a UHD image takes about 4 times longer than rendering an HD image.
Render time HD image at 1920x1080 with CPU = 24 hours = 1 Day
Render time HD image at 1920x1080 with GPU = 3 hours
Render time UHD / 4K image at 3840x2160 with CPU = 96 hours = 4 days
Render time UHD / 4K image at 3840x2160 with GPU = 12 hours
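The table above can be checked with a quick calculation. Note that the 8x GPU speedup and the 24-hour HD baseline are the rough estimates assumed in this post, not measurements; the 4x factor simply follows from the pixel counts.

```python
# Assumptions from the post: GPU is ~8x faster than CPU,
# and the HD image takes the CPU 24 hours to render.
GPU_SPEEDUP = 8
CPU_HD_HOURS = 24.0

# UHD has exactly 4x the pixels of HD, so ~4x the render time.
UHD_FACTOR = (3840 * 2160) / (1920 * 1080)

gpu_hd = CPU_HD_HOURS / GPU_SPEEDUP       # 3 hours
cpu_uhd = CPU_HD_HOURS * UHD_FACTOR       # 96 hours = 4 days
gpu_uhd = gpu_hd * UHD_FACTOR             # 12 hours
```
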
So yes, all CPU render engine developers should start to take a very close look at CUDA cores and how they could transform their CPU render engines into hybrid versions that also support Nvidia graphics cards. :exclaim:
Speculation:
My estimate is that in the near future
- CPU render engines will mostly be used by large studios that can afford a huge render farm
- Most CPU render engines that want to stay relevant will offer hybrid solutions that combine CPU and GPU in efficient ways
- GPU rendering will see a lot of improvements, especially if technology developed for game engines makes its way into render engines.