Why can't I select my GPU under render settings?

I just watched this video.

and tried to do as he suggested, but when I go to the Advanced tab under Render Settings to select my GPU, that option doesn't exist. I would think I should be able to. Here is a link to my video card and a screenshot of what I see. Anyone have any ideas?


http://www.amd.com/en-us/products/graphics/desktop/7000/7700#


daz may 11.jpg

Comments

  • barbult Posts: 23,436

    Iray GPU rendering works only with NVIDIA based graphics cards. Yours is AMD, so it is not supported.
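    As a quick illustration of that compatibility rule (this is just a sketch, not part of Daz Studio; the marker list is my own heuristic): Iray renders on NVIDIA/CUDA hardware only, so a simple vendor check on the GPU name tells you whether a card could ever appear in the Render Settings hardware list.

    ```python
    # Heuristic sketch: Iray uses CUDA, which is NVIDIA-only, so an AMD card
    # will never show up as a GPU option in Iray render settings.

    def is_iray_capable(gpu_name: str) -> bool:
        """Return True if the GPU name looks like an NVIDIA (CUDA) card."""
        nvidia_markers = ("nvidia", "geforce", "quadro", "titan", "gtx")
        name = gpu_name.lower()
        return any(marker in name for marker in nvidia_markers)

    print(is_iray_capable("AMD Radeon HD 7700"))       # False - won't appear
    print(is_iray_capable("NVIDIA GeForce GTX 1070"))  # True
    ```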

  • michaelolson Posts: 117

    Thanks barbult. :) Guess I will have to upgrade.

  • SpottedKitty Posts: 7,232

    Note that it needs to be a recent-ish NVidia card, with as many CUDA cores and as much VRAM as your budget can handle. There are a few threads here and there with suggestions and recommendations. A word of warning, the really good ones are expen$$$ive, and you will need to be careful the new card doesn't overload your power supply.

  • nicstt Posts: 11,715

    The new 1070, released soon, will be the best budget option, unless 9 series cards become ridiculously cheap.

    The 1070 is claimed to have better-than-Titan X performance, with 8GB of GDDR5 RAM; the 1080 has GDDR5X RAM and better performance than the 1070. There are currently no reviews or other available figures for games, never mind for rendering, so I'd wait until they appear before making any decision. I expect they'll be decent, though.

  • JD_Mortal Posts: 758
    edited May 2016
    nicstt said:

    The new 1070, released soon, will be the best budget option, unless 9 series cards become ridiculously cheap.

    The 1070 is claimed to have better-than-Titan X performance, with 8GB of GDDR5 RAM; the 1080 has GDDR5X RAM and better performance than the 1070. There are currently no reviews or other available figures for games, never mind for rendering, so I'd wait until they appear before making any decision. I expect they'll be decent, though.

    Faster for games, not for actual 3D rendering. (Great value, though! Two 1080s would put 5,120 CUDA cores at your disposal, for scenes under 8GB, for the price of a single Titan X, whose price will surely fall once these cards are released.)

    Titan X CUDA cores: 3072 (12GB GDDR5 RAM)  ~$900-$1200

    GTX 1080 CUDA cores: 2560 (8GB GDDR5X RAM)  ~$600-$700 (1070s are effectively binned 1080s that failed the memory, core-count, or clock requirements: reportedly 1GB of slower RAM in the standard configuration, an unknown number of disabled CUDA cores, and unknown clock-speed instability. They're listed as GDDR5 rather than GDDR5X because the chips failed the {x} speeds; roughly 2048 CUDA cores, give or take.)

    http://videocardz.com/59897/nvidia-geforce-gtx-1080-gpu-z-specifications-leaked
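    A rough back-of-envelope comparison using the figures quoted above (the prices are the poster's 2016 estimates, not current numbers):

    ```python
    # Cost-per-CUDA-core comparison from the specs and prices in this post.
    cards = {
        "Titan X":  {"cores": 3072, "vram_gb": 12, "price_usd": 1200},
        "GTX 1080": {"cores": 2560, "vram_gb": 8,  "price_usd": 700},
    }

    for name, c in cards.items():
        per_core = c["price_usd"] / c["cores"]
        print(f"{name}: ${per_core:.2f} per CUDA core, {c['vram_gb']} GB VRAM")

    # Two 1080s: 5120 cores for roughly the price of one Titan X, but each
    # GPU is still limited to its own 8 GB of VRAM for scene data.
    print(2 * cards["GTX 1080"]["cores"])  # 5120
    ```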

    Faster = shortcuts via new DX features for games, not actual "performance" of the device. (E.g., marketing tricks, selling less for more, like diet soda. It's the CO2 that makes you fat, not the sugar.)

    Most of my scenes top out my 16GB of RAM, demanding system cache, so 1080s would be useless for most of my renders. Honestly, a Titan X only helps if I do layer rendering. Splitting things in half isn't a problem, but splitting them into four parts is a nightmare for 1080s.

    Render a scene and go to Task Manager to see how much memory you are using, including "swap"... (If swap is turned off, Daz just crashes if a scene is too large for RAM, which is a good indicator that you've hit the RAM limit. Until Iray is programmed to work in chunks, or Daz pre-optimizes scenes into layered rendering and smart LOD detection, we are doomed to small renders and expensive cards.)
