RTX vs GTX

I wonder what the difference is between a GTX 1660 Ti and an RTX 2060 when rendering in Iray?

Comments

  • Gordig Posts: 9,186

    RTX cards yield significant advantages in Iray (and possibly other rendering engines, but I can't speak as much to those) over GTX cards. Also, even without those benefits, a newer card with the same number of CUDA cores as an older card will typically be faster. 

  • Joe2018 Posts: 236

    My experience:

    GTX 1070 Ti 8 GB vs. RTX 2060 8 GB = the RTX needs half the render time

    RTX 2060 8 GB vs. RTX 3070 8 GB = the 3070 needs half the render time

    GTX 1070 Ti 8 GB vs. RTX 3070 8 GB = the 3070 needs a quarter of the render time or less - big scenes that take a few hours on the 1070 Ti need only 30-60 min. on the 3070.
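The ratios above compound: if the 2060 halves the 1070 Ti's time and the 3070 halves the 2060's, the 3070 should take roughly a quarter of the 1070 Ti's time. A quick arithmetic sketch (the starting render time is a made-up illustration, not a real benchmark):

```python
# Hypothetical render times illustrating how the reported per-step
# 2x speedups compound (illustrative numbers, not benchmark data).
gtx_1070ti_hours = 4.0                     # assumed time for a big scene
rtx_2060_hours = gtx_1070ti_hours / 2      # RTX 2060: half the 1070 Ti's time
rtx_3070_hours = rtx_2060_hours / 2        # RTX 3070: half the 2060's time

print(rtx_3070_hours)        # a quarter of the original time
print(rtx_3070_hours * 60)   # in minutes, consistent with the 30-60 min range
```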

  • In a non-RTX card some of the memory will be used for code emulating the RTX features, so it will probably be relatively slower (in addition to other differences) and certainly relatively limited in memory.

  • Thank you for your prompt answers. I'm already getting ready, with all my heart, to buy the RTX 2060 12 GB that's coming out soon.

  • RL_Media Posts: 339

    You mean 4060?

  • Richard Haseltine said:

    In a non-RTX card some of the memory will be used for code emulating the RTX features, so it will probably be relatively slower (in addition to other differences) and certainly relatively limited in memory.

    Where did you get this information? The memory on an RTX is double the speed of a GTX and it has been proven over and over with benchmarks that the RTX cards are superior.

  • kyoto kid Posts: 40,586

    ...my Maxwell Titan-X is showing 8 GB in 4.12 and above (MSI Afterburner).

  • RL_Media said:

    You mean 4060?

    A reboot of the RTX 2060 (with 12 GB of VRAM) will launch in February 2022, with a strong possibility of a 16 GB 3070 Ti and a 3090 Ti as well.

  • Gordig Posts: 9,186

    jayjarrett said:

    Richard Haseltine said:

    In a non-RTX card some of the memory will be used for code emulating the RTX features, so it will probably be relatively slower (in addition to other differences) and certainly relatively limited in memory.

    Where did you get this information? The memory on an RTX is double the speed of a GTX and it has been proven over and over with benchmarks that the RTX cards are superior.

    ...which is exactly what Richard was saying.

  • RL_Media Posts: 339

    JVRenderer said:

    RL_Media said:

    You mean 4060?

    A reboot of the RTX 2060 (with 12 GB of VRAM) will launch in February 2022, with a strong possibility of a 16 GB 3070 Ti and a 3090 Ti as well.

    That's weird. I guess I am out of the loop lol. Why would they re-release old ass tech with just some more VRAM? Did they just have tons they never sold or something?

  • nonesuch00 Posts: 17,929
    edited November 2021

    RL_Media said:

    JVRenderer said:

    RL_Media said:

    You mean 4060?

    A reboot of the RTX 2060 (with 12 GB of VRAM) will launch in February 2022, with a strong possibility of a 16 GB 3070 Ti and a 3090 Ti as well.

    That's weird. I guess I am out of the loop lol. Why would they re-release old ass tech with just some more VRAM? Did they just have tons they never sold or something?

    I'd buy it to bide my time until the RTX 4000 series is out if it was easily available and less than $300. That said, even the GTX 1650s are selling for $500 when you can find any (I'm not looking to buy those but I saw the prices).

  • JVRenderer Posts: 661
    edited November 2021

    RL_Media said:

    JVRenderer said:

    RL_Media said:

    You mean 4060?

    A reboot of the RTX 2060 (with 12 GB of VRAM) will launch in February 2022, with a strong possibility of a 16 GB 3070 Ti and a 3090 Ti as well.

    That's weird. I guess I am out of the loop lol. Why would they re-release old ass tech with just some more VRAM? Did they just have tons they never sold or something?

    Because there's high demand for low-end video cards from gamers. And this low-ass tech has a very low hash rate for crypto mining, so perhaps miners won't snap them up so easily. In addition, there are still enough resources to produce Turing cards, compared with the hard-to-come-by parts for Ampere. If there's money to be made, I suppose NVIDIA will try to make more money, even on old tech.

  • RL_Media Posts: 339

    Ah, low hash rate. That is good for the desperate gamer people, as long as the botters don't scalp 'em all to resell to those desperate gamers too! I am sitting on my 2080 Super for now, I guess. I've had enough to buy a 3090 twice so far, but I keep spending it on other stuff because I can't find one anywhere near normal price. I got a new mobo/CPU/RAM the first time, and a VR setup this time lol. Last time I checked, the asking price for the GPUs available was more than I paid for my van @.@ Hard pass on that noise.

  • I wouldn't trade my RTX 2070 TriFrozer for a GTX, not even with the fact that I could bank a nice payday if I sold it right now.  RTX definitely renders at a much faster rate.  My son keeps trying to steal my GPU for his gaming comp (he's running an older GPU on a Ryzen system) because he doesn't want to either A: buy a newer GPU or B: buy an entirely new system with a newer GPU (which would actually cost the same as the GPU alone right now).  I think it's insane how the cost of GPUs has skyrocketed.  It is quite literally cheaper to buy a whole new system with the new GPU already in it, but then you have the hours of reinstalls.... just nope.
