Two GTX 1660s in one PC: will it be useful?

Hello, I am writing to ask if anyone around here happens to have two NVIDIA GTX 1660s in their system. The situation is that I have a PC with a GTX 1660, and I just got the opportunity to buy another GTX 1660 cheap. What I want to know is how much the rendering time decreases with two cards of this type in Iray, to see whether I should buy it or not. Also, if there is someone with two cards of the same type, could you tell me how much your rendering times improved? Regards.

Comments

  • nonesuch00 Posts: 17,890

    Doesn't decrease rendering time at all as the GPUs aren't designed to split the load.

  • Richard Haseltine

    nonesuch00 said:

    Doesn't decrease rendering time at all as the GPUs aren't designed to split the load.

    No, as long as the scene fits onto each card then having two will increase the speed (not double, but not far short). Memory, however, isn't pooled, so a scene that drops to CPU or stops due to lack of memory (depending on settings) with one card will still do so with two.
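
    For anyone who wants to verify the "fits on each card" condition before a long render: the per-GPU free VRAM can be read with NVIDIA's NVML bindings. A minimal sketch, assuming the nvidia-ml-py package (`pip install nvidia-ml-py`) and a working NVIDIA driver; it only reports memory, it won't predict Iray's exact footprint:

    ```python
    import pynvml  # NVML bindings: pip install nvidia-ml-py

    pynvml.nvmlInit()
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # Each card must hold the whole scene on its own; VRAM is not pooled.
        print(f"GPU {i} ({name}): {mem.free / 2**30:.1f} GiB free of "
              f"{mem.total / 2**30:.1f} GiB")
    pynvml.nvmlShutdown()
    ```

    On a two-card system this prints one line per GPU; if the scene's footprint exceeds the free figure on either card, that card can't join in, and depending on settings the whole render drops to CPU.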

  • nonesuch00 Posts: 17,890

    Doesn't decrease rendering time at all as the GPUs aren't designed to split the load.

    No, as long as the scene fits onto each card then having two will increase the speed (not double, but not far short). Memory, however, isn't pooled, so a scene that drops to CPU or stops due to lack of memory (depending on settings) with one card will still do so with two.

    Oh thanks. I thought they had to be NVLinked to do that.

  • nicstt Posts: 11,714
    edited September 2020
    Isazforms said:

    Hello, I am writing to ask if anyone around here happens to have two NVIDIA GTX 1660s in their system. The situation is that I have a PC with a GTX 1660, and I just got the opportunity to buy another GTX 1660 cheap. What I want to know is how much the rendering time decreases with two cards of this type in Iray, to see whether I should buy it or not. Also, if there is someone with two cards of the same type, could you tell me how much your rendering times improved? Regards.

    Don't buy.

    Wait to see what you can get from the 3000-series lineup; the only caveat is if those two 1660s are far cheaper than the equivalent 3000-series card.

    Just remember, two 1660s will only be better than one if the scene fits in the RAM of one card, as they don't share RAM.

    IMO the minimum spec for RAM on a card is 11GB; sure, you can manage with less, but it is a lot less hassle. I would amend that to 10GB for the 3000 series, as that is what the 3080 will have at launch.

    I have a 980 Ti, which has 6GB of RAM; it is nowhere near enough.

    I now render in Blender using my Threadripper, which is faster than my 980 Ti; I am looking forward, however, to the 3090 appearing.

    I don't like Iray, which tends to colour my opinions, but I do try to keep that in mind.

    Post edited by nicstt on
  • Isazforms

    Thanks for your comments. I'm coming back to update my query. In the end, with the crisis caused by all the crypto stupidity, I made the decision to buy the other 1660 Ti. I must tell you that the performance is close to double.

  • Leana Posts: 10,932

    nonesuch00 said:

    Doesn't decrease rendering time at all as the GPUs aren't designed to split the load.

    No, as long as the scene fits onto each card then having two will increase the speed (not double, but not far short). Memory, however, isn't pooled, so a scene that drops to CPU or stops due to lack of memory (depending on settings) with one card will still do so with two.

    Oh thanks. I thought they had to be NVLinked to do that.

    NVLink is needed to pool memory.
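
    For completeness: you can also query NVLink state with the same pynvml bindings as in the sketch above. On a GTX 1660/1660 Ti there is no NVLink connector at all, so every link query simply reports "not supported"; and even on cards that have the connector, the renderer still has to support and enable pooling on its side. A small sketch:

    ```python
    import pynvml  # pip install nvidia-ml-py

    pynvml.nvmlInit()
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        active = 0
        for link in range(pynvml.NVML_NVLINK_MAX_LINKS):
            try:
                state = pynvml.nvmlDeviceGetNvLinkState(handle, link)
                if state == pynvml.NVML_FEATURE_ENABLED:
                    active += 1
            except pynvml.NVMLError:
                break  # no NVLink on this card (e.g. GTX 1660) or link unused
        print(f"GPU {i}: {active} active NVLink link(s)")
    pynvml.nvmlShutdown()
    ```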

  • jmtbank Posts: 164

    Richard Haseltine said:

    No, as long as the scene fits onto each card then having two will increase the speed.....

    That's the thing. Will you spend more time trying to get your scene to fit in 6GB than it would take to just let it render on the CPU whilst you go and do something else?
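
    Getting a scene under 6GB is usually a texture problem more than a geometry one. A minimal sketch of the usual workaround, batch-downscaling copies of the texture maps with the Pillow library, the idea being to swap the smaller copies into the scene's surfaces; the folder names here are hypothetical:

    ```python
    from pathlib import Path
    from PIL import Image  # pip install Pillow

    SRC = Path("textures")       # hypothetical folder of original maps
    DST = Path("textures_half")  # downscaled copies land here
    DST.mkdir(exist_ok=True)

    for img_path in SRC.glob("*.jpg"):  # adjust the pattern for .png etc.
        with Image.open(img_path) as im:
            # Halving each side cuts that map's memory footprint to roughly a quarter.
            small = im.resize((im.width // 2, im.height // 2), Image.LANCZOS)
            small.save(DST / img_path.name)
    ```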

  • StratDragon Posts: 3,167
    edited November 2021

    The 1660 Ti is an overlooked card. I've thrown some massive work its way and the response is shockingly better than I think people imagine it is. For previews in scenes with hundreds of surface and texture variants it responds really well, and for assisting the CPU when you go over the GPU RAM it's very responsive. I've worked on projects that hover close to the 32GB mark, so a GPU that could tackle that on its own would cost me several thousand dollars. The power consumption is really low. I bought mine in 2019; the price has gone up considerably since, the 3000-series prices are insane right now, and bitcoin miners are driving everything up.

    Post edited by Richard Haseltine on
  • Isazforms Posts: 210
    edited November 2021

    StratDragon said:

    The 1660 Ti is an overlooked card. I've thrown some massive work its way and the response is shockingly better than I think people imagine it is. For previews in scenes with hundreds of surface and texture variants it responds really well, and for assisting the CPU when you go over the GPU RAM it's very responsive. I've worked on projects that hover close to the 32GB mark, so a GPU that could tackle that on its own would cost me several thousand dollars. The power consumption is really low. I bought mine in 2019; the price has gone up considerably since, the 3000-series prices are insane right now, and bitcoin miners are driving everything up.

    The recommendations I was given predate the mining issue. With the budget I can raise right now it is impossible to get a 3060. It sounds like you're saying you can use the GPU with the CPU's memory? I think I'm misunderstanding you.

    Post edited by Isazforms on
  • PerttiA Posts: 9,294

    Isazforms said:

    StratDragon said:

    The 1660 Ti is an overlooked card. I've thrown some massive work its way and the response is shockingly better than I think people imagine it is. For previews in scenes with hundreds of surface and texture variants it responds really well, and for assisting the CPU when you go over the GPU RAM it's very responsive. I've worked on projects that hover close to the 32GB mark, so a GPU that could tackle that on its own would cost me several thousand dollars.

    It sounds like you're saying you can use the GPU with the CPU's memory? I think I'm misunderstanding you.

    If that were possible, people would not be drooling over VRAM... As far as I have understood, only some CPU-integrated GPUs can use the system RAM; I have never had one, so I can't confirm.

  • StratDragon Posts: 3,167

    For previews, the CUDA performance is doing the heavy lifting; I'd have to monitor it more closely to say it's doing all of it. For rendering, anything beyond a figure or two is going to balance the load between the GPU and the CPU.

    Are you going to be able to work seamlessly with the camera in Iray preview mode on a 1660 Ti? Depending on how involved your scene is, you're going to deal with some delay, with varying degrees of usability and severity. Most of the time I'm working in texture preview, checking (and saving) with Iray preview periodically.

    If I do a large scene (multiple characters, props, lighting) and render in 4K, it's going to cap both my CPU and my GPU at 100%, and I can expect to reach a usable image in an hour or two, though sometimes it takes longer. Whatever it's doing, it blows the doors off LuxRender.

    Bottom line: if I had $2K burning a hole in my pocket, I'd go to the Fender website and build a custom Stratocaster, not invest in a 12GB card, which may or may not still need the CPU to assist.

  • StratDragon said:

    If I do a large scene (multiple characters, props, lighting) and render in 4K, it's going to cap both my CPU and my GPU at 100%, and I can expect to reach a usable image in an hour or two, though sometimes it takes longer. Whatever it's doing, it blows the doors off LuxRender.

    Battle Encoder Shirasé (BES), which I first heard about here, will solve the 100% CPU usage: set it to 66% and you can still use your computer for other things.
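
    As I understand it, BES throttles by repeatedly suspending and resuming the target process. A rough alternative that keeps the machine responsive is to shrink the renderer's CPU affinity instead; a sketch using the psutil package, where the executable name is an assumption you would adjust for your setup:

    ```python
    import psutil  # pip install psutil

    TARGET = "DAZStudio.exe"  # hypothetical executable name; change to match your renderer
    KEEP_FREE = 2             # logical cores to leave free for everything else

    cores = list(range(psutil.cpu_count(logical=True)))
    for proc in psutil.process_iter(["pid", "name"]):
        if proc.info["name"] == TARGET:
            # Restrict the renderer to all but the last KEEP_FREE logical cores.
            proc.cpu_affinity(cores[:-KEEP_FREE])
            print(f"PID {proc.info['pid']} pinned to cores {cores[:-KEEP_FREE]}")
    ```

    Unlike BES's percentage throttle, affinity leaves the renderer running flat out on the cores it still owns, so the desktop stays usable without pausing the render.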

  • PerttiA said:

    Isazforms said:

    StratDragon said:

    The 1660 Ti is an overlooked card. I've thrown some massive work its way and the response is shockingly better than I think people imagine it is. For previews in scenes with hundreds of surface and texture variants it responds really well, and for assisting the CPU when you go over the GPU RAM it's very responsive. I've worked on projects that hover close to the 32GB mark, so a GPU that could tackle that on its own would cost me several thousand dollars.

    It sounds like you're saying you can use the GPU with the CPU's memory? I think I'm misunderstanding you.

    If that were possible, people would not be drooling over VRAM... As far as I have understood, only some CPU-integrated GPUs can use the system RAM; I have never had one, so I can't confirm.

    Obviously I know. I have used Iray for years.
