Daz Prayers Answered? 4060 Ti w 16GB VRAM!

13 Comments

  • outrider42 Posts: 3,679

    jd641 said:

    outrider42 said:

    That seems a bit odd, none of the 2070 Supers shown in the benchmark thread were faster than the 3060s. The 3060 is even flirting with 2080ti times. I am not saying I doubt your results, but I am curious what is different in the scenes you benchmarked, whether it was the same version of Daz, and whether you tried the benchmark thread scene with the two cards.

    That's what he's saying though, an increase would mean his renders took longer but a decrease in render time means they were faster with the 3060.

    Oh snap, that was a bad brain fart on my part, lol. I totally read it backwards somehow.

    On this note, the 4060ti has better ray tracing than the 3060ti, so there is some hope that the 4060ti can better differentiate itself from the 3060ti with Iray.

  • outrider42 Posts: 3,679

    Here are some rendering benchmarks for the 4060ti 8gb. They are much more favorable than the gaming benches. The performance is all over the place! It looks best in Blender, where it can beat the 3070ti in some scenes. In Vray and Octane it loses to the 3070. In every case, the gap between the 4060ti and the 4070 is disturbingly huge. Remember, the 16gb model has the exact same specs as the 8gb model, so the 16gb model should perform almost identically to this. I think this comes down to the complexity of the shaders. The 4060ti has very fast ray tracing cores, but not a lot of CUDA cores. So scenes with lots of complex shaders could slow the 4060ti down. If a scene is light on shading but has lots of geometry, then the ray tracing cores can shine.

    https://cdn.mos.cms.futurecdn.net/cMxiLUiKHD9RMYA32HSuQe-1200-80.png.webp

    Link   https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4060-ti-review/8

    The Vray benches are worrying. The good news is that the 4060ti only used about 140 Watts in gaming. It is quite efficient. The 3060 used 158, and the 3060ti used a full 200 Watts. So the 4060ti is a solid 60 Watts less than the 3060ti, and it is even more power efficient than the already frugal 3060. Odds are it will use even less than that for rendering Iray given the trends with Lovelace. So there are some good points to the card, as it should work with any PC's power budget. The low Wattage also makes it a good option for using multiple GPUs without needing a bigger power supply or massive case (unless your first GPU is a monster 4090 taking up all 4 slots).

    It is great that gamers are hating this card, because that should help it stay down in price for content creators.
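    The wattage figures above make a quick power-budget check easy when planning a multi-GPU box. A minimal sketch in Python, using the gaming draws quoted above; the 150 W system allowance and the 80% PSU-load rule of thumb are assumptions, not specs:

    ```python
    # Rough PSU headroom check for a multi-GPU render box.
    # GPU figures are the gaming power draws quoted above; Iray
    # draw is typically lower, so this errs on the safe side.

    def psu_ok(psu_watts, gpu_watts, system_watts=150, load_limit=0.8):
        """True if CPU/board plus all GPUs stay under load_limit of PSU capacity."""
        total = system_watts + sum(gpu_watts)
        return total <= psu_watts * load_limit

    # Two 4060 Tis (~140 W each) on a 650 W supply: 150 + 280 = 430 W,
    # comfortably under the 520 W (80%) ceiling.
    print(psu_ok(650, [140, 140]))
    ```

    By the same check, pairing a 4060 Ti with a 3060 Ti (200 W) would still fit a 650 W supply, which is the multi-GPU appeal mentioned above.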

  • I just can't see (for me) how not to get the 4060ti 16gb card... 
    I think, along with my 3060 I got back in February, it would make for a nice hobbyist setup.  Don't get me wrong, I'd love to have a 4090, and was saving up for one, but at 1/3 of the price, and 2/3rds of the VRAM, I think I can manage...
    Plus, if I "dollar cost average" over the next two years till the 5090/5060 even peeks its head out, it'll be a win.

  • PerttiA Posts: 10,024

    PerttiA said:

    Checked the local shop. RTX 4060 Ti, sales will start in one hour and the cheapest one costs 460 Eur (including 24% VAT) (Asus and MSI)

    For comparison, the Asus GeForce DUAL-RTX3060-O12G-V2 is listed at 390Eur (including 24% VAT)

    After two days, they still haven't moved many of these. 

  • savagestug Posts: 174

    I'll probably grab one when I can find one shorter than 11" on sale. The 16g 4060ti looks like an improvement over my 12g 3060

  • daveso Posts: 7,024

    so the 4070ti should be a great card

  • Expozures Posts: 219

    Boy, nVidia really screwed the pooch on this launch.

    It should always be that the new gen outperforms the previous gen's next step up.  So, a 4060 should fare better than a 3070, just like the 3070 outshone the 2080.

    4090 seems to be the only one they got 'right'...and they made that unbelievably expensive.

    Maybe it's time for some kind of upset/upheaval in the market.  Sucks that we're married to nVidia if we want that sweet-sweet iray juice.  But we're only a niche market for nVidia.  If Intel can market their A-series cards, shore up compatibility with older titles, and deliver rock-solid drivers, they'll definitely be enticing to gamers, which will cause nVidia and AMD to reset their course.

  • nonesuch00 Posts: 18,131

    thanks for those charts, I can see the RTX 4070 is too slow.

  • I'm trying to decide between buying a 3060ti or a 4060ti, as apparently the 3060ti has roughly 500 cuda cores MORE than the 4060ti, which I think is quite odd? Both have 8GB VRAM (I'm not paying £500 for the 16GB 4060ti model), so I guess the better bet for me (as interested in speeding up my renders as much as possible) would be the 3060ti model then?

  • kwerkx Posts: 105

    BLUEZKYWORLD said:

    I'm trying to decide between buying a 3060ti or a 4060ti, as apparently the 3060ti has roughly 500 cuda cores MORE than the 4060ti, which I think is quite odd? Both have 8GB VRAM (I'm not paying £500 for the 16GB 4060ti model), so I guess the better bet for me (as interested in speeding up my renders as much as possible) would be the 3060ti model then?

    Not sure what you will be rendering.  I myself like to cram as much as possible into my scene and often run into the 10GB VRAM limit on my card.. 8GB would drive me insane.  Again, don't know what you will be rendering.. maybe 8GB will be enough; but, IMO, the beauty of the x060 series has been the beefy 12GB of VRAM they offer.

  • savagestug Posts: 174

    BLUEZKYWORLD said:

    I'm trying to decide between buying a 3060ti or a 4060ti, as apparently the 3060ti has roughly 500 cuda cores MORE than the 4060ti, which I think is quite odd? Both have 8GB VRAM (I'm not paying £500 for the 16GB 4060ti model), so I guess the better bet for me (as interested in speeding up my renders as much as possible) would be the 3060ti model then?

    If you aren't going to get the 16gb 4060ti then you should go with the 3060 with 12gb. The extra VRAM will be more beneficial than the extra cores.

  • If you aren't going to get the 16gb 4060ti then you should go with the 3060 with 12gb. The extra VRAM will be more beneficial than the extra cores.

    +1

    But I will add that, for the little bit extra, it might be worth spending for the 4060Ti 16GB.
    To me, that will be the poor man's x090 card.  Basically 3070 performance with 16GB of VRAM and better efficiency.

    The 4060Ti (8GB) cards have already shown up discounted ($399 MSRP with sales at $349-$379).  The 4060 Ti 16GB cards are MSRP'd at $499... so really $100 more for DOUBLE the VRAM.  Now, your country/pricing may be different, but even at MSRP, it will probably be the Daz Render Sweet Spot Card.

  • nonesuch00 Posts: 18,131

    I just did the math on what I need to pay for before I can buy an RTX 4090, so I'll need to be patient and buy in June 2024; maybe by then they'll be talking about the RTX 5090.

  • outrider42 Posts: 3,679

    BLUEZKYWORLD said:

    I'm trying to decide between buying a 3060ti or a 4060ti, as apparently the 3060ti has roughly 500 cuda cores MORE than the 4060ti, which I think is quite odd? Both have 8GB VRAM (I'm not paying £500 for the 16GB 4060ti model), so I guess the better bet for me (as interested in speeding up my renders as much as possible) would be the 3060ti model then?

    You cannot compare CUDA cores between different generations. Ray tracing is also a huge factor, and the 4000 series has MUCH faster ray tracing than previous gen. We don't have any Iray benchmarks yet for the 4060ti, but I would wager that the 4060ti is faster than the 3070 at Iray. It is faster than the 3070 in other PBR render engines like Octane, let alone the 3060ti. So just going by that, I would expect Iray to perform in a similar way.

    Also, the 4060ti is getting 16gb of VRAM, and VRAM is so vital to rendering. If you run out of that 8GB of VRAM with the 8GB model...your GPU is useless. It does nothing. You have to rework your scene to fit into that 8gb buffer, there is no way around it...unless you are ok with rendering on a CPU.

    Only masochists render Iray on CPU.

  • In the end, the 4060 ti 16gb might be a moot argument, since it's been 1 week and you can't buy it anywhere.  

  • PerttiA Posts: 10,024

    kevinso2001 said:

    In the end, the 4060 ti 16gb might be a moot argument, since it's been 1 week and you can't buy it anywhere.  

    It has been available all this time here in Finland.

  • outrider42 Posts: 3,679

    kevinso2001 said:

    In the end, the 4060 ti 16gb might be a moot argument, since it's been 1 week and you can't buy it anywhere.  

    There are at least 2 models in stock to add to cart at Newegg.

    The narrative that they did not make any of these is false. It is true they did not make a ton of them, but they do indeed exist. It is pretty normal for a new GPU product to be hard to find in the weeks following its launch because of supply issues.

    This will be a curious card for a few years. Right now the demand is not great with everything going on. But I have a feeling that in a year or two the 16gb 4060ti will increase in demand for one reason or another. It could be because of gaming demands, price drops, improving economy, a brand new crypto boom. While Ether is dead for mining, there could always be a new crypto that takes its place, and the 4060ti would be a great mining card if its price dropped. The bus size can hurt mining some coins, but there are other coins that are not so bus heavy. So it would depend on what kind of coin took off.

    At any rate, we have all seen how rapidly the world can change from one year to the next. Just a couple years ago the market was completely blown up. Now the GPU market has totally crashed, even though prices still remain relatively high on new ones (mainly due to AI this time). The market can just as easily flip back again for any number of reasons.

  • richardandtracy Posts: 5,689
    edited July 2023

    There are some in the UK, £490-£510 range. However, it is much easier to find the 8GB ones, which are £100 or more cheaper.

    Given that I have an 8 month old 3060 at the moment, and have not yet overtopped its VRAM, I'm not jumping yet.

    Regards,

    Richard

    Corrected tripe writing error.

    Post edited by richardandtracy on
  • kevinso2001 Posts: 7
    edited July 2023

    This is good; at least now there's a way to get one. When I checked it was not available, so stock was probably added later that day.

    The question now for RTX 3060 owners is: assuming you can sell your 3060 for $240, are you willing to spend $260 for 25% more VRAM and 1000 more CUDA cores, which will speed up render times by around 20% (RTX 3070 level performance), or will you spend around $550 for an RTX 3090 that is 300% faster with 200% more VRAM?

    outrider42 said:

    kevinso2001 said:

    In the end, the 4060 ti 16gb might be a moot argument, since it's been 1 week and you can't buy it anywhere.  

    There are at least 2 models in stock to add to cart at Newegg.

    The narrative that they did not make any of these is false. It is true they did not make a ton of them, but they do indeed exist. It is pretty normal for a new GPU product to be hard to find in the weeks following its launch because of supply issues.

    This will be a curious card for a few years. Right now the demand is not great with everything going on. But I have a feeling that in a year or two the 16gb 4060ti will increase in demand for one reason or another. It could be because of gaming demands, price drops, improving economy, a brand new crypto boom. While Ether is dead for mining, there could always be a new crypto that takes its place, and the 4060ti would be a great mining card if its price dropped. The bus size can hurt mining some coins, but there are other coins that are not so bus heavy. So it would depend on what kind of coin took off.

    At any rate, we have all seen how rapidly the world can change from one year to the next. Just a couple years ago the market was completely blown up. Now the GPU market has totally crashed, even though prices still remain relatively high on new ones (mainly due to AI this time). The market can just as easily flip back again for any number of reasons.

    Post edited by kevinso2001 on
  • kevinso2001 Posts: 7
    edited July 2023

    Ok, the RTX 4060 Ti 16GB has FINALLY arrived in my country. The cheapest variant (Palit RTX 4060 Ti JetStream OC 16GB) costs $584, while a second-hand RTX 3090 costs $792. So the price difference between the two is just $208 (smaller than the $310 difference in my last post's example).

    Post edited by kevinso2001 on
  • mtl1 Posts: 1,507

    Is this card worth it over the 3060 12GB? How much faster is this in rendering -- i.e., can we see 2x render speeds?

  • Kitsumo Posts: 1,216

    mtl1 said:

    Is this card worth it over the 3060 12GB? How much faster is this in rendering -- ie. can we see 2x render speeds?

    That depends. If you already own a 3060 12Gb, it's probably not worth it. If you have anything less than a 3060 and you're considering an upgrade, it's worth it to get this instead. You won't see 2x speeds. It's about 71% faster by my math. (source)
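    One nuance on that "71% faster" figure: percent-faster numbers translate to render time non-linearly, which is why it falls short of 2x. A quick sketch (the 71% is the estimate above, not a measured Iray benchmark):

    ```python
    # A card that is X% faster renders at (1 + X/100) times the speed,
    # so a render takes 1 / (1 + X/100) of the original time.

    def render_time_fraction(percent_faster):
        """Fraction of the original render time after an X% speedup."""
        return 1.0 / (1.0 + percent_faster / 100.0)

    # 71% faster cuts a render to roughly 58% of its original time;
    # true 2x speed would require being 100% faster.
    print(f"{render_time_fraction(71):.0%}")
    print(f"{render_time_fraction(100):.0%}")
    ```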

  • kyoto kid Posts: 41,058

    ...on the other hand, the only other 16 GB card Nvidia produces is the RTX A4000 (formerly Quadro series), which is still selling for around $900 (refurbished) to $1,300 (used ones can be found for about $650).

    Prices for the A4000 may drop somewhat once the 20GB RTX 4000 SFF (small form factor) Ada card is released, but as with most pro grade cards depreciation is generally not as steep as for the gaming ones.

    Unfortunately there are no head-to-head comparisons between the 4060 Ti and the A4000, due to many reviewers not having the 4060 Ti 16 GB in their hands yet.  However, just looking at the base specs and understanding the advancements Ada Lovelace has over Ampere, I would give the 16 GB 4060 Ti the nod, particularly for 3D artists and content creators who want to go beyond the 12 GB barrier but don't have the finances to purchase the higher end 4080 and 4090.  At 165 W, the 4060 Ti draws only 5 W more than my 12 year old 1 GB GTX 650, which is barely a blip on the TechPowerUp performance rating chart in comparison (9%).

  • outrider42 Posts: 3,679

    kevinso2001 said:

    This is good at least now there's a way to get one, when i checked it was not available so probably later that day stock was added.

    The question now for RTX 3060 owners is: assuming you can sell your 3060 for $240, are you willing to spend $260 for 25% more VRAM and 1000 more CUDA cores, which will speed up render times by around 20% (RTX 3070 level performance), or will you spend around $550 for an RTX 3090 that is 300% faster with 200% more VRAM?

    outrider42 said:

    kevinso2001 said:

    In the end, the 4060 ti 16gb might be a moot argument, since it's been 1 week and you can't buy it anywhere.  

    There are at least 2 models in stock to add to cart at Newegg.

    The narrative that they did not make any of these is false. It is true they did not make a ton of them, but they do indeed exist. It is pretty normal for a new GPU product to be hard to find in the weeks following its launch because of supply issues.

    This will be a curious card for a few years. Right now the demand is not great with everything going on. But I have a feeling that in a year or two the 16gb 4060ti will increase in demand for one reason or another. It could be because of gaming demands, price drops, improving economy, a brand new crypto boom. While Ether is dead for mining, there could always be a new crypto that takes its place, and the 4060ti would be a great mining card if its price dropped. The bus size can hurt mining some coins, but there are other coins that are not so bus heavy. So it would depend on what kind of coin took off.

    At any rate, we have all seen how rapidly the world can change from one year to the next. Just a couple years ago the market was completely blown up. Now the GPU market has totally crashed, even though prices still remain relatively high on new ones (mainly due to AI this time). The market can just as easily flip back again for any number of reasons.

    First of all, the 3090 is not 3 times faster than a 3060. I have both of these cards myself and tested them. The 3090 is indeed a full 2+ times faster than the 3060, peaking at about 2.5 times faster in some scenes. But not 3.

    I am not trying to be touchy, but we need to have the information correct here so any potential buyers are not misled. Also, we still do not have an actual Iray benchmark for the 4060ti, so we cannot say for sure it equals a 3070. In some benchmarks it beats a 3070ti, which is another tier up. That makes a difference, and changes the equation a bit.

    It also depends on whether you can actually find a 3090 at this point in time. You will not find one brand new, and you should be highly skeptical if somebody claims to have a new one. So you have to buy used, and that isn't exactly a fair comparison to a new product; of course a used 2.5-year-old product is going to be heavily discounted. You want to have buyer's protections for sure when looking at the used market now, since so many cards have been mistreated crypto cards. It wasn't long ago that a (I believe) German repair shop had a sudden surge of dead AMD GPUs. After some investigating, they concluded that someone had used them for mining, and had used a pressure washer or something similar to hastily clean the cards before selling them. That caused micro fractures to form inside the cards, making each one a ticking time bomb. I agree with their conclusion; it is the only thing that makes sense for the particular failures they were seeing, and all of the cards were bought used from the same place.

    For years I have been totally fine buying used GPUs. Up until my 3090, every GPU I had bought was used, including two 1080tis. I have bought used GPUs for years, and expensive ones at that. But with how huge mining was the past couple years, things are a bit different now, and you absolutely do need to be careful when buying a used GPU today. Make sure you have a recourse in case you get a bad GPU. Personally, I would still buy a used 3090 if I had the means, but I would make double sure to research the seller (which I do anyway) and that it can be returned.

    If somebody doesn't want to buy used, I really can't argue with that like I would have 2 years ago. I can understand why they wouldn't.

    While the 4060ti is a modest VRAM bump over the 3060, that VRAM can still be quite valuable by itself. Even a 3060 16gb would have value to Daz Iray users. Of course, this only matters if you will actually use that VRAM. If you don't build scenes over 12gb, then there is nothing to consider here. The 4070 is not so far up in price and is faster.

    Interestingly I don't go over the 3060's 12gb as often as I did my 1080ti's 11gb. I could hit that 11gb so easy on my 1080ti, even the one without a display attached (though it would have a little more space than the 1080ti driving the screen.) But I do go over it enough that it would be nice to have a larger capacity. At this point the speed is almost secondary, with how fast these GPUs already are.
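    To make the upgrade math in this exchange concrete, here is a rough dollars-per-performance sketch using figures quoted in the thread. The speed multipliers are assumptions (no Iray benchmark for the 4060 Ti 16GB existed yet); the 2.5x is the 3090 ceiling measured above, and the 3060 12GB is the 1.0x baseline:

    ```python
    # name: (price_usd, assumed_speed_vs_3060, vram_gb)
    # Prices are the ~$499 MSRP and ~$792 used-3090 figures from this
    # thread; the 1.2x for the 4060 Ti is an assumption, not a benchmark.
    cards = {
        "RTX 4060 Ti 16GB (new)": (499, 1.2, 16),
        "RTX 3090 (used)":        (792, 2.5, 24),
    }

    for name, (price, speed, vram) in cards.items():
        print(f"{name}: ${price / speed:.0f} per 3060-unit of speed, "
              f"${price / vram:.0f} per GB of VRAM")
    ```

    On these (assumed) numbers the used 3090 is the better buy per unit of speed, while the two cards are close per gigabyte of VRAM, which matches the back-and-forth above.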

  • daveso Posts: 7,024

    why is it that these cards do not use system ram? maybe you would need 64gig but why not?

  • PerttiA Posts: 10,024

    daveso said:

    why is it that these cards do not use system ram? maybe you would need 64gig but why not?

    One reason is that it would slow them down. 

  • kyoto kid Posts: 41,058

    [ugh yet another host 504 error]

    ...CPU integrated graphics already uses system memory.  CPU based rendering in Iray even with a Threadripper is slower than GPU rendering.  The only reason major film producers like Pixar still use CPU rendering is because it is more accurate, handles larger scenes better, and they can afford to have giant climate controlled warehouse sized render farms (would hate to have to pay their power bill every month).

    I suppose if you had an AMD EPYC™ 9754 with 256 threads along with a system that has 512 GB of memory, you could get close, but who here has the money for that kind of rig when the CPU alone costs about $12,000. You could get an RTX 6000 Ada with 48 GB of VRAM, 18,176 shader units, 192 ROPs, 568 Tensor and 142 RT cores, and still have about $4,000 change in your pocket.  

    Even I'd be hard pressed to use up that much VRAM.

  • daveso Posts: 7,024

    PerttiA said:

    daveso said:

    why is it that these cards do not use system ram? maybe you would need 64gig but why not?

    One reason is that it would slow them down. 

    but once you hit capacity during a render, at least it could continue on using system ram.  it would be something you wouldn't need to worry about as much or have the need to buy a card with more ram. Aha, I think I hit on it. 

  • Richard Haseltine Posts: 101,010

    daveso said:

    PerttiA said:

    daveso said:

    why is it that these cards do not use system ram? maybe you would need 64gig but why not?

    One reason is that it would slow them down. 

    but once you hit capacity during a render, at least it could continue on using system ram.  it would be something you wouldn't need to worry about as much or have the need to buy a card with more ram. Aha, I think I hit on it. 

    While that may be a factor, it would seem likely that adding the feature would require some deep (and therefore risky) changes, and making it work would probably have a performance impact - and gaming cards sell on performance - with only limited benefit in the core markets for these devices.

  • marble Posts: 7,500

    PerttiA said:

    daveso said:

    why is it that these cards do not use system ram? maybe you would need 64gig but why not?

    One reason is that it would slow them down. 

    Wasn't there some talk about offloading textures to system RAM? Doesn't Octane do that (Out-of-Core)? 
