Holy Cuda Cores, Renderman! My GPU has tripled in price!


Comments

  • Faeryl Womyn Posts: 3,316

    A number of you have heard me go on about my 16 year old computer. I can use Daz; the only drawback is that I can't use dForce, and Iray is a major pain. Even 3Delight has been giving me grief lately, though that's mainly due to the fact that I recently discovered how much dust is in this thing...lol  The fans hate me right now...lol

    Thanks to someone who heard my rants about my computer, I now have a newer one with lots more RAM and a 650 Nvidia card, and you know what, considering how much of an upgrade that is from the old one, I'm happy with that. Of course I do want a better GPU at some point, and maybe even more RAM at some point, but that doesn't change the fact that Iray is way less of a headache and I can do dForce now, so ya...I'm happy.

    If I can spend 6 years on such an old computer, can't many of you spend a year or 2 on your current one till the prices start to come down?

  • kyoto kid Posts: 40,593

    ...between my Titan-X and system RAM it's about a two-to-one difference. A scene that takes 6 GB in system RAM comes to around 3 GB in VRAM (running MSI Afterburner to monitor the render process).
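
    A quick sketch of that rule of thumb, for anyone who wants to plug in their own numbers (the 2:1 factor is just what Afterburner showed for one scene, an observation rather than a fixed rule):

    ```python
    # Sketch of the ~2:1 rule of thumb above: estimate a scene's VRAM
    # footprint from its system-RAM size. The 0.5 factor comes from one
    # observed scene (6 GB RAM -> ~3 GB VRAM) and will vary per scene.
    def estimated_vram_gb(scene_ram_gb: float, factor: float = 0.5) -> float:
        return scene_ram_gb * factor

    print(estimated_vram_gb(6.0))  # -> 3.0
    ```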

  • Subtropic Pixel Posts: 2,378
    edited April 2021

    kyoto kid said:

    ...will an M1 system be good for rendering Iray on?  Likely not, given that Apple and Nvidia parted ways a few years ago. The latest macOS no longer even supports Nvidia drivers.

    No, M1 is based on the ARM architecture.  So CPU and GPU don't "fit together in the old way".  This is all new, so it may come about that one day M1 (or most likely M1+n) can serve as an unbiased rendering platform, but it's not available now.

    This is all future stuff, which is why I brought it up. This may just be the thing we need to get us out of the legacy rendering hole we're in. And because Apple is the one who decided to get a divorce from Intel and AMD (remember, the M1 replaces not just Intel's CPUs, but also the AMD graphics cards that currently get installed in Mac computers), Apple may just be the company with the vision for that. Or maybe the M1 process will one day beget even more new ideas. TSMC makes the M1 chip, and there are other ARM chip makers out there, so there's always a chance for innovation.

     

    McGyver said:

    Subtropic Pixel said:

    McGyver said:

    The great thing to keep in mind is that after demand dies down a bit and the prices are done skyrocketing, they will hardly go down much and the new higher prices will be the accepted "average" customers are willing to pay... so win-win!
    If you have money invested in those companies... otherwise we're all screwed... and not in the good way.

    Or maybe not.

    EVERYTHING changes with Apple's new M1 "System on a Chip" (SoC). The new chip, based on the ARM architecture, comes with its own multi-core CPU, a multi-core GPU, and RAM all built in.

    How does this change things?

    Well, the CPU cores are split into "efficiency cores" and "performance cores". In the current M1 chip, there are 4 of each. The former are low-power, low-heat cores that run the "usual stuff" like word processing tasks or browsers; the stuff we spend 90% of our computing time on. The latter engage when there's real work to be done; they chew through it quickly so they can go back to sleep right away; nighty night!
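
    If you're curious, you can see that split for yourself. A minimal sketch, assuming a macOS version (12+) that exposes the hw.perflevel sysctl keys; older releases may not have them:

    ```python
    # Minimal sketch: query Apple Silicon core counts via sysctl.
    # Assumes macOS 12+ where hw.perflevel0 reports performance cores
    # and hw.perflevel1 reports efficiency cores; on older macOS these
    # keys may not exist.
    import subprocess

    def sysctl_int(key: str) -> int:
        return int(subprocess.check_output(["sysctl", "-n", key], text=True).strip())

    perf = sysctl_int("hw.perflevel0.physicalcpu")  # performance cores
    eff = sysctl_int("hw.perflevel1.physicalcpu")   # efficiency cores
    print(f"performance cores: {perf}, efficiency cores: {eff}")  # 4 and 4 on an M1
    ```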

    I haven't heard if the built-in GPU cores behave similarly, but if you think about it, this could be done in a future iteration of the chip. Have multiple GPU cores, and just let them scale up and come into service as there's work to be done; then shut them down when the work is completed. Surely with some thought, the engineers could make something that could recognize the difference between graphics rendering work and non-graphics "GPGPU" (General-Purpose GPU eligible) work.

    And then there's the RAM, also built in. So far, SSD/hard drive space is not part of the M1 SoC, which is probably just as well for people like me who buy drives two at a time (for production + backup).

    Apple is putting the M1 into iMacs, MacBook Pros, and iPads. macOS includes a translation layer called "Rosetta 2" that lets "old-fashioned" x86 and x64 programs run well, but the long-term goal is for developers to create "universal binaries" that run equally well on both the old x86 and the new ARM architectures.
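
    You can even check which architectures a given binary ships with. A small sketch, assuming Apple's lipo tool is available on the machine (the /bin/ls path is just an illustration; system binaries on Big Sur and later are universal):

    ```python
    # Sketch: check whether a Mac binary is "universal" (contains both
    # x86_64 and arm64 slices). Assumes Apple's `lipo` tool is installed;
    # the /bin/ls example path is illustrative.
    import subprocess

    def architectures(binary_path: str) -> set:
        out = subprocess.check_output(["lipo", "-archs", binary_path], text=True)
        return set(out.split())

    archs = architectures("/bin/ls")
    print("universal" if {"x86_64", "arm64"} <= archs else f"single-arch: {archs}")
    ```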

    So far, the M1 chips have been making a good showing: as fast as the Intel parts while generating less heat (and by extension, less noise too). I think this may be promising, not only for the future of CPU or "compute" type processing, but maybe also for graphics rendering, as well as general-purpose GPU computing too (including folding, BOINC, SETI, and crypto).

    Yes, this is all Apple.  But maybe all it takes to change an industry is one thought-leader willing to stick their neck out.

    And all of this COULD cause a crash in the old-fashioned Intel/AMD/Nvidia ecosystems.

    I don't know... I'd love that to be true, but I don't see the M1 translating into real changes for the Windows World any time soon...

    I didn't claim that it would happen "soon".  New ways of thinking always take time to implement.

    and even if it did, nobody would be willing to lower prices; they'd rather try selling outdated hardware at a premium than take a hit to their expectations.

    Pricing is not part of anything I said either.  Pricing will adapt based on the rise and fall of supply versus demand.


    Physical stores still sell at a discount to get rid of unwanted stock, online stores will always find a rube to pay full price... nobody is really buying this stuff from a store you walk into.

    That and nothing ever comes down after a crisis or demand-based price surge, and on the rare occasions when it does, it's still considerably higher than any normal cost-of-living type price increase one would expect to see over the same time span.

    I didn't say anything about stock and physical stores either.  All I said was that it's possible that ARM architecture, and more specifically, Apple's implementation of it in the M1, may change everything for the better.  But not right now.  What's happening right now would just be the impetus for change.  If the Intel, AMD, and Nvidia triumvirate were working well, there'd be no reason to consider doing anything differently, would there?

    Just like when land-drilled oil became more scarce, we started looking for oil under water.  And then we started looking at alternate sources of energy.  The difficulty of finding energy became the impetus for change.

    Just look at literally everything after fuel prices shot up after Hurricane Katrina... gas prices stabilized and eventually dropped, but the things we were told were costing more because of increased shipping costs due to fuel prices only stabilized; they didn't go down to reflect the decrease in fuel prices.

    Prices didn't go down, partly because demand stayed high.  After all, it takes fuel to rebuild, well, anything.

    People got used to the prices and companies were like "yeah, that's what it costs now".

    Yep, and I believe that's because we had so many roadblocks to rebuilding after Katrina. Unreasonable blockers raised the cost of recovery, which slowed it down while also keeping prices high.

    But I'm just extremely pessimistic when it comes to this stuff.

    And you should be pessimistic.  You're RIGHT to be pessimistic.  The evidence is plain that recovery and growth were artificially hampered, and this certainly IS unconscionable.  But if you're looking for somebody's feet at which to lay the blame, it probably should be spread around more than you might be inclined to.

    Post edited by Subtropic Pixel on
  • windli3356 Posts: 219

    Glad I pulled the trigger and bought a whole machine at $3,600 just for the RTX 3090 at the end of 2020, because the card itself is impossible to find. Now my RTX 2060 is worth about $2,000 smiley  thinking about selling it

  • kyoto kid Posts: 40,593

    ...most likely with the years I have remaining I'll still be dealing with the standard CPU + GPU.

    Not holding my breath on ARM technology yet, as while Apple's been an innovator they've also had a few failures as well (Lisa, NeXT, the "trash can" Pro, and the iMac Pro). They also lost me with their marketing and pricing philosophy, particularly when they went back to "closed architecture" after the Mac II, PowerPC, and original "Cheesegrater" (which meant that upgrading was at their prices and sometimes involved buying a completely new, more powerful version of the same machine). Issues with software compatibility and their OS are another (as an example, the fact that Daz still doesn't have a version of their software ready for the "Big Sur" OS after over five months).

    PCs with separate CPUs and GPUs may seem "clunky" in comparison to the ARM concept, but they get the job done and at a more affordable cost to the end user (well, save for GPUs right now, thanks to the latest "Crypto Rush" and chip shortage).

    Again, thankful for my old Titan-X. 

  • Got my hands on an RTX 3090 (or rather, will as soon as it arrives in about a week) from a legit source: $4,000 surprise.

    One step closer to having all the parts for my build:

    AMD Ryzen Threadripper 3960X Processor
    Floe DX RGB 360 TT Premium Edition cpu cooler
    ROG Zenith II Extreme AMD TRX40 E-ATX motherboard
    G.Skill Royal Z DDR4-3200MHz CL14-14-14-34 1.35V 128GB (8x16GB)
    EVGA GeForce RTX 3090 KiNGPiN HYBRID 24GB GDDR6X
    NVidia RTX A6000 48GB GDDR6

    Seagate Firecuda 520 2TB M.2 NVMe PCIe Gen4 SSD
    WD_BLACK 6TB Performance Desktop Hard Drive
    Corsair or EVGA 1600-watt Titanium-rated PSU
    Riing Quad 12/14 RGB Radiator Fan TT Premium Edition (all radiator/case fans)

    The case I haven't decided on yet, but so far I've picked out either:
    Lian Li's pc-o11d-rog (model O11DXL-X) or Thermaltake's View 71 Tempered Glass ARGB Edition. I think I'm leaning more towards the Lian Li because it's almost half the net weight of the View 71.

    I think the next difficult part to get is going to be the 1600-watt PSU, since those are being sucked up by a lot of Butcoin miners. The APC UPS might cost another arm & leg depending on what else I need to connect to it.

  • IceCrMn Posts: 2,114

    MSRP on the 3090 is $1,499.

    I simply won't pay these ridiculous prices.

    I'll just do without a new Nvidia card and my 2060 will have to be good enough.

    Seems everyone is catering to the miners these days and by the looks of it that won't be changing any time soon.

    I honestly believe Nvidia will abandon gaming and all other endeavors to focus their entire company on crypto-mining in the near future.

  • kyoto kid Posts: 40,593
    edited April 2021

    ...is that the price for just the 3090?  At that price I'd have gone with a second A6000 (only $600 more) and an NVLink bridge, along with double the system RAM, as it appears cost is no concern for this build.

    Otherwise I don't understand the purpose of having both an A6000 and 3090 on the same rig as the A6000 has twice the VRAM of the 3090. 

    Post edited by kyoto kid on
  • WendyLuvsCatz Posts: 37,885
    edited April 2021

    as far as I am concerned this has destroyed what was once a community I enjoyed being part of.

    in the old days back in 2010 my lousy Dell laptop was able to run Poser and Carrara (not DS3 terribly well)

    my lack of hardware did not exclude me from using DAZ 3D assets


     

    (I admit I still will likely buy older products on sale sometimes, but I also don't wish to support, with my custom and money, a company that doesn't have my interests at heart)

    certainly nothing G8.1, or anything using features introduced after DS 4.11, is of much use to me with my hardware

    Post edited by Chohole on
  • Paintbox Posts: 1,633
    edited April 2021

    Yeah, people forget that if this continues for long, DAZ will lose its community simply because there are no cards for hobbyists and even professionals to render Iray with.

    So it's absolutely wonderful to see how they thought NFTs would be "great" for the community. Enough said.

    https://en.wikipedia.org/wiki/Tulip_mania

    Post edited by Paintbox on
  • Taoz Posts: 9,739

    kyoto kid said:

    ...most likely with the years I have remaining I'll still be dealing with the standard CPU + GPU.

    Not holding my breath on ARM technology yet, as while Apple's been an innovator they've also had a few failures as well (Lisa, NeXT, the "trash can" Pro, and the iMac Pro). They also lost me with their marketing and pricing philosophy, particularly when they went back to "closed architecture" after the Mac II, PowerPC, and original "Cheesegrater" (which meant that upgrading was at their prices and sometimes involved buying a completely new, more powerful version of the same machine). Issues with software compatibility and their OS are another (as an example, the fact that Daz still doesn't have a version of their software ready for the "Big Sur" OS after over five months).

    PCs with separate CPUs and GPUs may seem "clunky" in comparison to the ARM concept, but they get the job done and at a more affordable cost to the end user (well, save for GPUs right now, thanks to the latest "Crypto Rush" and chip shortage).

    +1

  • IceCrMn Posts: 2,114

    Paintbox said:

    Yeah, people forget that if this continues for long, DAZ will lose its community simply because there are no cards for hobbyists and even professionals to render Iray with.

    So it's absolutely wonderful to see how they thought NFTs would be "great" for the community. Enough said.

    https://en.wikipedia.org/wiki/Tulip_mania

    We are already there.

    Getting a new GPU just isn't going to happen, unless you are willing to pay thousands of dollars for a card that retails for a few hundred.

     

  • Yeah, very glad I finally managed to build my current rig at the start of 2020, before everything hit. I'd intended to build it at the start of 2019, but my husband got laid off and that put the whole thing off for a year. If I'd waited two more months, I'd have had to put it off again, and quite likely I'd still be putting it off now, or getting on the waiting list for a prebuilt rig, or gutting my old system for the graphics card. (That old system went into a custom-built virtual pinball rig; we're stuck there because I can't really upgrade that computer's graphics right now to take full advantage of the new screen I'm dropping into it in a couple of weeks.)

    Apparently the specific chip shortage that's hitting graphics cards AND cars is some display controller IC.

    I have no doubt that nVidia (and other makers of said cards) may be looking at the money they could make just building mining cards, or at least repurposing production that would normally have gone into graphics cards they can't make anyway, because the ICs those need (and mining cards don't) just aren't available in the quantities demand requires.

    I also expect we'll start seeing the MSRP on graphics cards rising as it's clear demand isn't abating.

  • kyoto kid said:

    ...most likely with the years I have remaining I'll still be dealing with the standard CPU + GPU.

    Well, of course each of us has to decide the best course for our own personal situation.

    Not holding my breath on ARM technology yet, as while Apple's been an innovator they've also had a few failures as well (Lisa, NeXT, the "trash can" Pro, and the iMac Pro). They also lost me with their marketing and pricing philosophy, particularly when they went back to "closed architecture" after the Mac II, PowerPC, and original "Cheesegrater" (which meant that upgrading was at their prices and sometimes involved buying a completely new, more powerful version of the same machine). Issues with software compatibility and their OS are another (as an example, the fact that Daz still doesn't have a version of their software ready for the "Big Sur" OS after over five months).

    I never said that ARM would create a utopia.  Only that the current pinch may be the impetus for this new architecture to make some big changes in the industry going forward. You're right that we can't keep going on like this with gaming cards disguised as $4,000 "everyday solutions". I would say that the current way is simply not supportable.

    The more people who DON'T upgrade, the fewer people to make use of new software and technologies down the road.  And THAT is not sustainable in any way.

    PCs with separate CPUs and GPUs may seem "clunky" in comparison to the ARM concept, but they get the job done and at a more affordable cost to the end user (well, save for GPUs right now, thanks to the latest "Crypto Rush" and chip shortage).

    "Clunky" isn't even on my radar; not sure where you got that, or if you were responding to me on that point.  If they fit in the case and they're compatible with all the other stuff in a build, then they're not clunky.  My laptop has an RTX 2080 in it.  It fits into the laptop form factor and it's compatible, so by definition, it's not clunky.  But it does get scaldingly hot under load, yes it does.  So maybe this isn't a laptop after all, but a "lapcooker"?  But I digress...

    And you say "save for GPUs right now", but I submit to you that maybe this time it's a sea-change that we're going through, and not just a tide that comes in and goes out every 12.5 hours.  Meaning that this is not necessarily a "right now" kind of situation, and that maybe, just maybe $4,000 GPU cards IS WHAT THEY COST.

    The marketing people at Nvidia surely know that eventually the Crypto craze will experience major attrition.  It must, because that is the way of things.  People will figure out that they can't make money at it (or that only a very few big players actually have any chance of making money at it), and then they will start losing interest.  People will stop buying and prices will come back to earth.  And THEN, crypto miners will start getting out of it, and they'll put their used gear on the used market, which will absolutely crash the market for new GPUs for a period of time.

    I suggest that Nvidia's and AMD's marketing soothsayers already know that this is a risk.  They know that R&D needs constant funding injections.  They know that if the market crashes, then a lot of things become more risky and business becomes more dangerous; more prone to failure.

    Businesses don't like uncertainty.  Markets hate uncertainty.  And companies need to know what supplies, ping, power, pipe, and feedstocks are going to cost, and what they can charge for products.  It would be in Nvidia's and AMD's best interests to be able to meet demand with current or somewhat growing supply.  When the demand chart goes parabolic and starts to look like the left side of a Christmas tree or the Eiffel Tower, companies would do well to resist the temptation to lay out cash for new factories and so forth, because parabolic left-side charts often acquire a matching right-side.  And then, look out below!

     

    magog_a4eb71ab said:

    Got my hands on an RTX 3090 (or rather, will as soon as it arrives in about a week) from a legit source: $4,000 surprise.

    This is a perfect example of somebody doing exactly what they need to do.  In this case, "pay up" is his decision.  It's not an invalid decision!

     

    IceCrMn said:

    MSRP on the 3090 is $1,499.

    I simply won't pay these ridiculous prices.

    I'll just do without a new Nvidia card and my 2060 will have to be good enough.

    Seems everyone is catering to the miners these days and by the looks of it that won't be changing any time soon.

    I honestly believe Nvidia will abandon gaming and all other endeavors to focus their entire company on crypto-mining in the near future.

    And here is another person who has decided what he needs to do.  Different from KK and Magog above.

    People are already making their choices, in their own ways in their own time.  Me?  I've chosen to go outside and get my garden ready for spring planting.  Then I'll come back in a couple hours and maybe grill a wild boar sausage for lunch.

    It's not that I can't afford a $4,000 3090 card. It's what I want. But the price is a barrier for me because I plan to have a good, fun, and pleasant retirement one day, and maybe art will be a part of that (I sure hope it is). It's not that I have decided against a $4,000 card, oh no. But on balance, I'm definitely leaning AWAY from such a purchase, so it's more accurate, I think, to say that I have decided not to buy a $4,000 card today. Well, it's still early here in Florida, so maybe it's more accurate to say that I've decided definitely not to buy a $4,000 card before lunchtime. And probably not this afternoon either. cheeky

    As the saying goes, we live in interesting times.  smiley

  • Philippi_Child Posts: 648
    edited April 2021

    Very content with what I have, and if it means using older content like G3 and G8 (eventually G8 will be put out to pasture), so be it. I'm just a lowly hobbyist who wants to have fun and relax. So I'll stick with the PC I built 2 years ago. I read an article in PC Mag back in Feb where Nvidia was dusting off outdated GPUs because of the worldwide chip shortages. They planned on re-releasing GTX 1050 Ti and RTX 2060 cards. Whether this happened I am not sure, since I am not in the market for a graphics card, but given the seriousness of the shortage these cards could have doubled in price by now. This chip shortage is forecast to continue into late 2022.

    But there is a bright side it's becoming golf weather!!

    Post edited by Philippi_Child on
  • The M1 discussion is, as acknowledged, highly speculative and not really relevant to this thread - please drop it before it turns into platform warring.

  • Faeryl Womyn Posts: 3,316

    Paintbox said:

    Yeah, people forget that if this continues for long, DAZ will lose its community simply because there are no cards for hobbyists and even professionals to render Iray with.

    So it's absolutely wonderful to see how they thought NFTs would be "great" for the community. Enough said.

    https://en.wikipedia.org/wiki/Tulip_mania

    Why would Daz have to lose its community just because other companies can't keep up with the demand for a product? It's not Daz's fault. Also, don't most of you already have working computers turning out Iray renders, with lots of cores and RAM and other stuff?

    Having the newest, most expensive, rarest, biggest...etc etc etc anything doesn't mean anyone's work will improve, or that without it they'll be left behind. Everyone seems to be forgetting a very important aspect of what anyone does: it's called skill.

  • Torquinox Posts: 2,637
    edited April 2021

    Skill means nothing when your hardware dies and you can't get anything new 'cause there's nothing to buy. Spending $4k+ for a $1500 card or $1500 for a $300 card is beyond the pale. It's stupid, and what is directly to blame? Did you say crypto and NFTs? And what is Daz now peddling? Did you say NFTs? Well then. My observation: we have a righteous rage brewing in this community, and it's only so long before people find other ways to occupy their time, other places to spend their money. It may not come to that, but some of us want some speedy new hardware.

    Post edited by Torquinox on
  • MelissaGT Posts: 2,610
    edited April 2021

    Faeryl Womyn said:

    Paintbox said:

    Yeah, people forget that if this continues for long, DAZ will lose its community simply because there are no cards for hobbyists and even professionals to render Iray with.

    So it's absolutely wonderful to see how they thought NFTs would be "great" for the community. Enough said.

    https://en.wikipedia.org/wiki/Tulip_mania

    Why would Daz have to lose its community just because other companies can't keep up with the demand for a product? It's not Daz's fault. Also, don't most of you already have working computers turning out Iray renders, with lots of cores and RAM and other stuff?

    Having the newest, most expensive, rarest, biggest...etc etc etc anything doesn't mean anyone's work will improve, or that without it they'll be left behind. Everyone seems to be forgetting a very important aspect of what anyone does: it's called skill.

    Ok, so what happens when your current hardware goes belly up? Because it does have a finite lifespan. I'm running a 1080 Ti from August 2017. I also game with it. How many years does it have left? Last I checked, even 1080 Tis are selling for double MSRP (or more).

    Post edited by MelissaGT on
  • Torquinox Posts: 2,637

    melissastjames said:

    Ok, so what happens when your current hardware goes belly up? Because it does have a finite lifespan. I'm running a 1080 Ti from August 2017. I also game with it. How many years does it have left? Last I checked, even 1080 Tis are selling for double MSRP (or more).

    Agreed!

  • j cade Posts: 2,310
    Torquinox said:

    Skill means nothing when your hardware dies and you can't get anything new 'cause there's nothing to buy. Spending $4k+ for a $1500 card or $1500 for a $300 card is beyond the pale. It's stupid, and what is directly to blame? Did you say crypto and NFTs? And what is Daz now peddling? Did you say NFTs? Well then. My observation: we have a righteous rage brewing in this community, and it's only so long before people find other ways to occupy their time, other places to spend their money. It may not come to that, but some of us want some speedy new hardware.

    The thing is (and you even say it), you don't "need" the $1500 card to use DS, whether it's at MSRP or some ridiculous markup.

    It's one thing to say "I want super fast renders, so I want a high-end GPU, and it sucks that you can't get one right now"; it's a very different claim that "no one is going to use DS because no one can get a machine that has the specs to run it", using that high-end GPU as your proof.

    You don't *need* a custom-built computer with the latest and greatest GPU to use DS. I've never used DS on anything other than midrange laptops. Personally, I can't imagine ever spending $1,500 on a GPU. I have never spent that much on an entire computer.

    I'm not saying the state of the GPU market isn't awful, or that it's illegitimate to be upset about it. Just that tying this to a complete inability to use DS at all is a massive overstatement.
  • Galaxy Posts: 562

    If this global price madness continues, then in the future I will use 3Delight-only rendering. I will buy or look for a used PC. I will use outdated PC components and probably outdated software versions, but I cannot give up rendering...

  • Torquinox Posts: 2,637

    j cade said:

    Torquinox said:

    Skill means nothing when your hardware dies and you can't get anything new 'cause there's nothing to buy. Spending $4k+ for a $1500 card or $1500 for a $300 card is beyond the pale. It's stupid, and what is directly to blame? Did you say crypto and NFTs? And what is Daz now peddling? Did you say NFTs? Well then. My observation: we have a righteous rage brewing in this community, and it's only so long before people find other ways to occupy their time, other places to spend their money. It may not come to that, but some of us want some speedy new hardware.

    The thing is (and you even say it), you don't "need" the $1500 card to use DS, whether it's at MSRP or some ridiculous markup.

     

    It's one thing to say "I want super fast renders, so I want a high-end GPU, and it sucks that you can't get one right now"; it's a very different claim that "no one is going to use DS because no one can get a machine that has the specs to run it", using that high-end GPU as your proof.

     

    You don't *need* a custom-built computer with the latest and greatest GPU to use DS. I've never used DS on anything other than midrange laptops. Personally, I can't imagine ever spending $1,500 on a GPU. I have never spent that much on an entire computer.

     

    I'm not saying the state of the GPU market isn't awful, or that it's illegitimate to be upset about it. Just that tying this to a complete inability to use DS at all is a massive overstatement.

    Sure. It's okay to limp along with substandard hardware, even though we all know the latest products are more resource intensive than ever. Maybe some of us are tired of limping along.

  • Torquinox Posts: 2,637
    edited April 2021

    Galaxy said:

    If this global price madness continues, then in the future I will use 3Delight-only rendering. I will buy or look for a used PC. I will use outdated PC components and probably outdated software versions, but I cannot give up rendering...

    I'm already doing that. My system is old and still soldiering on. I was looking forward to being able to do Iray, to running 3D Coat without giving my system apoplexy, to doing cloth simulation and other resource-intensive operations, to being able to efficiently render some animations, etc. Now all that is caught in a holding pattern. Not happy, no.

     

    Anyway, I vented a lot more in this thread than I usually do, but I will leave my posts as I am reasonably certain I am not the only one feeling this way right now.

    Post edited by Torquinox on
  • Richard Haseltine said:

    The M1 discussion is, as acknowledged, highly speculative and not really relevant to this thread - please drop it before it turns into platform warring.

    Sure, I'll let it drop.  To platform war for or against ARM in its current state would be akin to going to battle with a bag of Beanie Babies, soup ladles, ice cream scoops, the book "Catcher in the Rye", and catnip.  There'd be no way to win such a battle, although the ensuing pandemonium might be funny to watch.  But then anytime catnip is involved, pandemonium and hilarity usually follow forthwithedly; that other stuff is unneeded. wink

     

  • Torquinox said:

    Skill means nothing when your hardware dies and you can't get anything new 'cause there's nothing to buy. Spending $4k+ for a $1500 card or $1500 for a $300 card is beyond the pale. It's stupid, and what is directly to blame? Did you say crypto and NFTs? And what is Daz now peddling? Did you say NFTs? Well then. My observation: we have a righteous rage brewing in this community, and it's only so long before people find other ways to occupy their time, other places to spend their money. It may not come to that, but some of us want some speedy new hardware.

    Sure it's stupid, but it is what it is. No amount of kicking & screaming is going to change it. I don't like it or butcoin mining any more than you do. The shortages are going to continue probably well into mid-2022, and I've already been kicking the can down the road long enough when it comes to replacing my laptop, so if anything it's probably even more stupid to defiantly sit around waiting for a miracle surplus to appear out of thin air when the prices are going to climb even higher. I probably could have even gone with a lesser card for half the price, but my system specs & the environment it's going to be in require a decent cooling solution for the card when rendering, which meant going with the Kingpin RTX 3090. Being that I could not get two of them, the only available alternative was a high-end Quadro RTX A6000, which runs at lower wattages and doesn't throw off as much heat inside the case. I'll be able to game and have a render going at the same time with no issues, OR I'll be able to do it quicker using both cards.

    So if I can swing it now, that's what I'm going to do. Stupid? Yeah, because I waited so long up to this point that I have to pay more. It would be even more stupid to wait longer, especially since I do need a new system and don't want to buy pre-builts or cookie-cutter systems that have limited options.

    I should also point out that Dell and Digital Storm still have RTX cards (all the way up to and including 3090s) available for their systems, so unless you're like me and very picky, or you don't need a new desktop, you can get them without being scalped. The only catch is that you need to purchase the whole system, so it might work for some, while for others not so much.

  • IceCrMn said:

    MSRP on the 3090 is $1,499.

    I simply won't pay these ridiculous prices.

    I'll just do without a new Nvidia card and my 2060 will have to be good enough.

    Seems everyone is catering to the miners these days and by the looks of it that won't be changing any time soon.

    I honestly believe Nvidia will abandon gaming and all other endeavors to focus their entire company on crypto-mining in the near future.

    It's not going to happen. I could be wrong, but I don't think NVidia is dumb enough to make that kind of move. They might ramp up production a little, but beyond that, they're not going to invest in new facilities & such just for the sake of selling more graphics cards to a market segment that is based on speculation and uncertainty with a whole lot of risk. And they're certainly not going to abandon gaming for crapto-currency. Gaming is more reliable economically in the long term, while crapto-currency is not.

    The whole crapto-currency madness is no different from any other popular fad, past or present. Eventually it will become a s***streak that everyone will want washed out. They're learning that in Turkey now, where the top guy of one crypto-currency exchange decided to leave with over $2 billion of everyone's money, leaving users high and dry.

  • outrider42 said:

    This is about crypto. While COVID played a big role, it is ultimately crypto that has driven costs the way they are. With crypto at record highs, it is just too enticing to buy a card. People will buy a GPU even if they don't need it, because they can pay it off thanks to mining and make some cash doing nothing. Mining is the ultimate way to make money for nothing; Dire Straits can't top that. This is why prices are to the moon: because enough people are willing to pay them.

    It is not simply supply; supplies have been terrible for hardware launches in the past, and scarce hardware is absolutely not new. In previous launches cards were very hard to get for weeks and months, but that did not cause prices to triple or inflate the way they have recently. Just because they were hard to get didn't mean people were paying triple for them. I think this point gets lost on a lot of people. Nvidia has actually produced quite a lot of 3000 series cards. They routinely restock numerous cards every couple weeks.

    I refuse to believe that demand is that high purely because more people are at home. Most basic computers are just fine for working at home; you don't need a freakin' RTX 3090 to take phone calls remotely. And if you are working at home, you should be working; you are not getting more game time. No, the demand is due to GPUs becoming money printers, simple as that.

    In years past consoles would be scalped on ebay during launch. But even back then the prices were not so insane, even with inflation. You might see one or two units get sold for a wild price, but most scalping sales were just a couple hundred higher than MSRP. Today everything is wack.

    I know what you might say: "but the consoles do not mine". No, but that doesn't mean the buyer hasn't made money mining, does it? Crypto has introduced a whole lot of new currency into the market, and while COVID has hurt some businesses, there are plenty of people who have done extremely well during COVID (some by mining, some by other means). Enough people to be able to buy goods at almost any price and not care about the cost. The gap between the rich and the poor is widening, and right now the rich are flexing their wallets and inflating everything beyond normal people's reach.

    If crypto just suddenly died tomorrow, GPU prices would fall drastically within days afterward.

    And keep in mind that buying a Quadro should not be the answer here. You should only buy a Quadro if you actually need a Quadro, not because the prices of gaming cards are out of control. Having 48GB of VRAM does you no good if your system only has 64GB of RAM installed. Even 128GB may not cut it. Guys, I can hit 50GB+ of RAM in use when I render, and I have two 1080tis; those have 11GB VRAM. That is basically a 5 to 1 RAM-to-VRAM ratio. So if you get an A6000 with 48GB of VRAM, it is possible you will not be able to access the full capacity of that VRAM in 99% of configurations. You may seriously need 512GB of RAM for this system if you wish to use that amount of VRAM. If you use 128GB of RAM, you might only reach around 24GB of VRAM in use...the size of the 3090.

    And the 3090 is also faster than the A6000. It is not drastically faster, but it is faster. That might open the door for the A5000, I suppose, but then the question is why you would go for a slower and still very expensive card? The Quadro line has advantages, but most of them do not apply to us here. The only real advantages are that they might be smaller and use less power than the gaming versions. That is pretty much it. They can do TCC mode, but recent Windows updates have made TCC mode less important, since Windows no longer gobbles up as much memory. On my 1080tis, the one connected to the monitor has actually reported 11GB in use while the 2nd card without a monitor has reported 10.8GB in use during renders. That was also when I had 50GB used in RAM. No other apps were open.

    I will probably never use 48GB of VRAM, lol. The only time I would ever come close is if I use both the 3090 & A6000. I think the most I would ever use is 16-18GB (for now). If & when I'm actually able to and need to use the full potential of the card, I'll either have a separate smaller workstation for it, or I'll just swap out the motherboard if necessary. This is just to hold me over until I have more options available, while having something that I can potentially build upon, because right now the market in general just sucks.
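
    For what it's worth, here's a quick back-of-envelope sketch of the ratio math from the quote above (the ~5:1 figure is one user's observation, not a fixed rule; actual usage varies a lot by scene):

    ```python
    # Back-of-envelope sketch of the RAM-to-VRAM argument quoted above.
    # The ~5:1 ratio comes from one observation (~50 GB RAM while filling
    # 11 GB cards) and is an assumption, not a fixed rule.
    RAM_TO_VRAM_RATIO = 5.0

    def system_ram_needed(vram_gb: float, ratio: float = RAM_TO_VRAM_RATIO) -> float:
        """Estimate the system RAM needed to actually fill a card's VRAM."""
        return vram_gb * ratio

    for card, vram in [("RTX 3090", 24), ("RTX A6000", 48)]:
        print(f"{card}: ~{system_ram_needed(vram):.0f} GB system RAM to fill {vram} GB VRAM")
    # RTX 3090: ~120 GB; RTX A6000: ~240 GB, i.e. a 256 GB+ build in practice.
    ```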

  • Faeryl Womyn said:

    A number of you have heard me go on about my 16 year old computer. I can use Daz; the only drawback is that I can't use dForce, and Iray is a major pain. Even 3Delight has been giving me grief lately, though that's mainly due to the fact that I recently discovered how much dust is in this thing...lol  The fans hate me right now...lol

    Thanks to someone who heard my rants about my computer, I now have a newer one with lots more RAM and a 650 Nvidia card, and you know what, considering how much of an upgrade that is from the old one, I'm happy with that. Of course I do want a better GPU at some point, and maybe even more RAM at some point, but that doesn't change the fact that Iray is way less of a headache and I can do dForce now, so ya...I'm happy.

    If I can spend 6 years on such an old computer, can't many of you spend a year or 2 on your current one till the prices start to come down?

    Absolutely not. I've been on this laptop with a GTX 765M since 2013, and at the time I thought it was a good replacement for a desktop. That held true until I started working in Daz. Practically every scene I render gets kicked off to the CPU, so any kind of rendering with Iray is painfully slow. I'm also unable to use dForce.

  • kyoto kid Posts: 40,593
    edited April 2021

    Galaxy said:

    If this global price madness continues, then in the future I will use 3Delight-only rendering. I will buy or look for a used PC. I will use outdated PC components and probably outdated software versions, but I cannot give up rendering...

    ...I considered doing so a couple years ago, and to that end purchased several 3DL utilities, including RiverSoft's/SickleYield's RSSY Iray to 3DL Converter, Parris's IBL Master, and Wowie's AweShader system.  I was working on this scene as an experiment when a drive crash took everything. Render time: about 14 minutes. The Iray version: multiple hours (CPU mode).

    The attachment has a bit of post-processing in Exposure 3 to give it a 1960s photo look.

     

    Attachment: bus stop 1960s photo.jpg (1500 x 1125, 2M)
    Post edited by kyoto kid on