Holy Cuda Cores, Renderman! My GPU has tripled in price!


Comments

  • Torquinox Posts: 2,637

    I read that increased demand from AI is contributing to the RAM & SSD shortage. It seems like there is a pool of hardware-intensive technologies, coupled with increased demand due to Covid, that is working together to hoover up everything. It's a seller's market, and it may stay that way for quite a while.

  • Gogger Posts: 2,313

    melissastjames said:

    Gogger said:

    JamesJAB said:

    Yes, you can configure a base Alienware R10 (AMD) or R12 (Intel) with an RTX 3090 for about $3000... you know, the same price that the card alone is going for on eBay.

    That's how I got my RTX3090 early this year. Kick butt video card in a kick butt computer! It is my fourth AWESOME Alienware desktop in ten years. Three still get used every day. (I just don't have room for the fourth to be operational, and it is a bit dated now anyway). I have second guessed myself many times since this last purchase, but I am glad I did it when I did - things have only gotten worse.

    But what model 3090 did you get? All the configurators I priced out didn't even specify, which means you get whatever they have in stock. I specifically want the EVGA RTX 3090 FTW3 Ultra, and I haven't seen any OEM build houses offering EVGA cards at all, let alone that specific one.

    I'd have to check actual card specs when I get home, but I do know it was NOT a straight up Nvidia card like I would have preferred. I honestly don't have a strong preference for any particular card so I was easy to please. I hope you are able to find exactly what you are looking for!  

    This scalper crap sure ruins A LOT and the crypto craze robbing the general hobbyist markets is disappointing as well. I wonder if this is just how it is now, or if things will eventually get better? 

  • Torquinox Posts: 2,637

    melissastjames said:

    Torquinox said:

    As has been mentioned before, an alternate strategy is to order a ready-made machine with the GPU already installed. You can usually get the machine in a couple weeks. But even there, the price is only a bargain compared to what's going on with GPUs alone. And you may not be able to get the very specific version of 3090 that @melissastjames seeks. 

    I tried a few different build configs online (CyberPowerPC, OriginPC, etc) and I couldn't get the card I want, or the exact combination of case/ram/mobo that I want...and by the time I had something even comparable to the build list I put together on my own, the price was pretty much the same...including the current scalper price that PC PartPicker is showing for the GPU. So no savings there.  

    I believe it. Prices are up! It seems to be the way things are now.

  • droidy001 Posts: 277

    Think I've got a 560ti with 1 dodgy fan somewhere. I'll have to look for it.

     

    Anyway, I ordered the cheapest system I could that allowed me to add a 3060.

    The site said they had them in stock, but alas they were telling fibs. I got an email the same day saying they were due in the following day, but since then, nothing. The tracker shows they haven't started the build on my system yet.

     

    All sorts of plans going through my head. It has a 3400G, so I could harvest the GPU and sell the rest as a full budget PC, or keep it as an HTPC.

     

    One thought was to swap the GPU with my Titan X and the CPU with my Ryzen 2600; at current prices, that would nearly pay for the upgrade.

  • Best Buy online keeps selling a few Nvidia 3090 cards every few weeks it seems at US$1499 in the late morning or early afternoons but it's hard to catch...I came close a few months ago when they had stock but I wasn't fast enough ordering.

  • droidy001 Posts: 277

    Best Buy online keeps selling a few Nvidia 3090 cards every few weeks it seems at US$1499 in the late morning or early afternoons but it's hard to catch...I came close a few months ago when they had stock but I wasn't fast enough ordering.

    Totally sucks, those bots are fast. Retailers are saying that they're trying to stop them, but tbh I don't think they care as long as they're selling.
  • IceCrMn Posts: 2,114

    StratDragon said:

    I knew prices had gone up, I bought a $279 1660Ti in 2019, it's almost $1,000.00 US now
    Apparently this is not the time to buy a GPU.

     

    Posted this when I saw an RTX 3090 listed for $5,450 on newegg

    https://www.daz3d.com/forums/discussion/475796/current-gpu-cost-and-availability-issues/p1

     The card sold later that same day.

  • Yeah, the whole thing is just nutso.

    I picked up a Titan RTX at list price from New Egg last year and then 6 months later they announced the 3090 and I about had a heart attack when I saw the specs and the price.

    Flash forward 9 months later and now you can't find either card to save your life. I'm so glad I picked up the Titan RTX when I did.

  • y3kman Posts: 765

    You're forced to buy a prebuilt if you want a new video card now.

  • SlimerJSpud Posts: 1,453

    There's a worldwide chip shortage affecting many industries. It's not due to crypto, it's due to viro. Fallout from the pandemic reaches far and wide. Supply and demand, folks. First it was the pandemic, but now it's more about buying habits. People decided they LIKE working from home ==> more computers, more screens, more new phones.

  • kyoto kid Posts: 40,593
    edited April 2021

    y3kman said:

    You're forced to buy a prebuilt if you want a new video card now.

    ...that's why I'm considering the A5000 and being done with it. I already have two solid and very serviceable DIY systems; older, yes, but they still work. So on the render system I have 12 CPU threads instead of 16, and PCIe 2.0; not that big a deal, as the latter only means it takes a bit longer to load a scene into VRAM.

    Crikey, for the price of that one seen at Newegg that was mentioned above, I could get two A5000s and the NVLink bridge (48 GB VRAM plus a total of 16,384 CUDA, 512 Tensor, and 128 RT cores), and still have enough change in my pocket to get six 8 GB sticks of DDR3 1333.

    Post edited by kyoto kid on
  • PerttiA Posts: 9,537

    MeneerWolfman said:

    Yeah, the whole thing is just nutso.

    I picked up a Titan RTX at list price from New Egg last year and then 6 months later they announced the 3090 and I about had a heart attack when I saw the specs and the price.

    Flash forward 9 months later and now you can't find either card to save your life. I'm so glad I picked up the Titan RTX when I did.

    I have also been patting myself on the back as I bought an RTX 2070 Super in June last year, at the time when everybody was saying "Don't do it... The new, cheaper and better 30xx cards will be just around the corner..." 

  • Joe2018 Posts: 238

    German tech magazines ran a statement from Nvidia that prices will stay high for the whole year. The magazines are also currently saying that RAM prices will increase between 20 and 30%.

    So I am very happy that I got my new PC a few days ago: with an RTX 3070. The whole PC (a Lenovo Legion 5i) was cheaper than the graphics card alone - yes, that is crazy!

    PS: it looks like there is a shortage of components for many products. The prices for scanners and printers increased between 100 and 300% in the last half year! (in Germany)

  • Eventually, there will be a market correction and prices will crash.

    Don't know when that will happen, but that's when I'll get my 2 3090s for $750 each.  In the meantime, there's lots of other things to do.

  • dyret Posts: 182

    I just got an HP workstation from 2012 with a 1050 in it for free. 4 GB is a little too little, so I looked for another one, and these cards are now about 300 dollars used in Norway.

  • nonesuch00 Posts: 17,944

    Even my PNY GTX 1650 Super 4GB that can't even do RTX is selling used for $550 on Amazon. In fact, less than a year ago I was given that card free because they weren't selling at the $175 list price. I'd say it's people fishing and not getting bites except I sold my AMD Radeon RX 570 8GB MK2 OC for $350 about 3 months ago.

    Sort of glad I ain't doing much rendering lately as I wouldn't want to burn out my video card in this situation.

  • LeatherGryphon Posts: 11,189

    StratDragon said:

    I knew prices had gone up, I bought a $279 1660Ti in 2019, it's almost $1,000.00 US now
    Apparently this is not the time to buy a GPU.

     

    Yeah, I bought my 6GB GTX 1660 at about the same time. Just for jollies I went looking to see if they were still available. Prices shocked me when I finally did find one. This is crazy.

    Now I'm looking for a basic 12GB RTX 3060 for anywhere near the MSRP (Manufacturer's Suggested Retail Price), or even under $500. Almost snagged one yesterday, but I wasn't quick enough. But just my luck, they're still available at Newegg if you want to shell out $1400 (US). But beware of scams.

  • ColinFrench Posts: 641

    AllenArt said:

    I've heard the same price jump is about to happen with ram and processors too.

     

    And SSD prices are beginning to jump because of a new crypto called Chia. From a recent RPS article:

    "Buying a new SSD could be about to become as difficult as buying a new, next-gen graphics card in the coming months thanks to the rise of a new cryptocurrency known as Chia. According to a new report from HKEPC (thanks Hexus), storage prices in Hong Kong have soared in recent days as miners snap up high capacity drives in anticipation of Chia being the next big thing, causing several models to go out of stock. Now, manufacturers in China have confirmed they're starting to produce dedicated cryptocurrency mining SSDs to help handle the growing demand."

    Full (depressing) article here.

     

     

  • N-RArts Posts: 1,437

    Oh, well. It's a good job that my HDD failed when it did, or else I wouldn't have been able to afford the SSD that now sits in its place.

     

  • McGyver Posts: 7,005

    The great thing to keep in mind is that after demand dies down a bit and the prices are done skyrocketing, they will hardly go down much and the new higher prices will be the accepted "average" customers are willing to pay... so win-win!
    If you have money invested in those companies... otherwise we're all screwed... and not in the good way.

  • Subtropic Pixel Posts: 2,378
    edited April 2021

    McGyver said:

    The great thing to keep in mind is that after demand dies down a bit and the prices are done skyrocketing, they will hardly go down much and the new higher prices will be the accepted "average" customers are willing to pay... so win-win!
    If you have money invested in those companies... otherwise we're all screwed... and not in the good way.

    Or maybe not.

    EVERYTHING changes with Apple's new M1 "System on a Chip" (SoC).  The new chip, based on the ARM architecture, comes with its own multi-core CPU, multi-core GPU, and RAM all built in.

    How does this change things?

    Well, the CPU components are split into "efficiency cores" and "performance cores".  In the current M1 chip, there are 4 of each of these cores.  The former are low-power and low-heat cores that run that "usual stuff" like word processing tasks, or browsers.  The stuff we spend 90% of our computing time making use of.  The latter cores engage when there's real work needed.  They schedule and dispatch the work quickly so that the performance cores can go back to sleep right away; nighty night! 

    I haven't heard if the built in GPUs behave similarly, but if you think about it, this could be done in a future iteration of the chip.  Have multiple GPU cores, and just let them scale up and come into service as there's work to be done; then shut them down when the work is completed.  Surely with some thought, the engineers could make something that could recognize the difference between graphic rendering work and non-graphic "GPGPU" (General Purpose GPU eligible) work.

    And then there's the RAM, also built in.  So far, SSD/Hard Drive space is not a part of the M1 SOC, which is probably just as well for people like me who buy them two-at-a-time (for production+backup).

    Apple is putting the M1 into iMacs, MacBook Pros, and iPads.  The chip has a translation feature called "Rosetta" that allows "old fashioned" X86 and X64 programs to run well, but long-term, Apple wants to see developers create "universal binaries" that can run on the old X86 and new M architectures equally well.

    So far, the M1 chips have been making a good showing.  As fast as the Intel parts while generating less heat (and by extension, generating less noise too).  I think this may be promising, not only for the future of CPU or "compute" type processing, but maybe also for graphic processing (that is, processing of graphics), as well as general purpose graphic processing too (including folding, boinc, seti, and crypto).

    Yes, this is all Apple.  But maybe all it takes to change an industry is one thought-leader willing to stick their neck out.

    And all of this COULD cause a crash in the old-fashioned Intel/AMD/Nvidia ecosystems.

    Post edited by Subtropic Pixel on
  • StratDragon Posts: 3,167
    edited April 2021

    y3kman said:

    You're forced to buy a prebuilt if you want a new video card now.

     

    Even that is hit or miss. I waited 6 months for an Alienware build that they kept saying was in stock for the first 3 months; then, after about 30 calls, they explained they didn't have the GPU. Alas, this build was for a relative; I'm still duct-taping my 2010 Mac Pro.

     

    Subtropic Pixel said:

    Yes, this is all Apple.  But maybe all it takes to change an industry is one thought-leader willing to stick their neck out.

    Apple has at least two generations to realize the potential or failure of the M1, IMHO. I will keep my distance for now. Crypto miners are still buying cards like Americans buy bullets; eventually prices will fall, but it may take Bitcoin falling, and hard.

     

    Post edited by StratDragon on
  • Taoz Posts: 9,739

    Be careful where you buy - this is the same RAM sold by Corsair on Amazon versus on Corsair's own site:

    https://www.amazon.co.uk/Corsair-CMK32GX4M2B3200C16R-Vengeance-Performance-Desktop/dp/B01ARHF6Z4/

    https://www.corsair.com/us/en/Categories/Products/Memory/VENGEANCE®-LPX-32GB-(2-x-16GB)-DDR4-DRAM-3200MHz-C16-Memory-Kit---Red/p/CMK32GX4M2B3200C16R

    - it costs about 5 times as much on Amazon. I bought two sets last year in a local store, at about the same price as on the Corsair site.

  • his x Posts: 866
    edited April 2021

    AllenArt said:

    Torquinox said:

    AllenArt said:

    I've heard the same price jump is about to happen with ram and processors too.

    That's horrifying and already in progress, I think.

    More we can thank crypto for. 

    I've been curious about the cause of the chip shortage myself. The consensus seems to be the lockdown is the primary cause. Demand for gaming hardware is up, although yes, bitcoin mining could be a factor.

    Intel says they will increase their number of chip fabs in Arizona, and will also offer their fab services to third parties, which they do not do now. However, they expect this expansion to take two years to complete.

    Post edited by his x on
  • Greycat Posts: 332

    All kinds of things use computer chips. There's a shortage of new cars because they can't get chips for them. At the same time as the demand for chips went up, Covid hit and the companies that make chips cut back. All chips are made overseas now, in Asia, with the situation with China making things even worse. It's unlikely that the price of chips will come down in the near future, if ever.

  • kyoto kid Posts: 40,593
    edited April 2021

    ...will an M1 system be good for rendering Iray on? Likely not, given that Apple and Nvidia parted ways a few years ago. The latest macOS no longer even supports Nvidia drivers.

    Post edited by kyoto kid on
  • nicstt Posts: 11,714

    My advice, wait.

    Do you actually need it, or is it just a want?

    Eventually, you'll get one.

    I did; I just kept looking every few days, and eventually picked up a 3090 for £1600 from my usual supplier.

  • kyoto kid Posts: 40,593
    edited April 2021

    ...still considering an A5000, as it is a dual-slot rather than triple-slot card and consumes 120 W less than the 3090. This may finally be the time to make the transition to a workstation card.

    Post edited by kyoto kid on
  • McGyver Posts: 7,005

    Subtropic Pixel said:

    McGyver said:

    The great thing to keep in mind is that after demand dies down a bit and the prices are done skyrocketing, they will hardly go down much and the new higher prices will be the accepted "average" customers are willing to pay... so win-win!
    If you have money invested in those companies... otherwise we're all screwed... and not in the good way.

    Or maybe not.

    EVERYTHING changes with Apple's new M1 "System on a Chip" (SoC).  The new chip, based on the ARM architecture, comes with its own multi-core CPU, multi-core GPU, and RAM all built in.

    How does this change things?

    Well, the CPU components are split into "efficiency cores" and "performance cores".  In the current M1 chip, there are 4 of each of these cores.  The former are low-power and low-heat cores that run that "usual stuff" like word processing tasks, or browsers.  The stuff we spend 90% of our computing time making use of.  The latter cores engage when there's real work needed.  They schedule and dispatch the work quickly so that the performance cores can go back to sleep right away; nighty night! 

    I haven't heard if the built in GPUs behave similarly, but if you think about it, this could be done in a future iteration of the chip.  Have multiple GPU cores, and just let them scale up and come into service as there's work to be done; then shut them down when the work is completed.  Surely with some thought, the engineers could make something that could recognize the difference between graphic rendering work and non-graphic "GPGPU" (General Purpose GPU eligible) work.

    And then there's the RAM, also built in.  So far, SSD/Hard Drive space is not a part of the M1 SOC, which is probably just as well for people like me who buy them two-at-a-time (for production+backup).

    Apple is putting the M1 into iMacs, MacBook Pros, and iPads.  The chip has a translation feature called "Rosetta" that allows "old fashioned" X86 and X64 programs to run well, but long-term, Apple wants to see developers create "universal binaries" that can run on the old X86 and new M architectures equally well.

    So far, the M1 chips have been making a good showing.  As fast as the Intel parts while generating less heat (and by extension, generating less noise too).  I think this may be promising, not only for the future of CPU or "compute" type processing, but maybe also for graphic processing (that is, processing of graphics), as well as general purpose graphic processing too (including folding, boinc, seti, and crypto).

    Yes, this is all Apple.  But maybe all it takes to change an industry is one thought-leader willing to stick their neck out.

    And all of this COULD cause a crash in the old-fashioned Intel/AMD/Nvidia ecosystems.

    I don't know... I'd love that to be true, but I don't see the M1 translating into real changes for the Windows world any time soon... and even if it did, nobody would be willing to lower prices; they'd rather try selling outdated hardware at a premium than take a hit to their expectations.
    Physical stores still sell at a discount to get rid of unwanted stock; online stores will always find a rube to pay full price... and nobody is really buying this stuff from a store you walk into anymore.

    That, and nothing ever comes down after a crisis or demand-based price surge, and on the rare occasions when it does, it's still considerably higher than any normal cost-of-living type price increase one would expect to see over the same time span.

    Just look at literally everything after fuel prices shot up after Hurricane Katrina... gas prices stabilized and eventually dropped, but the things we were told were costing more because of increased shipping costs due to fuel prices only stabilized; they didn't go down to reflect the decrease in fuel prices.

    People got used to the prices and companies were like "yeah, that's what it costs now".

    But I'm just extremely pessimistic when it comes to this stuff.

  • outrider42 Posts: 3,679

    This is about crypto. While COVID played a big role, it is ultimately crypto that has driven costs to where they are. With crypto at record highs, it is just too enticing to buy a card. People will buy a GPU even if they don't need it, because they can pay it off through mining and make some cash doing nothing. Mining is the ultimate way to get money for nothing; Dire Straits can't top that. This is why prices are to the moon: because enough people are willing to pay them. It is not simply supply; supplies have been terrible for hardware launches in the past, and scarce hardware is absolutely not new. In previous launches cards were very hard to get for weeks and months, but that did not cause prices to triple or inflate the way they have recently. Just because they were hard to get didn't mean people were paying triple for them. I think this point gets lost on a lot of people. Nvidia has actually produced quite a lot of 3000 series cards; they routinely restock numerous cards every couple of weeks. I refuse to believe that demand is that high purely because more people are at home. Most basic computers are just fine for working at home; you don't need a freakin' RTX 3090 to take phone calls remotely. And if you are working at home, you should be working, so you are not getting more game time. No, the demand is due to GPUs becoming money printers, simple as that.

    In years past consoles would be scalped on ebay during launch. But even back then the prices were not so insane, even with inflation. You might see one or two units get sold for a wild price, but most scalping sales were just a couple hundred higher than MSRP. Today everything is wack.

    I know what you might say: "but the consoles do not mine." No, but that doesn't mean that the buyer hasn't made money mining, does it? Crypto has introduced a whole lot of new currency into the market, and while COVID has hurt some businesses, there are plenty of people who have done extremely well during COVID (some by mining, some by other means) - enough people to be able to buy goods at almost any price and not care about the cost. The gap between the rich and the poor is widening, and right now the rich are flexing their wallets and inflating everything beyond normal people's reach.

    If crypto just suddenly died tomorrow, GPU prices would fall drastically within days afterward.

    And keep in mind that buying a Quadro should not be the answer here. You should only buy a Quadro if you actually need a Quadro, not because the prices of gaming cards are out of control. Having 48GB of VRAM does you no good if your system only has 64GB of RAM installed; even 128GB may not cut it. Guys, I can hit 50GB+ of RAM in use when I render, and I have two 1080 Tis, which have 11GB of VRAM each. That is basically a 5-to-1 RAM-to-VRAM ratio. So if you get an A6000 with 48GB of VRAM, it is possible you will not be able to access the full capacity of that VRAM in 99% of configurations. You may seriously need 512GB of RAM for this system if you wish to use that amount of VRAM. With 128GB of RAM, you might only reach around 24GB of VRAM in use... the size of the 3090.

    And the 3090 is also faster than the A6000. It is not drastically faster, but it is faster. That might open the door for the A5000, I suppose, but then the question is why you would go for a slower and still very expensive card. The Quadro line has advantages, but most of them do not apply to us here. The only real advantages are that they might be smaller and use less power than the gaming versions; that is pretty much it. They can do TCC mode, but recent Windows updates have made TCC mode less important, since Windows no longer gobbles up as much memory. On my 1080 Tis, the one connected to the monitor has actually reported 11GB in use while the 2nd card without a monitor reported 10.8GB in use during renders. That was also when I had 50GB used in RAM, with no other apps open.
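    The RAM-to-VRAM reasoning above can be sketched as a quick back-of-the-envelope estimate. This is only a sketch: the 5:1 ratio is one poster's observed figure from their own Iray renders, not a hard rule, and the card list and function name here are purely illustrative.

```python
# Back-of-the-envelope estimate of how much VRAM a render can realistically
# fill, given the observation that scene data held in system RAM tends to be
# roughly 5x the amount that ends up on the card. The 5:1 ratio is an
# assumption taken from the post above, not a specification.

def usable_vram_gb(system_ram_gb, ram_to_vram_ratio=5.0):
    """Estimate usable VRAM (GB) given installed system RAM (GB)."""
    return system_ram_gb / ram_to_vram_ratio

# Hypothetical card lineup: name -> VRAM in GB
cards = {"RTX 3090": 24, "A5000": 24, "A6000": 48}

for name, vram in cards.items():
    for ram in (64, 128, 256):
        # A card can never use more VRAM than it physically has
        usable = min(vram, usable_vram_gb(ram))
        print(f"{name} ({vram} GB) with {ram} GB RAM -> ~{usable:.1f} GB VRAM usable")
```

    Under that assumed ratio, 128 GB of system RAM caps you at roughly 25 GB of VRAM in use, which matches the point that an A6000's 48 GB would mostly sit idle without a very large RAM budget.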
