GTX vs. Quadro

I see most people are using GTX cards.  I'm using a GTX 780 Ti and a GTX 970.  Is anyone using a Quadro?  I see they're much more expensive, and typically have fewer CUDA cores, but they are made specifically for rendering.  I'm wondering if they outperform the GTX.  I mean, I would think if they are made for rendering, they may go a lot faster despite the fewer cores.

Comments

  • mjc1016 Posts: 15,001

    I think this video sums up the differences...

  • areg5 Posts: 617

    I've seen that video, but I don't think it really explains it.  I don't think the Quadro is merely more expensive so it can be sold to large corporations.  I notice that my GTX cards heat up quickly and then throttle down the clock speed.  I'm wondering if Quadro cards might be so much more efficient at rendering that they maintain their speed and render faster even though they might have fewer CUDA cores.  This guy is basically saying they're pretty much the same thing, and they just jack up the cost of the Quadro to sell it to professionals.

     

    Does anyone here use Quadro cards, and if so do they find them faster and more efficient than GTX's?

  • The video does cover that - Quadros are engineered to be stable at high loads for longer than game cards. And of course they cost more because the extra features don't benefit most gamers, so they don't sell as many, so the R&D costs have to be covered by fewer units, so they cost more, so they sell even fewer, so the R&D costs... I'm sure nVidia does enjoy better margins on Quadros, but as with high-end anything there's a feedback loop that genuinely pushes the price higher than the raw performance gains (in stability and accuracy, not in speed, here) warrant - we see the same thing across the range of consumer cards too.

  • mjc1016 Posts: 15,001
    edited November 2016

    Is a Quadro going to be better for Studio...probably not.  There don't seem to be Studio/Iray optimized drivers, so there won't be any specific driver help.  Some of the other general driver optimizations may be useful...but without benchmarking similar cards it would be hard to say how much of a benefit.

    Is it going to have better performance?  Maybe.

    Is a render done on it going to look 'better'? Again maybe, but probably no one would notice that the pixels are all rendered 0.1% more color correct.

    Is a Quadro likely to complete a long animation without overheating and dropping out?  Yes, it probably will.

    There is one other consideration...but then it goes into the very upper end of the Quadro line...they are sold in configurations with much more memory, up to 24 GB, so you could pack a huge scene onto one of those and still have the benefits of GPU rendering.  But that does come at a rather steep price tag.

    Post edited by mjc1016 on
  • areg5 Posts: 617
    mjc1016 said:

    Is a Quadro going to be better for Studio...probably not.  There don't seem to be Studio/Iray optimized drivers, so there won't be any specific driver help.  Some of the other general driver optimizations may be useful...but without benchmarking similar cards it would be hard to say how much of a benefit.

    Is it going to have better performance?  Maybe.

    Is a render done on it going to look 'better'? Again maybe, but probably no one would notice that the pixels are all rendered 0.1% more color correct.

    Is a Quadro likely to complete a long animation without overheating and dropping out?  Yes, it probably will.

    There is one other consideration...but then it goes into the very upper end of the Quadro line...they are sold in configurations with much more memory, up to 24 GB, so you could pack a huge scene onto one of those and still have the benefits of GPU rendering.  But that does come at a rather steep price tag.

    I saw the prices.  Seriously ridiculous.  I don't think the renders are going to look any better; I'm just wondering if a lower-end 8 gig card might perform better, from the standpoint of not overheating, than a GTX that looks better on paper.

     

  • mjc1016 Posts: 15,001

    There you start getting into the comparing-across-generations question...because there aren't going to be any Quadro cards that are Pascal (the current/newest generation) that will be cheaper than Pascal GTX cards...there may be last-generation ones, but 8GB cards were the mid- to upper-tier Quadros, so they won't be the cheaper ones.

    For most Studio use, GTX cards are more than adequate, and with decent system cooling there's a reasonable expectation of not 'cooking' the card.

    If there are other programs that you regularly use that can benefit from a Quadro (Photoshop can be one) then the question becomes even more muddled...

  • areg5 Posts: 617
    mjc1016 said:

    There you start getting into the comparing-across-generations question...because there aren't going to be any Quadro cards that are Pascal (the current/newest generation) that will be cheaper than Pascal GTX cards...there may be last-generation ones, but 8GB cards were the mid- to upper-tier Quadros, so they won't be the cheaper ones.

    For most Studio use, GTX cards are more than adequate, and with decent system cooling there's a reasonable expectation of not 'cooking' the card.

    If there are other programs that you regularly use that can benefit from a Quadro (Photoshop can be one) then the question becomes even more muddled...

    Describe what you mean by "decent cooling."  Case fans?  Water cooling?  Water cooling is intriguing, and I see reports of GTX cards rendering under full load at 50 degrees.

  • mjc1016 Posts: 15,001

    Case fans in a proper configuration, in a not stuffed to the gills/cramped case.

  • areg5 Posts: 617
    mjc1016 said:

    Case fans in a proper configuration, in a not stuffed to the gills/cramped case.

    That's a good first step.  I built my computer out of an old mid-tower case which isn't particularly efficient.  I never gave the case much thought until now.  I guess it is kind of cramped in there.  I will say that the GPUs cool down very quickly now that I've adjusted the fan profiles, and the only other heat producer in there is the processor.  I have an SSD, so that doesn't make any heat at all.  I found this:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16811853030&ignorebbr=1&nm_mc=KNC-GoogleAdwords-PC&cm_mmc=KNC-GoogleAdwords-PC-_-pla-_-Cases+(Computer+Cases+-+ATX+Form)-_-N82E16811853030&gclid=CNrZqcWKn9ACFQNAhgodJ-YNrQ&gclsrc=aw.ds

     

    Which I'm going to do some research on.  Still pretty fascinated with liquid cooling.  I make comics, so I am really putting my system through hell.

     

  • Short summary (not anywhere near complete):

    Quadros add lots more circuitry for bus controlling, ECC VRAM, double-speed VRAM (2x that of the GTX), and more.  Higher-level Pascal Quadros add Deep Learning functions and other simulation-based CUDA operations.  This is why they are more expensive.  Unless you're using a professional-level motherboard with bus mastering, you are not likely to see much improvement using a Quadro over a GTX with DS.  Autodesk programs can see upwards of 3000% increases in performance moving to a Quadro over a GTX, due to specialized code utilizing the Quadro's innate features.  C4D can see 300-500% improvements for similar reasons.

    Right now, Iray does not use any of the Quadro's strengths (this will change, according to nVidia), so the only improvement the Quadro will give is higher VRAM speeds once things are loaded into memory.  Iray is still single precision, while most of the Quadro's strengths come from double-precision operations.
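    The precision point is easy to see in miniature: single precision keeps roughly 7 significant decimal digits, double precision roughly 16.  A minimal Python sketch (using `struct` to emulate 32-bit storage, since Python's own float is double precision):

```python
import struct

def to_single(x: float) -> float:
    # Round-trip a 64-bit Python float through 32-bit storage,
    # discarding the extra precision a double-precision path would keep.
    return struct.unpack("f", struct.pack("f", x))[0]

third = 1.0 / 3.0
print(f"double: {third:.17f}")
print(f"single: {to_single(third):.17f}")
# The two values agree to only about 7 significant digits.
```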

    Kendall

  • areg5 Posts: 617
    mjc1016 said:

    Case fans in a proper configuration, in a not stuffed to the gills/cramped case.

     

    Ok... I have a strategy.  I ordered a Cooler Master tower that will really give me a lot of room: the HAF X.  It comes with 3 fans, and it has a bracket to put an extra fan on the GPUs.  I hear it can really make a difference.  It is also liquid-cooling compatible if I decide to go that route in the future.  Let's see if that makes a difference.

  • namffuak Posts: 4,191
    areg5 said:
    mjc1016 said:

    Case fans in a proper configuration, in a not stuffed to the gills/cramped case.

     

    Ok ...I have a strategy.  I ordered a Coolermaster tower that will really give me a lot of room:  the Haf X.  It comes with 3 fans, and it has a bracket to put an extra fan on the GPU's.  I hear it can really make a difference.  It is also liquid compatible if I decide to go that route in the future.  Let's see if that makes a difference.

    It should. My case has 3 120 mm fans in the front, in front of the drive bays, pulling air in; 2 120 mm fans in the back, near the top, pulling air out; and a 210 mm fan on top, also pulling air out. Really good air flow, as long as I clean the intake filters. :) I'm strictly air-cooled and the CPU stays under 60 C even when running a 2 hour render. Now that I have the 1080 I've dropped the CPU from the list; the render that takes 57 minutes on the 980 Ti that I mentioned in the other thread drops to 52 minutes if I add the CPU, so it's not really worth it.

  • areg5 Posts: 617
    namffuak said:
    areg5 said:
    mjc1016 said:

    Case fans in a proper configuration, in a not stuffed to the gills/cramped case.

     

    Ok ...I have a strategy.  I ordered a Coolermaster tower that will really give me a lot of room:  the Haf X.  It comes with 3 fans, and it has a bracket to put an extra fan on the GPU's.  I hear it can really make a difference.  It is also liquid compatible if I decide to go that route in the future.  Let's see if that makes a difference.

    It should. My case has 3 120 mm fans in the front, in front of the drive bays, pulling air in; 2 120 mm fans in the back, near the top, pulling air out; and a 210 mm fan on top, also pulling air out. Really good air flow, as long as I clean the intake filters. :) I'm strictly air-cooled and the CPU stays under 60 C even when running a 2 hour render. Now that I have the 1080 I've dropped the CPU from the list; the render that takes 57 minutes on the 980 Ti that I mentioned in the other thread drops to 52 minutes if I add the CPU, so it's not really worth it.

    I've played around with the settings, and found that the CPU usually adds nothing, or even slows it down.  I have a GTX 970 and a GTX 780 Ti.  I have to run them at 75% so they don't kick off, and I modulated the fan curves on the cards to go full bore at around 50 degrees.  A typical render for me takes about 35-40 minutes depending on its complexity, and I keep the sizes down so they fit into the GPU RAM.  The 780 only has 3 gig, so I can't do huge scenes.  I currently have a not-very-well-ventilated mid-tower, and my card temps hover around 70 degrees.  Yeah, I'm hoping with better cooling I can run the cards at full power and stay both stable and cooler.  I would settle for 60.  I hear with water cooling you can overclock them and run at 50, so on my next rig I might try something like that.

  • fastbike1 Posts: 4,078

    Your GPU will already have 2 or 3 fans mounted on it. A case fan "pointed" at the GPU will have to be carefully placed so as not to make things worse. I don't think GTX cards for rendering are as much of an issue as people seem to think, for a couple of reasons. First, they are fast enough that they aren't cranking for all that long. People still seem to think in terms of renders running from overnight to days. That won't be happening with a GTX.

    Second, the driver controls do a good job of maintaining temperature at or below the design operating temperature (which is not the thermal limit). Neither of the GTXs I've had has ever needed to run its fans higher than 60% to keep temperature at the operating point.

    Third, the Pascal cards require less power, which in general means less heat to dissipate.

    areg5 said:
    mjc1016 said:

    Case fans in a proper configuration, in a not stuffed to the gills/cramped case.

     

    Ok ...I have a strategy.  I ordered a Coolermaster tower that will really give me a lot of room:  the Haf X.  It comes with 3 fans, and it has a bracket to put an extra fan on the GPU's.  I hear it can really make a difference.  It is also liquid compatible if I decide to go that route in the future.  Let's see if that makes a difference.

     

  • mjc1016 Posts: 15,001
    fastbike1 said:
     Neither of the GTXs I've had has ever needed to run its fans higher than 60% to keep temperature at the operating point.

    Third, the Pascal cards require less power, which in general means less heat to dissipate.

    Space in the case and space between the cards helps a lot with that, too...

    On my motherboard there is practically no room between the cards with 2 inserted, so the second card is sucking warm air from the back of the first.  So my next build is going to be with a motherboard with more space.

    And yeah, there is that (third point)...lower power will, by virtue of being lower power, be cooler.

  • IF you have deep pockets or are doing science or math, get a Quadro = too much.  IF you are doing Iray, get a Titan X = 12 GB video RAM.  IF you want to spend less, get a 1070 = 8 GB video RAM.

  • areg5 Posts: 617

    Maybe I'll save up for the Titan X.  The 3 gig 780 really limits my scene size.  It is fast, though.  I'm thinking of setting up the GPU fan to pull, but I'll play around with the build once I get it together.  My CPU is a 4790.  I'm presuming I wouldn't see much difference with a faster one, since the rendering is going through the cards.
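    As a rough back-of-envelope illustration of why 3 gig fills up fast: an uncompressed 8-bit RGBA texture alone costs width × height × 4 bytes of VRAM (a simplified sketch only - the renderer's real footprint also includes geometry, mipmaps, and working buffers):

```python
def texture_bytes(width: int, height: int, channels: int = 4) -> int:
    # Uncompressed size of one 8-bit-per-channel texture map.
    return width * height * channels

# A single 4096x4096 RGBA map:
mb = texture_bytes(4096, 4096) / 2**20
print(f"{mb:.0f} MB per map")  # 64 MB -- a few dozen of these eat a 3 GB card
```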

  • JD_Mortal Posts: 760
    edited November 2016

    Titan-Xs are showing up all over for about $750 for the 12GB models. Well worth the money now.

    Prices are higher for Quadros due to direct support from Nvidia, for those actually buying from Nvidia. No other reason. (Those not buying from Nvidia are just paying more for nothing. You can get them at half price without support - which is why the hack-dealers, not Nvidia, sell them and push them all the time.)

    The chips are the same, the capacitors are the same, the drivers are the same. One set of drivers has the "bit" that unlocks double-precision floating-point values. Now, in the new ones, there is a section of the chip that ALSO has that "bit", as a physical device on the chip. For gaming cards they short it out, and that burns the single gold wire representing the checking bit, "use double-precision bits". The chip is still the same.

    (Literally, there is a hacked set of drivers that just turns your card into a Quadro, or turns a Quadro into the equivalent GeForce card. 100%, the Quadro is slower than its counterpart, in all tests, for real-world uses.)

    I have both. A Quadro is only faster in specific software that is programmed to be faster on it. (Which is almost none now.) And it is only faster at a few specific things; it is slower at many more things, due to using double-precision floating-point values in all formulas. It takes longer to do all math - twice as long for all divisions and multiplications - and it also error-checks each cell of RAM data before giving it out, or before writing it to memory as OK to use.

    The only program I ever saw it faster on was Sketchup 3D, when refreshing the screen. I don't do weather-computation programming, nor theoretical planetary and galaxy calculations... I surf for porn and render it. Doesn't require more than four decimals of precision and two hands to do that.

    Any other claim of them being special in any way is pure hype. These have been tested, forwards and back, and dissected, and even the patents and schematics are the same. (The gaming chips are essentially chips that have failed to be "perfect" on the bench - minor imperfections that "could" reduce the stability or function of cores, but often don't in any major way.)

    http://gpu.userbenchmark.com/Compare/Nvidia-Quadro-K6000-vs-Nvidia-GTX-Titan-X/2837vs3282

    http://www.videocardbenchmark.net/compare.php?cmp[]=3106&cmp[]=3554&cmp[]=3597

    http://forums.autodesk.com/t5/installation-licensing/geforce-vs-quadro-benchmarks-actual-tests/td-p/4102908

    The Tesla K80 is the one with the most CUDA cores, at 4992, with the Titan-X having only 3072. Guess how much it costs? $3,500 to $5,500. (The former price is "new/used" and "unsupported"; the latter comes with Nvidia's prime support.)

    Doing Maths...

    Titan-X $1000 for 3072 cores = 3.07 cores per dollar or $0.3255 per core

    Tesla K80 $5000 for 4992 cores = 0.998 cores per dollar or $1.0016 per core

    Two Titan-Xs give you over 6100 cores for under $2000, still less than half the cost of one Tesla K80, and 1100 more cores.
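    The cores-per-dollar arithmetic can be double-checked with a quick sketch (using the list prices quoted in this post, not current figures):

```python
cards = {
    "Titan X":   {"price_usd": 1000, "cuda_cores": 3072},
    "Tesla K80": {"price_usd": 5000, "cuda_cores": 4992},
}

for name, c in cards.items():
    cores_per_dollar = c["cuda_cores"] / c["price_usd"]
    dollars_per_core = c["price_usd"] / c["cuda_cores"]
    print(f"{name}: {cores_per_dollar:.2f} cores/$  (${dollars_per_core:.4f} per core)")
```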

    However, the Tesla K80 has 24GB of RAM... I'm not sure if that is 12 and 12 across two GPUs, isolated/shared, or one GPU with all 24GB available to it. That is a rather large Daz scene. 12GB is just at my limits now.

    Also... Daz doesn't support Tesla or Quadro drivers, nor do they support Daz. They barely support Windows. (The drivers.) :P

    However, IRAY might. I have not looked into that, and I have not seen it posted anywhere. That is a specific set of support needed for those drivers. Like I said, it takes hacks to make drivers work on opposing cards. It is not just a plug-n-play deal. Stuff has to be programmed for the cards, and DX (DirectX, which is for games) is not. (Which is what all these render engines use. None talk directly to the card's core components, or they would just as easily do the same with Radeon, which has the same exact formulas, just a different core stack for flow through the API.)

    Post edited by JD_Mortal on