10-bit color for GeForce and Titan cards

RexRed Posts: 1,327

NVIDIA Unveils New Studio Driver with Support for 10-bit Color for Creatives

 JUL 29, 2019

NVIDIA has just released the latest version of its Studio Driver, and creatives who use the company’s GPUs should take note. According to NVIDIA, the new driver “delivers the best performance and reliability for creative applications via extensive testing of creator workflows” by adding support for 10-bit color for creatives who use programs like Adobe Photoshop and Premiere.

The new driver was announced at SIGGRAPH 2019, and it’s a big deal for PC users who don’t want to pony up for NVIDIA’s expensive Quadro cards. Up until now, only NVIDIA’s Quadro RTX cards supported 30-bit color (10 bits per channel), leaving users with NVIDIA’s GeForce and Titan lines of laptop and desktop GPUs limited to 24 bits (8 bits per channel).

https://petapixel.com/2019/07/29/nvidia-unveils-new-studio-driver-with-support-for-10-bit-color-for-creatives

 

Comment:

It appears that what this means is yes, you can save files from Photoshop at 10-bit if you own a Titan or a GeForce card, just as you can with a Quadro card.

This also means there is no difference between the color depth of an image rendered by a Quadro, a GeForce, or a Titan card.

This also means you don't have to pony up for a Quadro card to be a "professional artist" or to create print-quality products with your existing cards.

Yes, this is old news but I thought I would put it out there anyway.

Your renders have gotten better and you probably did not even notice. :)

Post edited by RexRed on

Comments

  • RexRed Posts: 1,327

    My Nvidia Studio Driver is letting me boost the color profile on one of my HD monitors to 12 bpc at 32-bit.

  • pwiecek Posts: 1,577

    I'm assuming that's 10 bits per channel, ne?

  • Surely this is about display, not the data format - you can create a 16-bit per channel image in Photoshop regardless of the bit depth available on the display; higher bit depths for display (given suitable hardware) merely improve the fidelity of what's on screen (given suitable eyes), as I understand it.

  • Gordig Posts: 10,055

    Surely this is about display, not the data format - you can create a 16-bit per channel image in Photoshop regardless of the bit depth available on the display; higher bit depths for display (given suitable hardware) merely improve the fidelity of what's on screen (given suitable eyes), as I understand it.

    It’s about trying to win this argument: https://www.daz3d.com/forums/discussion/425716/do-s-and-don-ts-about-lighting#latest

  • RexRed Posts: 1,327
    edited August 2020
    Gordig said:

    Surely this is about display, not the data format - you can create a 16-bit per channel image in Photoshop regardless of the bit depth available on the display; higher bit depths for display (given suitable hardware) merely improve the fidelity of what's on screen (given suitable eyes), as I understand it.

    It’s about trying to win this argument: https://www.daz3d.com/forums/discussion/425716/do-s-and-don-ts-about-lighting#latest

    Thank you Gordig for making note of that.

    To me it is not an argument, it is a search to understand the parameters of how these things work.

    Please don't make assumptions about my motives.

    Last night I had my hand on the buy button to purchase two Titan RTX graphics cards. That is almost $5,000.

    It is certainly not in my best interest to polarize this discussion and alienate people from it.

    But, I find myself defending my motives and credentials rather than simply discussing the substance of this issue.

    I am getting conflicting information here.

    Nvidia says I can create 10-bit renders with my 1080 Ti graphics cards, and then I am told here that I "assuredly and emphatically" cannot.

    So, I slowly lifted my finger from the buy button and decided to wait and ask about it here. This is where I have bought thousands of dollars' worth of really fine models.

    Why did I not buy the cards then? Part of it was because I do not fully understand how 10-bit renders are made...

    Yes, I don't know it all, and I have no problem admitting it; this is why I am here asking in a "humble tone". That was the same tone I had before, too.

    I just push a button and render the scenes I make and if they look nice and are in a high resolution I send them to my marketers.

    Another reason why I did not buy the Titan RTX is because the fans are situated on the bottom and not out the back of the card. I don't want to blow hot air into my machine.

    Also, if I can achieve the same thing from one card for about the same price... Why not do that instead?

    There is also the issue of Nvidia and 3rd party licenses but most of the licenses are for 3D applications that are WAY out of my price range to purchase let alone the models for them.

    I am quite happy in the DAZ ecosystem for now, which I do not think requires any licenses or gimmicks to make 10-bit images.

    I am leaning towards the Quadro RTX 8000; then maybe, if I get rich off all my art, I can buy another one. :)

    This discussion does not have a "win the argument" element for me; it has a practical "I need to know this" element, so I do not buy the wrong thing.

    There is the difference.

    I also game a lot and live stream in 4K with OBS Studio. How will a Quadro handle these tasks?

    Will I run into problems with the drivers because Quadro is more specialized? Will OBS even see the Quadro?

    The Quadro driver is smaller in size than the Titan RTX driver; why, and what does this mean?

    I can admit I do not know it all... but I hope to learn more before this is all said and done.

    Maybe we can all learn together. That seems like a kind of utopian idea, but call me a hopeless romantic. I don't mind that.

     

    Post edited by RexRed on
  • fastbike1 Posts: 4,077

    Without taking sides, perhaps the question is: what do 10-bit renders actually bring to the image, i.e. is the wider gamut actually evident? What is the final output format?

  • WendyLuvsCatz Posts: 38,211

    more to the point is can people without the hardware even tell

    if you are posting on a web page and us plebeians with the cheap monitors and tablets are looking

    or printing, where it is limited to the inks

  • RexRed Posts: 1,327
    fastbike1 said:

    Without taking sides, perhaps the question is: what do 10-bit renders actually bring to the image, i.e. is the wider gamut actually evident? What is the final output format?

    I think it becomes evident in print media, e.g. magazines and items with prints on them, clothing, ad banners in stores, etc.

    Advertisers want their ads to pop more than their competitors', so they set strict guidelines about the bit depth and resolution of images they will accept from artists.

  • RexRed Posts: 1,327
    edited August 2020

    I just read online that you can't put a Quadro and a GTX in the same PC together.

    https://nvidia.custhelp.com/app/answers/detail/a_id/2280/~/can-i-use-a-geforce-and-quadro-card-in-the-same-system%3F

    Post edited by RexRed on
  • RexRed Posts: 1,327

    RTX on GTX: Nvidia is enabling ray tracing on some GeForce GTX graphics cards

    https://www.pcworld.com/article/3373496/rtx-nvidia-geforce-gtx-gpu-ray-tracing-dxr.html

  • 30-bit color support

    In addition, this new NVIDIA Studio Driver introduces support for 30-bit color across all product lines, including GeForce and TITAN, for the first time, allowing for seamless color transitions without banding. With 24-bit color, a pixel can be built from 16.7 million shades of color. By increasing to 30-bit color, a pixel can now be built from over 1 billion shades of color, which eliminates the abrupt changes in shades of the same color. 

    Latest NVIDIA Studio Driver Available Now: Supercharge your favorite creative apps

    Image courtesy of Farhan Perdana

    (24-bit and 30-bit differences simulated for demonstration on non-30-bit capable displays. 30-bit color requires supported display, software, and content)

    Multiple creative applications currently take advantage of 30-bit color including Adobe Photoshop, Adobe Premiere Pro, Autodesk RV, Colorfront Transkoder, Assimilate Scratch, and Foundry Nuke.

    Learn more about how NVIDIA GPUs accelerate content creation and our new RTX Studio laptops, then download the latest Studio Driver and provide us your feedback on the NVIDIA forums.

     https://www.nvidia.com/en-us/geforce/news/studio-driver/

    I don't read that as having anything to do with what can be created; rather - as I suggested above - it is about applications that can, with suitable content (i.e. content using 10 or more bits per channel) and on suitable display hardware, give a smoother rendition than a display, GPU, or application that is limited to 8 bits per channel.
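
    As an aside, the shade counts in the quoted text are just the total bit depth expressed as a power of two. A quick back-of-the-envelope check (a minimal Python sketch of the arithmetic, nothing NVIDIA-specific):

    ```python
    # "24-bit color" = 8 bits per channel x 3 channels (R, G, B);
    # "30-bit color" = 10 bits per channel x 3 channels.
    for bits_per_channel in (8, 10):
        levels_per_channel = 2 ** bits_per_channel      # 256 vs 1024 steps per channel
        total_colors = levels_per_channel ** 3          # every R/G/B combination
        print(f"{bits_per_channel} bpc: {levels_per_channel} levels per channel, "
              f"{total_colors:,} colors ({3 * bits_per_channel}-bit)")
    # 8 bpc: 256 levels per channel, 16,777,216 colors (24-bit)
    # 10 bpc: 1024 levels per channel, 1,073,741,824 colors (30-bit)
    ```

    More levels per channel means smaller steps between adjacent shades, which is why banding in smooth gradients goes away.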

  • RexRed Posts: 1,327
    edited August 2020

    You could be right, Richard; maybe it has to do with display or viewing only. But considering that Nvidia said explicitly that they were giving the lower models the same capability in Photoshop as the Quadro, it seems it is authoring ability also.

    But I did not read anything about the lower models now having licensing privileges with the big apps.

    It is a game changer but how much is still to be determined.

    And I might say the 24-bit image is exaggerated. None of the images I have rendered on my PC have looked like that; it looks more like 16 or 8 bits, simulated for demonstration purposes, like the old Windows 3.1 days...

    I think the eye can perceive the difference between 24-bit and 30-bit, but I am not sure; I am half blind even on a good day.

    It sure has sold a lot of TVs, though.

    Post edited by RexRed on
  • RexRed said:

    You could be right, Richard; maybe it has to do with display or viewing only. But considering that Nvidia said explicitly that they were giving the lower models the same capability in Photoshop as the Quadro, it seems it is authoring ability also.

    It needs special application coding to use 10 bits per channel - applications that don't have that will continue to display as if using 8 bits per channel. The applications listed are (some of) those that can take advantage, by my reading.

    RexRed said:

    But I did not read anything about the lower models now having licensing privileges with the big apps.

    It is a game changer but how much is still to be determined.

    And I might say the 24-bit image is exaggerated. None of the images I have rendered on my PC have looked like that; it looks more like 16 or 8 bits, simulated for demonstration purposes, like the old Windows 3.1 days...

    Yes, the text underneath does say the effect is faked - otherwise it would not be possible to illustrate it

    RexRed said:

    I think the eye can perceive the difference between 24-bit and 30-bit, but I am not sure; I am half blind even on a good day.

    It sure has sold a lot of TVs, though.

     

  • RexRed Posts: 1,327
    edited August 2020

    I think I have made up my mind to buy a Titan RTX card (or two).

    But I am going to wait at least a month or two until the new line-up of cards comes out.

    I am hoping for two things...

    A new Titan RTX will be revealed and it might have these improvements.

    First, I am hoping it will have GDDR6X instead of GDDR6 RAM,

    and second, maybe they will have a Founders Edition that will blow the air out through the back instead of under it.

    If the air is blown down instead of out the back that will not be a deal-breaker.

    My logic is that if I buy this card and in two years it dies, I am not out $5,000 for a Quadro.

    I can buy two of these cards for the price of one Quadro and if one dies I can fall back on the other.

    So that is it for now.

    One big deciding factor is that the Titan RTX is consumer grade.

    That to me means the drivers are less specialized and it will work with games, live streaming and multitasking, whereas the Quadro seems geared only for specific tasks

    and its drivers are limited and do not work with GeForce Experience.

    Someday I may regret this when I have three Titan RTXs in my PC and I could have instead had three Quadros running.

    It is a very difficult decision, but with NVLink I can always add more VRAM to the Titan RTX setup.

    And when you compare the render speeds side by side, the Titan actually renders insignificantly faster than the Quadro.

    Another consideration is if a new Quadro is released with GDDR6X RAM and the new Titan RTX does not have GDDR6X RAM.

    I believe GDDR6X RAM is significantly faster than GDDR6.

    But a new Titan RTX may be released, and it should have GDDR6X RAM also.

    So it is a waiting game.

    I expect the new cards to cost roughly the same or slightly more, and the current cards to go down in price.

    That may be another reason to wait a month; saving a few hundred dollars could be worth it.

    If the new Titan has GDDR6X RAM, I will pay the extra and go with that.

    There is one other issue.

    I can't seem to find an answer for this anywhere on the internet and believe me I have looked.

    I am quite certain I can run the Titan RTX with my 3 x 1080 Tis, but will the 1080 Tis assist the Titan RTX with rendering by devoting their CUDA cores to that render?

    I have not found a single peep about that. Perhaps because it is a given that they will work together to render, but I am not sure.

    It would really be a disappointment if the Titan RTX and the 1080 Tis are not compatible together when rendering.

    Post edited by RexRed on
  • If a render fits on both the 1080 Tis and the Titan, all will be used; if it fits on only the Titan, then the 1080 Tis will drop out and the Titan will continue; if it fits on neither, then the render will drop to CPU or stop (depending on settings).

    You might want to bear in mind that Titan RTX cards can be linked, pairwise, via NVLink to share memory for materials - they will render a little more slowly than when running alone, but they will substantially increase the headroom for materials, which are usually the biggest resource hog.
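
    A rough sketch of that fall-back logic, purely illustrative (the function and the per-card VRAM figures are made up for the example; Iray does the real check internally):

    ```python
    # Illustrative only - not Iray's actual API. Each GPU either holds the whole
    # scene in its own VRAM or drops out of the render entirely.
    def pick_render_devices(scene_vram_gb, gpus, cpu_fallback=True):
        """gpus is a list of (name, vram_gb) pairs."""
        usable = [name for name, vram_gb in gpus if scene_vram_gb <= vram_gb]
        if usable:
            return usable          # every card the scene fits on joins the render
        return ["CPU"] if cpu_fallback else []   # no card can hold it

    cards = [("Titan RTX", 24), ("1080 Ti", 11), ("1080 Ti", 11)]
    print(pick_render_devices(9, cards))    # ['Titan RTX', '1080 Ti', '1080 Ti']
    print(pick_render_devices(18, cards))   # ['Titan RTX'] - the 1080 Tis drop out
    print(pick_render_devices(30, cards))   # ['CPU'] - too big for any of the GPUs
    ```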

  • RexRed Posts: 1,327
    edited August 2020

    If a render fits on both the 1080 Tis and the Titan, all will be used; if it fits on only the Titan, then the 1080 Tis will drop out and the Titan will continue; if it fits on neither, then the render will drop to CPU or stop (depending on settings).

    You might want to bear in mind that Titan RTX cards can be linked, pairwise, via NVLink to share memory for materials - they will render a little more slowly than when running alone, but they will substantially increase the headroom for materials, which are usually the biggest resource hog.

    Wow, Richard, that info is very useful; it helped me progress further in my understanding of all of this.

     

     

    Post edited by RexRed on
  • RexRed Posts: 1,327
    edited August 2020

    Okay, I just read another thread on Daz and it complements what Richard is saying.

    If the scene fits in the 1080 Tis, then it will render on all cards and the CPU.

    If it only fits in the Titan RTX card, then the 1080 Tis will not be used.

    If it won't fit in the Titan RTX card, then only the CPU will render.

    That makes sense now...

    So, then I have to figure out what projects would use CPU rendering and what would use the 1080 Ti rendering.

    My first thought was, why even have the 1080 Tis on board?

    But, if I want to create a simple animation with a scene that is under 11 GB of VRAM, then all of my cards, including the CPU, could fly through that render sequence.

    I would save the Titan RTX for large complex scenes that would produce only one very detailed large image.

     

    Post edited by RexRed on
  • RexRed Posts: 1,327

    100 more hours left for my animation to finish rendering........

    Question: I was thinking I could get one of the current Titan RTX cards now and one of the new Titan RTX cards after they come out.

    Would these two different models work together with NVLink? The NVLink dongle is over $150, yikes!

     

     

  • RexRed Posts: 1,327

    NVIDIA NVLink Bridge Compatibility Chart

    https://www.pugetsystems.com/labs/articles/NVIDIA-NVLink-Bridge-Compatibility-Chart-1330/

    It seems that some cards of different types can be linked through NVLink.

    Do both cards get slowed down to the speed of the slowest card?

    It seems better to wait for the faster card to come out before purchasing than to mix in an older one and have it negate the speed increase of the newer model.

  • Paintbox Posts: 1,633
    edited August 2020

    Your scene needs to fit into your card's VRAM in order to render on the GPU, so whichever Nvidia card it fits on will pick it up. If not, the render will be CPU only, which is SLOW.

    You don't want to render Iray with your CPU. So you have to check how big your scenes are, and make a decision on that basis. Check the benchmarks thread to see what speed increase you can expect.

    Post edited by Paintbox on
  • I believe that the cards have to match for NVLink to work properly.

  • RexRed Posts: 1,327
    edited August 2020

    Okay here is another question,

    I just found out there is now PCIe 4.0.

    My motherboard is an Asus ROG Strix Gaming board with PCIe 3.0.

    PCIe 4.0 has double the bandwidth.

    When and how will this PCIe 4.0 advantage ever figure into render times?

    "Nobody will ever need more than 640KB of memory"... :)

     

    Post edited by RexRed on
  • RexRed said:

    Okay here is another question,

    I just found out there is now PCIe 4.0.

    My motherboard is an Asus ROG Strix Gaming board with PCIe 3.0.

    PCIe 4.0 has double the bandwidth.

    When and how will this PCIe 4.0 advantage ever figure into render times?

    "Nobody will ever need more than 640KB of memory"... :)

    It probably won't - at least unless we get memory sharing across the PCIe bus instead of NVLink. The speed of the bus is relevant to Iray only in the initial transfer of data to the GPU - perhaps if you were using the Iray preview a faster connection (assuming both GPU and motherboard used the higher standard) might have a noticeable effect, but for rendering I doubt it would.
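
    To put rough numbers on that one-off transfer (a back-of-the-envelope Python sketch; the ~16 GB/s and ~32 GB/s figures are approximate theoretical x16 throughputs, and real-world uploads are slower):

    ```python
    # Rough upload-time estimate for pushing a scene across the PCIe bus to the GPU.
    PCIE_X16_GB_PER_S = {"PCIe 3.0 x16": 16.0, "PCIe 4.0 x16": 32.0}  # approximate peaks

    scene_size_gb = 11.0   # e.g. a scene that just fills a 1080 Ti
    for slot, gb_per_s in PCIE_X16_GB_PER_S.items():
        print(f"{slot}: ~{scene_size_gb / gb_per_s:.2f} s to upload {scene_size_gb} GB")
    # PCIe 3.0 x16: ~0.69 s to upload 11.0 GB
    # PCIe 4.0 x16: ~0.34 s to upload 11.0 GB
    # Either way it is a one-time cost at the start of the render, so it is lost in
    # the noise of a render that then runs for minutes or hours on the GPU itself.
    ```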

  • RexRed Posts: 1,327

    Excellent info, Richard! It seems like it is a future thing; memory sharing across the PCIe bus would be a game changer for sure!

    Perhaps the entire scene would not have to fit into any one particular card if that was the case...

  • RexRed Posts: 1,327
    edited August 2020

    Well, I just bought two Nvidia Titan RTXs...

    I figured when the price goes down there won't be any left to buy.

    I bought the last one Amazon had in stock and the second one from a third-party vendor on Amazon.

    Waiting a few months for a card that may take a year to come out is not a good plan either.

    So it should take a week or two for me to get my system set up with the new cards... 

    I bought a 4-slot-spacing NVLink bridge; it skips a double space... I may put one of my 1080 Tis in there just for the heck of it, but I think letting the cards have some cooling room might be good too.

    The temptation was too great.

    Now I have 2 x 1080 Tis that I might be selling for around $400 to $500 each... (they are almost brand new)

    Post edited by RexRed on
  • RexRed Posts: 1,327
    edited September 2020

    Today I returned my 2 Titan RTX cards for a refund. They cost me $5000+

    Two days ago Nvidia announced the new cards; overnight, the two Titan RTXs depreciated in value from $2,500 apiece to $500 apiece.

    I will wait till the new RTX 3090 becomes available.

    It is $1,000 less per card, almost 50% faster, 50% cooler, and uses 50% less power...

    I am lucky the return window was still open.

    Under load the Titan RTXs were running at 80 to 87 degrees; that was my main reason for returning them (that is too hot!).

    The RTX 3090 has a fan that blows directly through the card and also exhausts out through the back of the card, and it is the only one of the three new cards that has an NVLink port.

    50% quieter too.

    The Titan RTXs were LOUD.

    If my thread here caused you to go out and buy a new graphics card RETURN IT NOW (while you still have the chance)!

    Presales for the new lowest-tier RTX cards start in about a week; it will take a while before the 3090s are released, but it is worth waiting a few months for that.

    Post edited by RexRed on
  • RexRed Posts: 1,327

    $3,977.77 for an 8K monitor, yikes!

  • RexRed Posts: 1,327

    This Q and A was copied from the Nvidia website.

    Q: Will the 30 series cards be supporting 10bit 444 120fps? Traditionally Nvidia consumer cards have only supported 8bit or 12bit output, and don’t do 10bit. The vast majority of hdr monitors/TVs on the market are 10bit.

    A: The 30 series supports 10bit HDR. In fact, HDMI 2.1 can support up to 8K@60Hz with 12bit HDR, and that covers 10bit HDR displays.

    Comment: This sort of goes over my head, but it seems the 30 series card supports the creation of 12-bit images for use with 12-bit HDR displays...

  • RexRed said:

    This Q and A was copied from the Nvidia website.

    Q: Will the 30 series cards be supporting 10bit 444 120fps? Traditionally Nvidia consumer cards have only supported 8bit or 12bit output, and don’t do 10bit. The vast majority of hdr monitors/TVs on the market are 10bit.

    A: The 30 series supports 10bit HDR. In fact, HDMI 2.1 can support up to 8K@60Hz with 12bit HDR, and that covers 10bit HDR displays.

    Comment: This sort of goes over my head, but it seems the 30 series card supports the creation of 12-bit images for use with 12-bit HDR displays...

    No, it still has nothing to do with image creation, if you mean rendering. It is about allowing applications that can draw their windows in 10- or 12-bit colour to do so and send the result without reduction to a display that supports that bit depth.
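
    To illustrate what that application-level support amounts to, here is a minimal, hypothetical sketch of a program asking the driver for a 10-bit-per-channel window (using GLFW from Python; whether the request is granted depends on the GPU, driver, and display, and applications that never ask simply stay at 8 bits per channel):

    ```python
    import glfw

    if not glfw.init():
        raise RuntimeError("GLFW failed to initialise")

    # Ask for the usual 30-bit layout: 10 bits each for R, G, B plus 2 bits of alpha.
    glfw.window_hint(glfw.RED_BITS, 10)
    glfw.window_hint(glfw.GREEN_BITS, 10)
    glfw.window_hint(glfw.BLUE_BITS, 10)
    glfw.window_hint(glfw.ALPHA_BITS, 2)

    window = glfw.create_window(640, 480, "10-bit request", None, None)
    glfw.make_context_current(window)
    # The driver is free to hand back an ordinary 8-bit framebuffer instead, so a
    # real application would query the actual format before relying on it.
    glfw.destroy_window(window)
    glfw.terminate()
    ```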

  • TomDowd Posts: 198

    Richard is correct. The bit depth a video card outputs relates to its ability to support high-bit-depth displays, not the color depth present in an output file. You can work in 32 bits per channel even on a common monitor, but you will not be able to see the entire color gamut on that monitor. The 10-bit HDR referenced is for HDR-capable televisions that need 10 or 12 bits of color per channel to support the high dynamic range.

    Different file formats in still image production, and different video codecs in video production, support saving in higher than 8-bit formats. And you want that higher bit-depth if you are going to be doing additional color correction, post-processing, visual effects work, and so on. 
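
    A tiny numpy sketch of that separation between the data and the display (illustrative only): a 10-bit gradient keeps 1,024 distinct levels in the stored data even though an 8-bit display path collapses it to 256, which is where visible banding comes from.

    ```python
    import numpy as np

    # A smooth horizontal gradient stored at 10 bits per channel (values 0..1023).
    width = 4096
    gradient_10bit = np.linspace(0, 1023, width).round().astype(np.uint16)

    # What an 8-bit display path does to it: quantise down to 0..255.
    gradient_8bit = (gradient_10bit >> 2).astype(np.uint8)

    print("distinct levels in the stored 10-bit data:", np.unique(gradient_10bit).size)  # 1024
    print("distinct levels on an 8-bit display path:", np.unique(gradient_8bit).size)    # 256
    # The file can keep all 1024 steps (16-bit TIFF/PNG, EXR, and so on);
    # only what reaches an 8-bit screen is reduced.
    ```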

    Overkill:

    https://www.tomshardware.com/news/what-is-10-bit-color,36912.html

    https://www.dpreview.com/forums/thread/4402628

    https://petapixel.com/2018/09/19/8-12-14-vs-16-bit-depth-what-do-you-really-need/

    https://color-management-guide.com/10-bit-display.html

    TD
