I never thought I’d see the day: The latest release of Nvidia’s Studio drivers allows you to view HDR video and wide P3-gamut colors “across all Nvidia product lines and GPUs.”
Until today you had to spring for a pricey Nvidia Quadro card to properly view your shiny ray-traced renders or accurately grade HDR video in professional applications such as Photoshop and Premiere. Now that capability trickles down to GeForce and Titan, and not just the RTX models.
The driver announcement from Siggraph comes in conjunction with news of more laptops added to its RTX Studio roster. There are two new Lenovos: the Y740 15 Studio Edition and Y740 17 Studio Edition, variations of its gaming laptops.
Screenshot by Lori Grunin/CNET
Photoshop has long given you the option to turn on a 30-bit color pipe between it and the graphics card. But if you enabled it on a system with a consumer-targeted card, it didn’t do anything. That’s why there’s always been such confusion as to whether you could display 30-bit color with a GeForce card. I mean, there’s a check box and you can check it!
But Photoshop and Premiere use OpenGL to communicate with the graphics card, at least for color rendering, and the specific API calls to use deep color have only worked with Quadro cards. That can sting when you’ve spent over $1,000 on a GTX 1080 Ti.
In its briefing, Nvidia made it sound like 30-bit-on-GeForce was a brand new idea inspired by Studio users’ requests. Does that mean the company was intentionally ignoring all the previous pleas — such as this one from its own forums in 2014?
It’s possible Nvidia decided that it had bigger professional fish to fry with Quadro, including AI and big data, and that letting GeForce support a previously workstation-only capability would boost the professional credibility of its new Studio marketing push. That seems especially likely given the adoption of AMD graphics on almost every hardware platform, as well as AMD’s high-powered exclusive partner, Apple.
Or maybe it’s to allow game designers to work on an Nvidia graphics card that can actually play games without having to pay hundreds extra for a Quadro just to get the extra color depth, since GeForce and Titan hold up pretty well in the midrange 3D-acceleration department.
To properly take advantage of this, you still need all the other elements — a color-accurate display capable of 30-bit (aka 10-bit) color, for one. The ability to handle a 30-bit data stream is actually pretty common now — most displays claiming to be able to decode HDR video, which requires a 10-bit transfer function, can do it — but you won’t see much of a difference without a true 10-bit panel, which is still pretty rare outside professional displays.
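For the curious, the “30-bit (aka 10-bit)” naming is just arithmetic: 30 bits total is 10 bits for each of the three color channels. A quick sketch (standard color math, not anything specific to Nvidia’s announcement):

```python
# "30-bit" color = 10 bits per channel x 3 channels (red, green, blue).
bits_per_channel = 10
channels = 3

shades_per_channel = 2 ** bits_per_channel          # 1024 shades per channel
total_colors = 2 ** (bits_per_channel * channels)   # 2**30, about 1.07 billion colors

# Standard 8-bit-per-channel ("24-bit") color for comparison:
shades_8bit = 2 ** 8     # 256 shades per channel
total_24bit = 2 ** 24    # about 16.7 million colors

print(shades_per_channel, total_colors)  # 1024 1073741824
print(shades_8bit, total_24bit)          # 256 16777216
```

So a 10-bit pipeline gives four times as many shades per channel, which is what matters for smooth gradients.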
That’s because most people associate insufficient bit depth with banding, the appearance of visually distinguishable borders between what should be smoothly graduated color. Monitors have gotten good at disguising banding artifacts by visually dithering the borders between colors where necessary. But when you’re grading HDR video or painting on 3D renders, for example, dithering doesn’t cut it.
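A toy quantization example makes the banding point concrete (this is pure illustration, not any application’s actual rendering pipeline): take a shallow gradient covering just 2% of the brightness range and count how many discrete display levels it lands on at each bit depth.

```python
# Illustrative sketch: why 10 bits per channel reduces visible banding.
# We quantize a smooth 0.0-1.0 gradient to discrete display levels and count
# the distinct steps across a narrow, dark span, where banding shows most.

def quantize(value, bits):
    """Map a 0.0-1.0 value to the nearest of 2**bits display levels."""
    levels = (1 << bits) - 1
    return round(value * levels)

def distinct_steps(lo, hi, bits, samples=10_000):
    """Count distinct quantized levels hit across the range [lo, hi]."""
    return len({quantize(lo + (hi - lo) * i / samples, bits)
                for i in range(samples + 1)})

# A shallow gradient spanning 2% of the brightness range:
steps_8 = distinct_steps(0.10, 0.12, 8)    # a handful of wide, visible bands
steps_10 = distinct_steps(0.10, 0.12, 10)  # roughly 4x as many, finer steps
print(steps_8, steps_10)
```

With only a few coarse steps to work with, an 8-bit pipeline has to dither to fake smoothness; the 10-bit pipeline actually has the intermediate values.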
And the extra precision is surely welcome when your doctor is trying to tell the difference between a tumor and a shadow on a cheap system. From Nvidia’s own white paper in 2009: “While dithering produces a visually smooth image, the pixels no longer correlate to the source data. This matters in mission critical applications like diagnostic imaging where a tumor may only be one or two pixels big.”