Originally posted by babali View Post
NVIDIA 545.29.02 Linux Driver Released With Much Better Wayland Support
-
Originally posted by Khrundel View Post
Doesn't "10 bit per component" mean HDR?
Because for me the 16 million colors (8-bit) and the 1 billion colors (10-bit) are actually the same 7-10 primary colors, just with different shades (levels of light, so they are darker or lighter).
I don't see why HDR would need anything else.
But I've seen people online saying that for HDR support something else is needed, like encoding the light (luminance, brightness) in a different way.
So at this point I'm not sure why 10-bit colors aren't enough to cover HDR too.
But consider the following example:
Take an image that is completely red (say, half-way between the darkest red and the lightest red), which can be rendered in both 8-bit and 10-bit (probably a bit more precisely in 10-bit).
Maybe it's possible to add information about how strongly the light behind the screen should "burn", i.e. the intensity, so that in a room the wall across from the TV can glow a little (or a lot) red too.
So in this case, while the 10-bit colors help to pinpoint the red level more precisely among all the "reds", something else is needed to tell the TV how brightly it should display that image out of all the candelas / nits it can produce.
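The missing piece is the transfer function rather than the bit depth. The Python sketch below is my own illustration (not from this thread or from the driver): it pushes the same 10-bit code value through a plain gamma-2.2 SDR model and through the SMPTE ST 2084 PQ curve that HDR10 uses. The 100-nit SDR reference white and the gamma value are assumptions picked for the example.
[CODE]
# Illustration: the same 10-bit code value means a different brightness depending
# on which transfer function the display is told to apply; bit depth alone doesn't say.

def sdr_gamma_nits(code, bits=10, peak_nits=100.0, gamma=2.2):
    """Simple SDR model: relative gamma-2.2 signal scaled to an assumed
    100-nit reference white (peak_nits and gamma are assumptions)."""
    v = code / (2 ** bits - 1)
    return peak_nits * v ** gamma

def pq_eotf_nits(code, bits=10):
    """SMPTE ST 2084 (PQ) EOTF used by HDR10: code value -> absolute nits (up to 10000)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code / (2 ** bits - 1)
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

code = 512  # the "half-way" red from the example above, as a 10-bit code value
print(f"SDR gamma 2.2: {sdr_gamma_nits(code):6.1f} nits")
print(f"HDR10 PQ:      {pq_eotf_nits(code):6.1f} nits")
[/CODE]
Same 10 bits either way; only the PQ curve carries the "how many nits" meaning the post is asking about.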
Originally posted by slagiewka View Post
From the release notes:
...
Originally posted by babali View Post
How many times have we seen "much better Wayland support"? But this time, maybe it's finally good!
Comment
-
-
Originally posted by sdack View Post
I have been speculating about it myself in the past, but I believe it is just Nvidia's way of saying they are supporting HDR10. They will only phrase this neutrally as "Deep Color" or "10 bits per component" rather than calling it HDR10. HDR10 is not the only 10-bit standard, and HDR10 implies that the colour information is encoded in the Rec.2020 colourspace with a non-linear transfer curve that maps the 1024 values onto a wider range of intensities than the sRGB colourspace does. The mapping of values to intensities then happens in the monitor, while the graphics card and driver merely have to pass this information on. So it makes more sense to call it "Deep Colour" (which is what Nvidia has been calling it) or "10 bits per component" than to imply knowing what exactly happens with the 10-bit colour information on the output device (aka the monitor).
10 bits per channel just describes how many shades of each primary color can be represented. 10bpc (sometimes called Deep Color) has been supported for ages, but was restricted (via drivers) to professional cards.
HDR (High Dynamic Range) is about range (i.e. the difference between the lowest and the highest luminosity). It doesn't require 10bpc, but since the gap between the darkest and the lightest shade is greater than in SDR, 10bpc is usually employed simply because without it, HDR would be a banding fest.
HDR10 is one (of the many) ways to encode HDR content. Supporting HDR10 doesn't mean you will actually see HDR content: devices can read HDR content and tone map (or simply clip) it back to SDR.
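To make that last point concrete, here is a rough Python sketch (my own, not from the post) of the two fallbacks mentioned: hard clipping versus a tone-mapping roll-off. The Reinhard-style curve and the 100-nit SDR peak are illustrative assumptions, not what any particular device actually does.
[CODE]
# Sketch of "tone map (or simply clip) it back to SDR": two ways to fit HDR
# luminance (in nits) into an SDR display's range. Purely illustrative values.

SDR_PEAK = 100.0  # assumed SDR reference white, in nits

def clip_to_sdr(nits):
    """Hard clip: anything above SDR white becomes SDR white (highlight detail lost)."""
    return min(nits, SDR_PEAK)

def tonemap_to_sdr(nits):
    """Reinhard-style roll-off: compresses highlights smoothly instead of
    discarding them. One illustrative operator among many."""
    l = nits / SDR_PEAK             # luminance relative to SDR white
    return SDR_PEAK * l / (1 + l)   # approaches SDR_PEAK asymptotically

for nits in (10, 100, 400, 1000, 4000):
    print(f"{nits:5d} nits -> clip {clip_to_sdr(nits):6.1f}, tone map {tonemap_to_sdr(nits):6.1f}")
[/CODE]
Either way, an HDR10 stream can be shown on an SDR screen; you just don't get HDR out of it.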
Comment
-
-
Originally posted by bug77 View Post
10 bits per channel just describes how many shades of each primary color can be represented. 10bpc (sometimes called Deep Color) has been supported for ages, but was restricted (via drivers) to professional cards.
HDR (High Dynamic Range) is about range (i.e. the difference between the lowest and the highest luminosity). It doesn't require 10bpc, but since the gap between the darkest and the lightest shade is greater than in SDR, 10bpc is usually employed simply because without it, HDR would be a banding fest.
HDR10 is one (of the many) ways to encode HDR content. Supporting HDR10 doesn't mean you will actually see HDR content: devices can read HDR content and tone map (or simply clip) it back to SDR.
Only in order to have HDR10 do the graphics card and the driver need to support 10 bits per colour channel.
Last edited by sdack; 31 October 2023, 11:26 AM.
Comment
-
-
Like bug77 said, "Deep Color" 10bpc output is not the same as HDR10; you can use 10bpc in SDR (sRGB) mode.
Now, "Deep Color" signalling is a requirement for the EOTF used by the HDR10 standard: the PQ curve (HDR10's EOTF) needs at least 10bpc of color depth to avoid banding and quantization artifacts.
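To put a number on the banding argument, here is a small Python sketch (mine, not isaacx123's) comparing the luminance jump between two adjacent code values under the PQ curve at 8, 10 and 12 bits per channel. The ST 2084 constants are standard; the choice of a mid-range code value is arbitrary.
[CODE]
# Why PQ wants at least 10bpc: the nit step between neighbouring code values
# is several times larger at 8 bits than at 10, which is what shows up as banding.

def pq_eotf_nits(code, bits):
    """SMPTE ST 2084 (PQ) EOTF: integer code value -> absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code / (2 ** bits - 1)
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (8, 10, 12):
    mid = 2 ** bits // 2  # mid-range code value, roughly 90-100 nits under PQ
    step = pq_eotf_nits(mid + 1, bits) - pq_eotf_nits(mid, bits)
    print(f"{bits:2d} bpc: step around code {mid} = {step:.3f} nits")
[/CODE]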
Last edited by isaacx123; 31 October 2023, 11:18 AM.
Comment
-
-
Originally posted by sdack View Post
Only in order to have HDR10 do the graphics card and the driver need to support 10 bits per colour channel.
Pretty much every modern PC monitor aimed at photo editing accepts 10-bit input from the GPU, and professional photo editing software like Photoshop can output at 10bpc, all with SDR signalling.
Comment
-
-
No, like *I* said. I already wrote "HDR10 is not the only 10-bit standard and ...". See the third sentence of my original comment. I suggest you start there and read what I wrote. Do not go by what bug77 wrote. He is just not good at reading comments.
Last edited by sdack; 31 October 2023, 11:41 AM.
Comment
-
-
Originally posted by caligula View Post
Not for those who use the GeForce 700 series or older (excluding the GTX 745/750). There are still many who use a GT 710 or 730.
(Sure, GOG's release of Skyrim says it won't work on a dual-core i3, and PCSX2 struggles on it, but the only thing I had to pay for was a KVM switch, and I have such a giant backlog that I'm more interested in stuff like the selection of DOS, Win9x, PSX, and PSP games I didn't have as a kid anyway. Tomba, Disgaea, Aquanox, Stronghold, and the Spyro trilogy, here I come.)
Last edited by ssokolow; 31 October 2023, 11:45 AM.
Comment
-
Comment