NVIDIA 545.29.02 Linux Driver Released With Much Better Wayland Support

  • #11
    Originally posted by babali View Post
    How many times have we seen "much better wayland support"? But this time, maybe it finally good!
    Not for those who use Geforce 700 series or older (excluding GTX 745/750). There are still many who use GT 710 or 730.



    • #12
      Originally posted by Khrundel View Post
      Doesn't "10 bit per component" mean HDR?
      I thought so.
      Because to me the 16 million colors (8-bit) and the 1 billion colors (10-bit) are the same primary colors, just with more shades (levels of light, so they can be darker or lighter).
      I don't see why HDR would need anything else.
      But I've seen people online saying that HDR support needs something more, like encoding the light (luminance, brightness) in another way.
      So at this point I'm not sure why 10-bit color isn't enough to cover HDR too.

      But consider the following example:
      Take an image that is completely red (say, half-way between the darkest and the lightest red), which can be rendered in both 8-bit and 10-bit (probably a bit more precisely in 10-bit).
      Maybe it's possible to add information about how intensely the light behind the screen should "burn", so that in a room the wall across from the TV glows a little more or less red too.
      So in this case, while the 10-bit values help to pinpoint the red level more precisely among all the "reds", something else is needed to tell the TV how brightly it should display that image out of all the nits it can produce.
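      A quick arithmetic sketch of the shade counts mentioned above (the function name is mine, just for illustration):

```python
# Shades per channel and total colors for a given bit depth.
def color_counts(bits_per_channel: int) -> tuple[int, int]:
    shades = 2 ** bits_per_channel  # levels per primary (R, G or B)
    total = shades ** 3             # all R/G/B combinations
    return shades, total

print(color_counts(8))   # (256, 16777216)    -> "16 million colors"
print(color_counts(10))  # (1024, 1073741824) -> "1 billion colors"
```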

      Originally posted by slagiewka View Post

      From the release notes:
      ...
      Thanks, I missed that!

      Originally posted by babali View Post
      How many times have we seen "much better wayland support"? But this time, maybe it finally good!
      Many times, and I bet we will see it many times in the future too, until it actually surpasses the meme - at least for KDE.



      • #13
        Originally posted by sdack View Post
        I have been speculating about it myself in the past, but I believe it is just Nvidia's way of saying they are supporting HDR10. They will only phrase this neutrally as "Deep Color" or "10-bit per component" rather than calling it HDR10. HDR10 is not the only 10-bit standard and HDR10 implies that the colour information is encoded in the REC.2020-colourspace, which is a non-linear colourspace mapping 1024 values onto a wider range of intensities than the sRGB-colourspace does. The mapping of the values to intensities then happens in the monitor, while the graphics card and driver merely have to pass this information on. So it makes more sense to call it "Deep Colour" (which is how Nvidia has been calling it) or "10-bits per component" rather than to imply to know what exactly happens with the 10-bit colour information on the output devices (aka monitor).
        No, just no.

        10 bits per channel is just the number of shades of each primary color that can be represented. 10bpc (sometimes called Deep Color) has been supported for ages, but was restricted (via drivers) to professional cards.
        HDR (High Dynamic Range) is about range (i.e. the difference between the lowest and highest luminosity). It doesn't require 10bpc, but since the difference between the darkest and lightest shade is greater than in SDR, 10bpc is usually employed, simply because without it HDR would be a banding fest.
        HDR10 is one (of the many) ways to encode HDR content. HDR10 support doesn't mean you will also see HDR content: devices can read HDR content and tone map (or simply clip) it back to SDR.
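        The banding point can be made concrete with rough arithmetic; a sketch assuming a simple linear mapping (real HDR uses a perceptual curve, so the numbers are only indicative):

```python
# Luminance difference between adjacent code values when a range of
# nits is spread linearly over the available codes (simplified model).
def step_nits(peak_nits: float, bits: int) -> float:
    return peak_nits / (2 ** bits - 1)

print(step_nits(100, 8))    # SDR at 8-bit:  ~0.39 nits per step
print(step_nits(1000, 8))   # HDR at 8-bit:  ~3.92 nits per step -> banding
print(step_nits(1000, 10))  # HDR at 10-bit: ~0.98 nits per step
```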



        • #14
          Originally posted by bug77 View Post
          10 bits per channel is just the number of shades of each primary color that can be represented. 10bpc (sometimes called Deep Color) has been supported for ages, but was restricted (via drivers) to professional cards.
          HDR (High Dynamic Range) is about range (i.e. the difference between the lowest and highest luminosity). It doesn't require 10bpc, but since the difference between the darkest and lightest shade is greater than in SDR, 10bpc is usually employed, simply because without it HDR would be a banding fest.
          HDR10 is one (of the many) ways to encode HDR content. HDR10 support doesn't mean you will also see HDR content: devices can read HDR content and tone map (or simply clip) it back to SDR.
          No, sorry, you are just not understanding it right. What happens with the 10 bits per colour is for the monitor to decide. You can send HDR10 information to a monitor and it may decide to map it to 250, 400 or 800 nits, for example (HDR monitors still have a brightness control like any other monitor), and it can do so with a true 10-bit DAC or just an 8-bit DAC using temporal dithering. Only in order to have HDR10 do the graphics card and the driver need to support 10 bits per colour channel. You cannot have HDR10 without first supporting 10 bits per colour channel. It is no strange coincidence that at a time when HDR10 is becoming a more widespread standard, and we are now seeing support for HDR10 under Linux, Nvidia's driver prepares to support 10 bits per colour channel.
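          Approximating a 10-bit level with an 8-bit DAC, as described above, is usually done by temporal dithering: alternating the two nearest 8-bit levels across frames. A minimal sketch (the function name and 4-frame cycle are illustrative):

```python
# Approximate a 10-bit level on an 8-bit panel by alternating between
# the two nearest 8-bit levels over a 4-frame cycle (temporal dithering).
def dither_frames(level10: int, frames: int = 4) -> list[int]:
    base, frac = divmod(level10, 4)  # 10-bit value = 4 * 8-bit value + remainder
    # Show the higher 8-bit level in `frac` out of every 4 frames.
    return [min(base + (1 if i % 4 < frac else 0), 255) for i in range(frames)]

print(dither_frames(514))  # [129, 129, 128, 128] -> averages to 128.5 (i.e. 514/4)
```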
          Last edited by sdack; 31 October 2023, 11:26 AM.



          • #15
            Like bug77 said, "Deep Color 10bpc" output is not the same as HDR10, you can use 10bpc on SDR(sRGB) mode:

            (screenshot: 14t47f.png)
            Now, "Deep Color" signalling is a requirement for the EOTF used in the HDR10 standard: the PQ curve (HDR10's EOTF) needs at least 10bpc of color depth to avoid banding and quantization artifacts.
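            For reference, the PQ curve is standardized as SMPTE ST 2084; a minimal sketch of its EOTF (normalized signal to nits), using the constants from the standard:

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalized 0..1 signal to luminance in nits.
m1 = 2610 / 16384       # 0.1593017578125
m2 = 2523 / 4096 * 128  # 78.84375
c1 = 3424 / 4096        # 0.8359375
c2 = 2413 / 4096 * 32   # 18.8515625
c3 = 2392 / 4096 * 32   # 18.6875

def pq_eotf(signal: float) -> float:
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(0.0))  # 0.0 nits (black)
print(pq_eotf(1.0))  # 10000.0 nits (PQ peak)
print(pq_eotf(0.5))  # ~92 nits: most code values are spent on dark shades
```

            Note how half the signal range maps to under 100 nits - that non-linearity is why 10 bits suffice where a linear encoding would band.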

            Last edited by isaacx123; 31 October 2023, 11:18 AM.



            • #16
              Originally posted by sdack View Post
              Only in order to have HDR10 do the graphics card and the driver need to support 10 bits per colour channel.
              Not really, you want your display to have as much color depth precision as it can when editing photos or grading videos, even SDR videos (709/1886).
              Pretty much every modern PC monitor for photo editing accepts 10-bit inputs from the GPU, professional photo editing software like Photoshop can output at 10bpc, all in SDR signalling.



              • #17
                Originally posted by isaacx123 View Post
                Not really ...
                Yes, really! In order to have HDR10 you need 10 bits per colour channel. How else do you propose it should work?! It is right there in the name HDR10 - the "10" meaning HDR with 10 bits per colour channel.



                • #18
                  sdack I misunderstood your comment, I thought you meant 10bpc was only useful for HDR10, sorry.
                  Yeah, 10bpc is a requirement for HDR10, I said so in my post above.



                  • #19
                    Originally posted by isaacx123 View Post
                    Like bug77 said, ...
                    No, like *I* said. I already wrote "HDR10 is not the only 10-bit standard and ...". See the third sentence of my original comment. I suggest you start there and read what I wrote. Do not go by what bug77 wrote. He just is not good at reading comments.
                    Last edited by sdack; 31 October 2023, 11:41 AM.



                    • #20
                      Originally posted by caligula View Post

                      Not for those who use Geforce 700 series or older (excluding GTX 745/750). There are still many who use GT 710 or 730.
                      *nod* I have my brother's old GTX 760 in the closet and will probably slap it into the next machine I build from scratch to save money. After all, I'm used to running a GTX 750 as my daily driver, and I built a hand-me-down gaming rig with an even older Radeon HD 5870 (the HP prebuilt's UEFI won't boot with anything nVidia post-6xx) that plays practically everything I care about perfectly well.

                      (Sure, GOG's release of Skyrim says it won't work on a dual-core i3 and PCSX2 struggles on it, but the only thing I had to pay for was a KVM switch and I have such a giant backlog that I'm more interested in stuff like the selection of DOS, Win9x, PSX, and PSP games I didn't have as a kid anyway. Tomba, Disgaea, Aquanox, Stronghold, and the Spyro trilogy, here I come.)
                      Last edited by ssokolow; 31 October 2023, 11:45 AM.
