NVIDIA Is Working Towards HDR Display Support For Linux, But The Desktop Isn't Ready


  • #21
    Originally posted by juno View Post

    No.

    You have to distinguish pure color depth from "HDR". What is now marketed as "HDR" also covers the maximum brightness of the screen, which implies a higher contrast ratio as well. "HDR" TVs and monitors shine brighter than what we used to know. Sure, some professional monitors have been using 10 bits per color for years, but most still only reach something around 300 cd/m².
    So the goal is not just to increase color resolution, but also to widen the dynamic range: white gets brighter while the black level stays the same.

    Or, in other words: the contrast ratio doesn't increase if you keep black and white fixed and just fill in more steps in between; and that is not what high dynamic range is about.

    Sure, that's not what this news is about; it talks about GPUs and software, where you don't have to care about that. But what is being discussed here in this thread, and in this specific post, is the monitors, because HDR monitors are actually emerging.
    The maximum brightness doesn't matter; it's just a matter of color-space correction. What needs work is the depth needed to use that extra range: you can always extend the range, but doing so without more depth just increases the visible step between adjacent colors.

    And sorry, HDR monitors are not emerging; they have been around for a long time, and they are characterized by their 10-bit color space.

    It's like with dynamic range in audio: we talk about 16-bit vs. higher bit precision. The actual volume the speakers can play is completely irrelevant. It doesn't become HDR audio just because your speakers can play at stadium levels.
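
    A rough back-of-the-envelope illustration of that step-size point, as a sketch that assumes a simple linear encoding (real pipelines use gamma or PQ transfer curves, so the absolute numbers are only indicative):

```python
# Spreading the same number of code values over a wider luminance range makes
# each step bigger, unless the bit depth grows too. Linear encoding assumed
# purely for illustration; real displays use non-linear transfer curves.

def step_size(peak_nits: float, bits: int) -> float:
    """Average luminance difference between adjacent code values."""
    return peak_nits / (2 ** bits - 1)

for peak, bits in [(300, 8), (1000, 8), (1000, 10)]:
    print(f"{peak:5d} nits at {bits:2d} bpc -> ~{step_size(peak, bits):.3f} nits per step")

# Output:
#   300 nits at  8 bpc -> ~1.176 nits per step
#  1000 nits at  8 bpc -> ~3.922 nits per step
#  1000 nits at 10 bpc -> ~0.978 nits per step
```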

    Comment


    • #22
      HDR is in Alpha-level development on Windows and Android. It has to be worked into the OS and the respective software that you are using, be it games, media players, etc. The driver side of things in Windows is coming along nicely - I don't know what's going on with Android, but I imagine they aren't there yet - but actual software to use it is non-existent outside of what's cooking in test labs. Given that, it is believed that CES, Computex, and IFA 2017 should bring a slew of HDR stuff for gaming and computers, with monitors coming towards the end of the year.

      Comment


      • #23
        Originally posted by carewolf View Post
        The maximum brightness doesn't matter
        Believe that, if you like

        Originally posted by carewolf View Post
        you can always extend range, but doing so without depth just increases the visible step between adjacent colors.
        Yes, and I'm talking about both; you're only talking about one of them.

        Originally posted by carewolf View Post
        And sorry HDR monitors are not emerging, they have been around for a long time, they are characterized by their 10bit color-space.
        No, as I said before: professional monitors have been using 10 bpc for years, but very few of them really exceed values like 500 cd/m² (nits), and you pay a few thousand bucks for each of those. The vast majority of consumer displays sits somewhere around 250 nits.
        Now there is an industry standard for this, consumer TVs and monitors are coming, and that's why both AMD and NVIDIA have also started talking about it more. Consumer cards used to output at most 8 bpc RGB; only workstation cards were unlocked to do more. Until now.
        Originally posted by NVIDIA
        First generation HDR compliant displays will generate 1000 nits of luminance


        Originally posted by carewolf View Post
        Like when we talk dynamic range on audio we talk 16bit vs higher bit precisions. The actually volume the speakers can play is completely irrelevant. It doesn't become more HDR audio just because your speakers can play at stadium levels.
        Oh, what a nice comparison. Speakers can also play back audio "louder than you can bear", a.k.a. damage your ears. But there is not a single monitor that makes you blind because it is too bright.



        Oh and a quote from the presentation this article was about

        Originally posted by Andy Ritger
        High Dynamic Range (HDR): express a wider range of luminance than today.
        Last edited by juno; 21 September 2016, 08:46 PM.

        Comment


        • #24
          Originally posted by juno View Post
          Believe that, if you like


          Yes, and I'm talking about both; you're only talking about one of them.


          No, as I said before: professional monitors have been using 10 bpc for years, but very few of them really exceed values like 500 cd/m² (nits), and you pay a few thousand bucks for each of those. The vast majority of consumer displays sits somewhere around 250 nits.
          Now there is an industry standard for this, consumer TVs and monitors are coming, and that's why both AMD and NVIDIA have also started talking about it more. Consumer cards used to output at most 8 bpc RGB; only workstation cards were unlocked to do more. Until now.




          Oh, what a nice comparison. Speakers can also play back audio "louder than you can bear", a.k.a. damage your ears. But there is not a single monitor that makes you blind because it is too bright.



          Oh and a quote from the presentation this article was about


          https://www.x.org/wiki/Events/XDC201...c-2016-hdr.pdf
          Nicely said.
          If you look at the Ultra HD Premium spec, there are two requirements as far as this area is concerned: 90% coverage of the DCI-P3 color space (so a much wider gamut than we've ever had before) and, for HDR, two alternative contrast levels (which is what HDR means; AFAICT the two levels exist to account for the differences in luminance range between backlit LCDs and OLEDs). The commonly cited numbers are worked through below.

          Want the ultimate home entertainment experience? You should probably know about Ultra HD Premium.
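
          To put rough numbers on those two contrast levels: the commonly cited Ultra HD Premium HDR requirements are either at least 1000 nits peak with at most 0.05 nits black (aimed at backlit LCDs), or at least 540 nits peak with at most 0.0005 nits black (aimed at OLEDs). A quick calculation, assuming those figures rather than quoting the spec text directly:

```python
# Contrast ratios implied by the two commonly cited Ultra HD Premium HDR tiers.
# Figures assumed from the UHD Alliance announcement; check the spec itself.
tiers = {
    "LCD-oriented tier":  (1000.0, 0.05),    # >= 1000 nits peak, <= 0.05 nits black
    "OLED-oriented tier": (540.0, 0.0005),   # >=  540 nits peak, <= 0.0005 nits black
}

for name, (peak, black) in tiers.items():
    print(f"{name}: {peak:.0f} / {black} = {peak / black:,.0f}:1 contrast")

# Prints roughly:
#   LCD-oriented tier:  1000 / 0.05   = 20,000:1 contrast
#   OLED-oriented tier: 540  / 0.0005 = 1,080,000:1 contrast
```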

          Comment


          • #25
            Originally posted by carewolf View Post

            The maximum brightness doesn't matter; it's just a matter of color-space correction. What needs work is the depth needed to use that extra range: you can always extend the range, but doing so without more depth just increases the visible step between adjacent colors.

            And sorry, HDR monitors are not emerging; they have been around for a long time, and they are characterized by their 10-bit color space.

            It's like with dynamic range in audio: we talk about 16-bit vs. higher bit precision. The actual volume the speakers can play is completely irrelevant. It doesn't become HDR audio just because your speakers can play at stadium levels.
            The maximum brightness doesn't matter in terms of what? In terms of our eyes, it is more important than resolution and color combined.

            HDR as discussed in the display industry and in this video is about standardizing how a display handles content authored for different displays. This means that you can take a 4k HDR movie that was color graded for the top of the line 20,000 dollar display with 10,000 nits of brightness and display it on a 1,000 nit display and get the intent of the content creator. This should have the effect of solving major parts of the display color variance we see in today's monitors, though that isn't their actual goal.

            HDR in this respect is not the same thing that photographers talk about when speaking of HDR photos and videos; they are talking about data. This is talking about a standardized interchange format for displays. It is about the metadata around the bpc, not the bits per color. It's equivalent to the artist info in an MP3. Yes, they talk about 16-bit and other things, but that isn't about the actual data; it's about supporting the infrastructure around the HDR standard.

            I am not saying that HDR (High Dynamic Range) isn't concerned with the bits per color; I am saying that HDR as used in this instance is a marketing term. I would also argue that a 10-bit display is not High Dynamic Range but rather expanded gamut, because it doesn't expand the range of anything.

            Again, we are not talking about a format, such as 16-bit audio. We are in fact talking about making displays reach luminance similar to theaters. HDR as used in this instance is akin to marketing headphones as giving a more dynamic response curve to the audio. It's all marketing.
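
            To make the earlier "graded for 10,000 nits, shown on a 1,000-nit panel" point concrete: HDR10-style signals use the SMPTE ST 2084 (PQ) transfer function, which encodes absolute luminance up to 10,000 nits, so a less capable display knows what luminance each code value was meant to represent and can tone-map it down. A minimal sketch (the PQ constants come from the published definition; the hard clip merely stands in for a real tone-mapping curve, which is vendor-specific):

```python
# Sketch of SMPTE ST 2084 (PQ): code values encode absolute luminance up to
# 10,000 nits, so a 1,000-nit display can see what luminance was intended
# and tone-map it down to its own capabilities.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Absolute luminance in nits -> non-linear PQ signal in [0, 1]."""
    y = min(max(nits, 0.0), 10000.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

def pq_decode(signal: float) -> float:
    """Non-linear PQ signal in [0, 1] -> absolute luminance in nits."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def naive_tone_map(nits: float, display_peak: float = 1000.0) -> float:
    """Stand-in for a real tone-mapping curve: just clip to the display's peak."""
    return min(nits, display_peak)

# A highlight graded at 4,000 nits on a mastering display:
signal = pq_encode(4000.0)
intended = pq_decode(signal)              # round-trips to ~4000 nits
shown = naive_tone_map(intended, 1000.0)  # clipped to the panel's 1,000 nits
print(f"signal={signal:.4f}, intended={intended:.0f} nits, shown={shown:.0f} nits")
```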

            Comment


            • #26
              Originally posted by atomsymbol



              The black color and "bright black color" are identical. They could have mapped "bright black color" to dark gray, but that's not the case.
              Totally forgot about the bright variants, my bad.

              Comment


              • #27
                Originally posted by dragorth View Post

                The maximum brightness doesn't matter in terms of what? In terms of our eyes, it is more important than resolution and color combined.

                In terms of the protocol. It is a user calibration. Brightness is not, and cannot be, absolute and independent of circumstances; it needs to be adjusted to the environment the screen is in. Turning the maximum higher just makes the screen too bright for some places, and making the lowest black blacker just makes it invisible in others. It needs to fall inside the range that is actually visible and that the user feels comfortable with.


                Comment


                • #28
                  Originally posted by dragorth View Post
                  HDR as discussed in the display industry and in this video is about standardizing how a display handles content authored for different displays. This means that you can take a 4k HDR movie that was color graded for the top of the line 20,000 dollar display with 10,000 nits of brightness and display it on a 1,000 nit display and get the intent of the content creator. This should have the effect of solving major parts of the display color variance we see in today's monitors, though that isn't their actual goal.

                  HDR in this respect is not the same thing that photographers talk about when speaking of HDR photos and videos; they are talking about data. This is talking about a standardized interchange format for displays. It is about the metadata around the bpc, not the bits per color. It's equivalent to the artist info in an MP3. Yes, they talk about 16-bit and other things, but that isn't about the actual data; it's about supporting the infrastructure around the HDR standard.

                  But that would also make it completely separate from the HDR we talk about in video streams, in game rendering, and on desktops, because that is all about extending the color depth. We have things called HDR at the moment, and there it refers to bit depth. So now there is something else called HDR, which doesn't exist yet and is supposed to be something completely different from HDR.

                  Comment


                  • #29
                    Originally posted by carewolf View Post
                    But that would also make it completely separate from the HDR we talk about in video streams, in game rendering, and on desktops, because that is all about extending the color depth. We have things called HDR at the moment, and there it refers to bit depth. So now there is something else called HDR, which doesn't exist yet and is supposed to be something completely different from HDR.
                    We humans have a habit of giving one word multiple meanings; it happens in all natural languages. So yes, it does mean both, depending on which setting it is used in.

                    Comment


                    • #30
                      Originally posted by carewolf View Post
                      In terms of the protocol. It is a user calibration. Brightness is not, and cannot be, absolute and independent of circumstances; it needs to be adjusted to the environment the screen is in. Turning the maximum higher just makes the screen too bright for some places, and making the lowest black blacker just makes it invisible in others. It needs to fall inside the range that is actually visible and that the user feels comfortable with.
                      This standard, as far as I know, doesn't take the brightness controls into account. It only takes into account the digital signal it gets through the video port.

                      And it can be absolute if all the pieces are talking to each other. The monitor says what it supports to the computer, and the computer can transform its data to accurately display on the monitor. We already have self-calibrating monitors, even if they are very expensive.
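
                      A hypothetical sketch of that handshake, just to illustrate the idea: the structure below mirrors the kind of information an HDR-capable display advertises in its EDID/CTA-861 HDR static metadata block, but the names and the crude clamp are invented for this example and are not an actual driver or compositor API.

```python
# Hypothetical illustration of the "monitor tells the computer what it supports"
# flow. Real displays advertise this through an HDR static metadata block in the
# EDID/CTA-861 extension; the names and logic here are invented for the sketch.
from dataclasses import dataclass

@dataclass
class DisplayHdrCaps:
    supports_pq: bool      # display accepts the ST 2084 (PQ) EOTF
    max_luminance: float   # peak luminance the panel can show, in nits
    min_luminance: float   # lowest distinguishable black level, in nits

def adapt_luminance(content_nits: float, caps: DisplayHdrCaps) -> float:
    """Map content luminance into the range this display reports it can show.
    A real compositor would use a smooth tone-mapping curve, not a hard clamp."""
    return min(max(content_nits, caps.min_luminance), caps.max_luminance)

# Example: a first-generation HDR monitor reporting ~1000 nits peak.
caps = DisplayHdrCaps(supports_pq=True, max_luminance=1000.0, min_luminance=0.05)
print(adapt_luminance(4000.0, caps))   # 1000.0 -> highlight clipped to panel peak
print(adapt_luminance(0.001, caps))    # 0.05   -> shadow raised to panel black
```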

                      Comment
