HDR Support Is Being Worked On For Linux's DRM Code


  • HDR Support Is Being Worked On For Linux's DRM Code

    Phoronix: HDR Support Is Being Worked On For Linux's DRM Code

    With more HDR monitors hitting the market, Intel developers are working on plumbing support for High Dynamic Range displays into the Linux kernel's DRM layer...


  • #2
    I guess AMD has their own HDR code in DC also?



    • #3
      Originally posted by ernstp View Post
      I guess AMD has their own HDR code in DC also?
      I'm hoping they'll work together on this; they've been talking a lot about Atomic and reusing the helpers in DRM core.



      • #4
        Could somebody explain to me what HDR is in a computer display? I know what HDR is in photography, but I can't imagine what it would be in a display... isn't that just a wider gamut?



        • #5
          Originally posted by Mariusz View Post
          Could somebody explain to me what HDR is in a computer display? I know what HDR is in photography, but I can't imagine what it would be in a display... isn't that just a wider gamut?
          If you read through the comments on the older linked NVIDIA support article, they go into detail there; there's a bit of back and forth about this kind of confusion with HDR across different industries/formats, I think.

          The way I understand it, if your display is only capable of showing 8-bit colours and the HDR content is 10-bit, your RGB10 image may show some banding because the display can't represent the extra bits of data. In the older article they refer to luminance (blacks/whites) and nits, so it might not be about colour bit depth support (there was an argument about older professional monitors that already supported 10-bit, while this new stuff is about the protocol for the extra metadata to be communicated from the system to the display over HDMI/DP, I think). There's a rough sketch of the quantization idea just below.

          Hopefully someone who understands it better can answer you properly
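
          As a rough illustration of that quantization point (my own toy example, nothing from the article): a smooth 10-bit ramp collapses onto far fewer 8-bit code values, which is where the banding comes from.

          # Toy sketch: why a 10-bit gradient bands on an 8-bit panel.
          def quantize(value_10bit: int, out_bits: int) -> int:
              """Map a 10-bit code value (0..1023) onto an out_bits-wide code value."""
              return round(value_10bit * ((1 << out_bits) - 1) / 1023)

          ramp_10bit = list(range(512, 544))                # 32 distinct input levels
          ramp_8bit = [quantize(v, 8) for v in ramp_10bit]  # neighbouring values collapse together

          print(len(set(ramp_10bit)), "input levels ->", len(set(ramp_8bit)), "output levels")
          # 32 input levels -> 8 output levels, so each 8-bit step covers a visible band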



          • #6
            Just a heads-up, HDR doesn't say anything about colors, only the maximum number of stops that can be represented. So if you look at a CIE slice you'll see two coordinates ((x,y) or (a,b), usually), with the third usually not shown. The third is luminance, and you need all three defined in order to define, for instance, the color space that a monitor is able to reproduce.
            Specs like HDR10/HDR10+, Dolby Vision and Rec. 2100 deal with both color space and contrast, amongst other things.
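
            To make the luminance point concrete, here's a small sketch (standard CIE xyY -> XYZ math, written from memory, so double-check it): the chromaticity pair (x, y) alone doesn't pin a color down; you also need the luminance Y to recover the full tristimulus values.

            # Sketch: chromaticity (x, y) plus luminance Y gives the full CIE XYZ color.
            def xyY_to_XYZ(x: float, y: float, Y: float) -> tuple[float, float, float]:
                """Convert CIE xyY (chromaticity + luminance) to CIE XYZ."""
                if y == 0:
                    return 0.0, 0.0, 0.0            # no luminance, pure black
                X = (x / y) * Y
                Z = ((1.0 - x - y) / y) * Y
                return X, Y, Z

            # Same chromaticity (roughly D65 white), two very different luminances:
            print(xyY_to_XYZ(0.3127, 0.3290, 100.0))   # bright white
            print(xyY_to_XYZ(0.3127, 0.3290, 1.0))     # same hue, 100x dimmer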



            • #7
              HDR on notebooks usually means roughly double the maximum brightness plus 10-bit colors ("more steps"). In the standard (non-HDR) mode, your display simply has that higher brightness available. In HDR mode, the standard colors are stretched into the middle of the range and you get bonus headroom below (darker colors) and above (brighter colors). So the standard colors no longer span the range 0-100% but roughly 25-75%, and you get bonus colors in the 0-25% and 75-100% ranges. HDR itself doesn't require 10-bit colors, but you will of course get color banding on 8-bit displays.
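
              Here's a toy sketch of the mapping I mean (not any real HDR tone-mapping curve, just the idea): the familiar SDR range gets squeezed into the middle of the output range, which also shows why an 8-bit panel ends up with fewer steps left for it.

              # Toy sketch: SDR squeezed into the 25-75% window of the HDR output range.
              def sdr_into_hdr_window(sdr_level: float) -> float:
                  """Map an SDR level in [0.0, 1.0] into the 25%-75% window of the output range."""
                  return 0.25 + 0.5 * sdr_level

              def to_panel_code(level: float, bits: int = 8) -> int:
                  """Quantize a [0.0, 1.0] level to an integer code on a bits-deep panel."""
                  return round(level * ((1 << bits) - 1))

              print(to_panel_code(sdr_into_hdr_window(0.0)))   # SDR black -> code 64, not 0
              print(to_panel_code(sdr_into_hdr_window(1.0)))   # SDR white -> code 191, not 255
              # Only codes 64..191 (128 steps) remain for SDR content on an 8-bit panel,
              # which is why banding shows up without a 10-bit display.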



              • #8
                One of the usual demos for an HDR display is the Sun coming around the Earth from a space satellite view. The absolute black of space with stars and then the brightness of the Sun rising over the curve. It is very dramatic.

