
AMD GPU Driver Developers Pursuing New HDR Display Work For Linux


  • AMD GPU Driver Developers Pursuing New HDR Display Work For Linux

    Phoronix: AMD GPU Driver Developers Pursuing New HDR Display Work For Linux

    One of the areas of Linux desktop display support that isn't as well supported compared to Windows is high dynamic range (HDR) displays. There have been various vendors and developers over the years working towards Linux desktop HDR improvements, but it still hasn't been a fast-advancing area in the open-source ecosystem. At least now AMD Radeon graphics driver developers do appear to be working on HDR improvements...

    https://www.phoronix.com/scan.php?pa...R-Display-2021

  • #2
    Great news. HDR is one of those things that is really nice once you get your set configured for it.

    On my TV the trick was switching to Game Mode. All the other video modes, including PC Input, did some post-processing that made reds, oranges, and yellows overly bright and caused them to bleed.



    • #3
      True HDR displays cost a lot of money. Most of what you find out there are mediocre (by HDR standards) panels with only 300/400 nits, nowhere near the 1000/1400 nits of a real one. Hence the lack of enthusiasm among the Linux (developer) crowd for these things.

      I imagine that, having to choose, most developers would invest in a better CPU or more memory rather than in a fancy display that features something they don't need in their normal workflow.



      • #4
        Originally posted by [email protected] View Post
        True HDR displays cost a lot of money. Most of what you find out there are mediocre (by HDR standards) panels with only 300/400 nits, nowhere near the 1000/1400 nits of a real one. Hence the lack of enthusiasm among the Linux (developer) crowd for these things.

        I imagine that, having to choose, most developers would invest in a better CPU or more memory rather than in a fancy display that features something they don't need in their normal workflow.
        It's not about enthusiasm; there are people who actually need it and have to use macOS for that. Implementing this would be a step towards winning more media-content-creator users. Also, why would the big proprietary companies port their proprietary software when its usage would be crippled by the lack of this fundamental feature in the system?



        • #5
          Good, but what the fuck takes them so long to solve this, at least partially?
          On Windows, the madVR developer managed to do it single-handedly, without any funding, for movies, while AMD, with lots of funding, takes years to do it.
          And BTW, madVR sends the video HDR metadata to the TV by itself, not using any Windows API, as it works on Windows 7 too.
          AMD should've been able to do it much faster since they are the creators of the hardware (HDMI and DisplayPort included) and the drivers, so they know everything there is to know.
          The same goes for CEC: not being able to control Kodi or other media centers from the TV's remote control because AMD doesn't want to implement it.



          • #6
            Originally posted by Danny3 View Post
            The same goes for CEC: not being able to control Kodi or other media centers from the TV's remote control because AMD doesn't want to implement it.
            Does it work in Windows? I thought CEC wasn't possible because the cards don't have a CEC chip, to save on licensing costs. You can use a CEC HDMI->USB splitter to make it work.



            • #7
              Originally posted by Danny3 View Post
              Good, but what the fuck takes them so long to solve this, at least partially?
              On Windows, the madVR developer managed to do it single-handedly, without any funding, for movies, while AMD, with lots of funding, takes years to do it.
              And BTW, madVR sends the video HDR metadata to the TV by itself, not using any Windows API, as it works on Windows 7 too.
              AMD should've been able to do it much faster since they are the creators of the hardware (HDMI and DisplayPort included) and the drivers, so they know everything there is to know.
              On Windows, Microsoft decided what the protocol would be and dictated what the drivers needed to do. On Linux you have to get all of the various desktops, toolkits, media apps, and driver developers on board and agreeing on a solution. A vendor-specific solution won't fly upstream. The code is there in the driver if you want to play with it directly yourself. The trick is choosing the right API that everyone can agree on.
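
              For reference, recent kernels do expose a driver-level hook for this: DRM has an HDR_OUTPUT_METADATA connector property that userspace fills with a blob in the uapi struct hdr_output_metadata layout (HDR10 static metadata). A rough Python sketch of what such a blob contains; the exact field packing shown here is an assumption of this sketch, and the helper name is made up:

```python
# Hypothetical sketch: build an HDR10 static-metadata blob shaped like the
# Linux uapi struct hdr_output_metadata, as handed to the kernel via the
# HDR_OUTPUT_METADATA DRM connector property. Field order/packing here is
# an assumption; check the uapi headers before relying on it.
import struct

HDMI_STATIC_METADATA_TYPE1 = 0
EOTF_SMPTE_ST2084 = 2  # PQ, the HDR10 transfer function

def hdr10_metadata_blob(primaries, white, max_mdl, min_mdl, max_cll, max_fall):
    """primaries: three (x, y) chromaticities; white: (x, y) white point.
    max_mdl/max_cll/max_fall in nits; min_mdl in 0.0001-nit units (CTA-861)."""
    to_u16 = lambda chroma: round(chroma / 0.00002)  # CTA-861 fixed point
    fields = [HDMI_STATIC_METADATA_TYPE1,   # __u32 metadata_type
              EOTF_SMPTE_ST2084,            # __u8 eotf
              HDMI_STATIC_METADATA_TYPE1]   # __u8 metadata_type (infoframe)
    for x, y in primaries + [white]:
        fields += [to_u16(x), to_u16(y)]
    fields += [max_mdl, min_mdl, max_cll, max_fall]
    return struct.pack("<IBB12H", *fields)

blob = hdr10_metadata_blob(
    primaries=[(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],  # BT.2020
    white=(0.3127, 0.3290),                                      # D65
    max_mdl=1000, min_mdl=50, max_cll=1000, max_fall=400)        # 0.005-nit floor
```

              In real use the blob would be registered with drmModeCreatePropertyBlob() and attached to the connector property; the point is just that the metadata madVR sends on Windows has a defined home on Linux too.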

              Originally posted by Danny3 View Post
              The same goes for CEC: not being able to control Kodi or other media centers from the TV's remote control because AMD doesn't want to implement it.
              AMD had CEC support in several generations of hardware, but no partners chose to productize it. It was eventually removed to save die space.



              • #8
                Originally posted by [email protected] View Post
                I imagine that, having to choose, most developers would invest in a better CPU or more memory rather than in a fancy display that features something they don't need in their normal workflow.
                Yes, unless their workstation is also a gamestation.

                Just a random note: Stadia supports HDR without the need to invest money in a fast GPU - 4K cloud gaming requires only an iGPU or a cheap discrete GPU. Cloud gaming has worse visual quality than local gaming, but this will probably improve with new compression algorithms on the server and image post-processing on the client.



                • #9
                  Originally posted by [email protected] View Post
                  True HDR displays cost a lot of money. Most of what you find out there are mediocre (by HDR standards) panels with only 300/400 nits, nowhere near the 1000/1400 nits of a real one. Hence the lack of enthusiasm among the Linux (developer) crowd for these things.
                  HDR display, to my knowledge, is about showing colors brighter than paper-white (or sometimes, also colors darker than ink-black) compared to the viewer's surrounding real objects. Unless a monitor / projector is used in a darkroom / cinema, dumping a scene-referred signal to the output display is NOT going to work as wished, since the lighting environment will mix in and mess things up even if one has a "perfect" display.

                  Now the question is, when one is viewing documents / images (moving images are also images) in daily-life lighting, why would someone want something many times brighter than paper-white shining at and burning their eyes? Every time I see talk of those high-nit displays, I feel indifferent. At home I tune my LCD monitor to "0 brightness" and then halve the value of each individual RGB channel compared to the default setting. If an HDR display can show off its 10-bit or 12-bit per-channel deep color at my preferred #FFFFFF brightness, it probably won't need anything higher than 400 nits. On the other hand, if those "true HDR" displays can only achieve their deep color when #FFFFFF is a lot brighter than my liking, then it is not deep color in my experience, since I won't tune #FFFFFF that bright.

                  Maybe if we can think up HDR usage outside gaming / movies, there will be more enthusiasm to implement HDR in the open-source crowd? How about emulating fluorescent and metallic inks on paper? We would get syntax highlighting that is literally "highlighted"~
                  Last edited by billyswong; 30 April 2021, 11:07 AM.



                  • #10
                    Originally posted by billyswong View Post
                    On the other hand, if those "true HDR" displays can only achieve their deep color when #FFFFFF is a lot brighter than my liking, then it is not deep color in my experience, since I won't tune #FFFFFF that bright.
                    HDR maps a much wider color and brightness space than standard RGB. Your 24-bit #FFFFFF covers only a small portion of the 10- or 12-bit HDR range.

                    I have a G-Sync HDR monitor for my Windows gaming PC. In HDR mode, as far as I can tell, you cannot set the brightness at all. The monitor will display its brightest white when it is commanded to. You adjust the brightness of SDR applications in one of the Windows settings panels.

                    So you can set your 255, 255, 255 SDR RGB to a dimmer setting. But if you create an HDR surface and set the pixels to 1023, 1023, 1023, it is going to try to blind you, especially because that isn't a linear change: the change from 512 to 1023 is more than twice the brightness change from 255 to 511.
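
                    That nonlinearity can be sanity-checked numerically. Assuming the display is decoding HDR10's PQ curve (SMPTE ST 2084), a small sketch of my own, not from the thread:

```python
# Illustration of the SMPTE ST 2084 "PQ" EOTF used by HDR10: equal steps in
# 10-bit code values are wildly unequal steps in luminance. The constants
# are the published ST 2084 values.

M1 = 2610 / 16384        # 0.1593...
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal in [0, 1] to absolute luminance in nits."""
    n = signal ** (1 / M2)
    return 10000.0 * (max(n - C1, 0.0) / (C2 - C3 * n)) ** (1 / M1)

def code_to_nits(code: int, bits: int = 10) -> float:
    """Full-range code value (e.g. 0..1023 for 10-bit) to nits."""
    return pq_eotf(code / (2 ** bits - 1))

low_step = code_to_nits(511) - code_to_nits(255)    # roughly 5 -> 92 nits
high_step = code_to_nits(1023) - code_to_nits(512)  # roughly 93 -> 10000 nits
assert high_step > 2 * low_step
```

                    With these constants, code 255 decodes to roughly 5 nits, 511 to roughly 92 nits, and 1023 to the full 10,000 nits, so the top half of the code range carries almost all of the luminance.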

                    The result is that, on an HDR1000 display, an explosion really does look like an explosion, or like your character just looked directly at the Sun. Sneaking through a dark area and having a flash-bang grenade go off really is a shocking experience.

                    HDR is great. But not for working on text documents. That is not what it's for.
