
AMD Continues Working Toward HDR Display Support For The Linux Desktop


  • #31
    I really hope they can work together and pull this off. I love Linux, but I feel like it's stuff like this that will forever keep it niche. HDR has been around for at least like 5 years. By the time this works on Linux everything will have moved to 8K and some new UltraHDR standard.



    • #32
      Originally posted by Anux View Post
      Sure, there is no need for standard office/desktop stuff to be HDR; you don't want to stress your eyes with bright lights while working 8 hours.

      Although HDR is a big improvement in picture quality, you don't need high-end quality everywhere. The jump from standard television to 1080p was much more valuable, and I don't know many people who like watching old 480i content, while there is no problem watching something without HDR.
      This shows how misled people are in their expectations of HDR. HDR shouldn't have been about making everything brighter. It should have been about giving bright spots in a scene a chance to keep their detail and colour instead of turning into a patch of white. In office / desktop stuff, this translates to a way to present metallic ink.

      But well... this is just my ideal. In real life, the movie studios are using HDR to make dark scenes dark and bright scenes bright, all in absolute luminosity, and the result only looks okay in a cinema / darkroom environment. For a common TV / computer / smartphone environment, the tone mapping should be done the way I described in the last paragraph. Let's see if game studios get it right one day, as game developers are less likely to keep pretending everyone will game in darkrooms.
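
      As a rough illustration (my own toy sketch, not taken from any standard or from the AMD talk), the difference between hard-clipping highlights and rolling them off so bright spots keep some detail looks like this:

          # Toy tone mapping comparison: hard clip vs. soft highlight rolloff.
          # Luminance is scene-referred, relative to diffuse white (1.0 = paper white).

          def hard_clip(x):
              # Everything brighter than diffuse white becomes the same flat white.
              return min(x, 1.0)

          def highlight_rolloff(x, shoulder=4.0):
              # Midtones pass through unchanged; everything above 1.0 is compressed
              # into a shoulder so bright spots still differ from each other.
              if x <= 1.0:
                  return x
              return 2.0 - 1.0 / (1.0 + (x - 1.0) / shoulder)

          for lum in (0.5, 1.0, 2.0, 8.0, 16.0):
              print(lum, hard_clip(lum), round(highlight_rolloff(lum), 3))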



      • #33
        Originally posted by mppix View Post

        This is well in progress but there is a lot to do...
        Such as not limiting it to GNOME.



        • #34
          Originally posted by Jabberwocky View Post
          In my biased experience, 99% of people who really use high-resolution HDR do not use Linux.
          That's what they used to say about gaming on Linux...



          • #35
            Originally posted by billyswong View Post
            This shows how misled people are in their expectations of HDR. HDR shouldn't have been about making everything brighter. It should have been about giving bright spots in a scene a chance to keep their detail and colour instead of turning into a patch of white. In office / desktop stuff, this translates to a way to present metallic ink.

            But well... this is just my ideal. In real life, the movie studios are using HDR to make dark scenes dark and bright scenes bright, all in absolute luminosity, and the result only looks okay in a cinema / darkroom environment. For a common TV / computer / smartphone environment, the tone mapping should be done the way I described in the last paragraph. Let's see if game studios get it right one day, as game developers are less likely to keep pretending everyone will game in darkrooms.
            Yes, studios have been playing around with dark scenes against bright scenes, but there is a point where the human eye kicks in and says no, that does not work. There was work in Blender on getting close to this within sRGB. That is why the AMD presentation has those pictures of X detail at Y brightness.

            There is a nightmare in HDR displaying sRGB content if it is done wrong. An example is Darth Vader in Star Wars in a particular scene: when the scene was shot he was standing in the background in front of a black screen, so he was not supposed to be visible.

            Part of people's expectations about HDR comes from the fact that most HDR monitors technically suck badly.

            We are now starting to see, as above, what is called true 10-bit. These have true black and a true 10 bits per pixel. The so-called HDR monitors with zone dimming are more like 8 bits per pixel with averaged brightness adjustment. It is still going to be a while before true 10-bit HDR makes it into consumer monitors.
            Monitor panel bit color depth may seem confusing, but this article will help simplify the pro's and con's of 10-bit vs 8-bit + Frame Rate Control (FRC).


            Yes, monitor makers claim 8-bit + FRC has been good enough. Note that this has led to a horrible problem: a person mastering video on a true 10-bit screen sees something that does not match, in any way, shape or form, what the end viewer will get on an 8-bit + FRC screen. Many studios are not bothering with HDR production because of this problem. With sRGB, what you see in production and what a person sees on cheaper screens is much closer.
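
            To illustrate what 8-bit + FRC actually does (a simplified toy model of my own, not how any particular panel controller works): the panel approximates a 10-bit level by flickering between the two nearest 8-bit levels over successive frames, so only the time average lands on the intended value.

                # Toy model of 8-bit + FRC: fake a 10-bit level by alternating between
                # the two nearest 8-bit levels across a 4-frame cycle.
                def frc_frames(level_10bit):
                    base, frac = divmod(level_10bit, 4)   # 10-bit = 8-bit level * 4 + remainder
                    high = min(base + 1, 255)
                    return [high] * frac + [base] * (4 - frac)

                frames = frc_frames(513)      # 10-bit 513 sits between 8-bit 128 and 129
                print(frames)                 # [129, 128, 128, 128]
                print(sum(frames) / 4 * 4)    # time-averaged value on the 10-bit scale: 513.0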

            billyswong I would not say people have been misled as such. Instead I would say the quality of the HDR experience has not been great. Something else to be aware of is that the sRGB we see on our current monitors is, in most cases, not ideal. People like old CRT monitors because they have a bigger contrast range.

            A bigger contrast range also means being able to show more colours. One of the horrible realities is that we will never be able to display metallic ink correctly on a monitor, and HDR does not change this. The issue is that metallic ink is not one colour. It is a funny problem: with metallic ink, the left and right eye in fact see slightly different colours. An HDR VR headset might be able to do metallic ink. Yes, a person who has lost an eye has a hard time telling the difference between metallic ink and non-metallic ink of the same colour. The bigger colour space gives HDR a better chance of reproducing one of the colours of a metallic ink. A CRT monitor, or an OLED monitor with individual per-pixel brightness control, also has a better chance of reproducing the colour of metallic ink thanks to its higher contrast, even in sRGB.

            The reality is that for the first 5 years of HDR the monitors have really not been ideal. The 8-bit + FRC way of emulating 10-bit HDR is not that much of a step up from 8-bit sRGB. Yes, 8-bit + FRC seriously gives the appearance that the HDR difference is just brightening up sections of the image. The step from 8-bit + FRC and 8-bit sRGB to true 10-bit HDR is quite a step up. Remember we also took quite a step down when we went from sRGB on CRT to LCD; 8-bit + FRC mostly ends up closing the gap between CRT sRGB quality and LCD quality.

            CRT sRGB was very close to the gold standard for sRGB quality, and OLED with per-pixel 10-bit HDR is the gold standard for HDR. 8-bit LCD screens have never been the gold standard in colour display.




            • #36
              Originally posted by billyswong View Post
              This shows how misled people are in their expectations of HDR. HDR shouldn't have been about making everything brighter.
              And it never was: it's about extending the SDR brightness of around 100 nits with highlights of 1000 nits and more. I just said that that's not what you want for office stuff because it stresses your eyes.

              and the result only looks okay in a cinema / darkroom environment.
              There is a reason why even cinemas display content only in dark environments. The surrounding light limits the black level and in turn the contrast you can display. For HDR in well-lit rooms you would need enormous brightness that gets into the realm of potential harm for your eyes.
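
              A rough back-of-the-envelope example of why the room matters (the numbers are made up purely for illustration):

                  # Effective contrast collapses once room light reflects off the panel.
                  peak_nits = 1000.0     # HDR highlight
                  black_nits = 0.0005    # OLED-class native black
                  reflected = 0.5        # light a bright room bounces back off every pixel

                  native = peak_nits / black_nits
                  effective = (peak_nits + reflected) / (black_nits + reflected)
                  print(f"native: {native:,.0f}:1")        # 2,000,000:1
                  print(f"lit room: {effective:,.0f}:1")   # roughly 2,000:1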

              Originally posted by oiaohm View Post
              The reality is that for the first 5 years of HDR the monitors have really not been ideal. The 8-bit + FRC way of emulating 10-bit HDR is not that much of a step up from 8-bit sRGB. Yes, 8-bit + FRC seriously gives the appearance that the HDR difference is just brightening up sections of the image. The step from 8-bit + FRC and 8-bit sRGB to true 10-bit HDR is quite a step up. Remember we also took quite a step down when we went from sRGB on CRT to LCD; 8-bit + FRC mostly ends up closing the gap between CRT sRGB quality and LCD quality.
              Cheap SDR displays are still 6-bit + FRC, and I highly doubt any CRT can hold up to the contrast of a VA display, even with 8-bit + FRC.



              • #37
                Originally posted by Anux View Post
                Cheap SDR displays are still 6-bit + FRC, and I highly doubt any CRT can hold up to the contrast of a VA display, even with 8-bit + FRC.
                Learn what a CRT display is, and how CRTs work.

                The reality is that an 8-bit CRT in fact has higher contrast than an 8-bit + FRC LCD. The issue is true black. A CRT also has a wider viewing angle over which it remains colour correct.

                Remember the cheapest SVGA CRTs were still 8-bit SDR. True 10-bit colour displays with a true LED per pixel have the same properties as the phosphors of CRT screens: large viewing angles with correct colour display and really close to true black.

                VA displays are the best of the LCD tech when it comes to how close they get to true black, but they are still a long way off compared to CRT.

                Anux it is a surprise to a lot of people. The big thing is the old CRT had good colour, but it was huge on the desk, bad on power usage and bad on radiation exposure. LCD provided a set of advantages that we traded away some colour correctness for. Per-pixel LED displays now allow getting that colour correctness back.

                Please note the first ever 10-bit HDR screen was not an LCD screen but was in fact a CRT screen, so yes, there are CRT screens that can absolutely beat 8-bit + FRC VA displays.



                • #38
                  There is good progress already happening in Wayland around color management as well as HDR. There are a lot of edge cases and decisions that have to be made, but they are trying to do it "right" this time, like having surfaces announce their color space, bit depth, HDR transfer curve and so on to the compositor instead of letting the application do the color management / transformation itself.
                  But this comes with other challenges like performance: you have to do the color space transformation and gamma curve calculation on the GPU, for example, as otherwise the buffer would need to travel between GPU and CPU so that the color transformation could happen there, which would have a serious performance impact.
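
                  As a very rough illustration of the per-pixel math involved, here is a sketch of converting an sRGB pixel onto an HDR10 (BT.2020 + PQ) framebuffer. This is my own simplification written in Python for readability; a compositor would run it in a shader or with fixed-function LUT/matrix hardware, and the 203-nit SDR white level is just an assumed policy choice:

                      # Sketch: map an sRGB (BT.709) pixel onto an HDR10 (BT.2020 + PQ) target.
                      SDR_WHITE_NITS = 203.0    # assumed reference level for SDR white
                      PQ_PEAK_NITS = 10000.0    # the PQ curve is defined up to 10,000 nits

                      def srgb_to_linear(c):    # sRGB EOTF, per channel, values in 0..1
                          return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

                      BT709_TO_BT2020 = [       # linear-light primaries conversion (BT.2087)
                          (0.6274, 0.3293, 0.0433),
                          (0.0691, 0.9195, 0.0114),
                          (0.0164, 0.0880, 0.8956),
                      ]

                      def matmul3(m, v):        # 3x3 matrix times an RGB triple
                          return [sum(a * b for a, b in zip(row, v)) for row in m]

                      def pq_encode(y):         # SMPTE ST 2084 inverse EOTF, y = nits / 10000
                          m1, m2 = 2610 / 16384, 2523 / 4096 * 128
                          c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
                          yp = y ** m1
                          return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

                      def srgb_pixel_to_hdr10(r, g, b):
                          lin = [srgb_to_linear(c) for c in (r, g, b)]
                          rgb2020 = matmul3(BT709_TO_BT2020, lin)
                          scale = SDR_WHITE_NITS / PQ_PEAK_NITS   # place SDR white on the PQ scale
                          return [round(pq_encode(c * scale), 3) for c in rgb2020]

                      print(srgb_pixel_to_hdr10(1.0, 1.0, 1.0))   # sRGB white -> PQ values near 0.58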

                  There are a lot of merged changes already: https://gitlab.freedesktop.org/wayla...sts?label_name[]=Colour%20management
                  Last edited by Spacefish; 07 October 2022, 07:49 PM.



                  • #39
                    This compares a CRT with an unnamed LCD from 2005? Probably a TN display, judging by the description.

                    The reality is that an 8-bit CRT in fact has higher contrast than an 8-bit + FRC LCD.
                    Most CRTs were analog and therefore had no bit depth. Also, contrast has nothing to do with bit depth. Contrast is defined by the brightest and darkest pixel a monitor can display at the same time.

                    The issue is true black
                    Even the phosphor screen glows with black pictures, and that's brighter than at least an OLED. You also had a glow around high-contrast lines. If you did the ANSI checkerboard test on a CRT, it would be worse than any LCD type.
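
                    For reference, ANSI contrast is just the average luminance of the white patches divided by the average of the black patches, all measured with the checkerboard on screen at once so any internal glow is included (the readings below are invented for illustration):

                        # ANSI checkerboard contrast: mean white patch / mean black patch, measured together.
                        white_patches = [190.0, 188.0, 192.0, 189.0, 191.0, 190.0, 187.0, 193.0]  # nits
                        black_patches = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.9, 1.1]                  # nits

                        ansi = (sum(white_patches) / len(white_patches)) / (sum(black_patches) / len(black_patches))
                        print(f"{ansi:.0f}:1")   # 190:1 for this made-up glow-limited display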

                    VA displays are the best of the LCD tech when it comes to how close they get to true black, but they are still a long way off compared to CRT.
                    What? I had multiple CRTs in my life and now I have a VA display. To not split hairs, let's say VA and CRT are equal, but OLED is still darker.
                    Edit: Actually you're right, CRT has darker blacks than VA. But VA is already dark enough that you have more problems with light reflecting off the walls than with display glow. So unless your walls are black/unreflective and no one sits in front of the CRT, you probably gain nothing from the darker black.

                    CRT had good colour
                    I replaced my CRT with an IPS display; yes, contrast and black levels were worse, but color was definitely better on the IPS. And that was 8-bit vs 8-bit (the CRT only ever got 8-bit signals from me).

                    Please note the first ever 10-bit HDR screen was not an LCD screen but was in fact a CRT screen
                    That may have been 10-bit (I never saw a high-end "photo editing" CRT), but it certainly was not HDR, because the max brightness on CRTs was much too low. Also, those weren't screens we had at home for our PCs or TVs.

                    At the time of the change, LCD was really a step back in image quality (mostly TN), and we had another step back in resolution with the 4:3 to 16:9 transition. But today even modestly priced LCDs are better than CRTs in nearly all categories.



                    • #40
                      Originally posted by Spacefish View Post
                      But this comes with other challenges like performance: you have to do the color space transformation and gamma curve calculation on the GPU, for example, as otherwise the buffer would need to travel between GPU and CPU so that the color transformation could happen there, which would have a serious performance impact.
                      The talk was specifically about doing those with dedicated fixed-function hardware.
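
                      For a sense of what that hardware does: the classic per-plane pipeline is a 1D degamma LUT, then a 3x3 color transform matrix, then a 1D regamma LUT. The sketch below is my own simplified model with placeholder tables, not AMD's actual hardware layout:

                          # Simplified fixed-function color pipeline: 1D LUT -> 3x3 matrix -> 1D LUT.
                          def lut_lookup(lut, c):
                              # Nearest-entry lookup; real hardware interpolates between entries.
                              i = min(int(c * (len(lut) - 1) + 0.5), len(lut) - 1)
                              return lut[i]

                          def run_pipeline(pixel, degamma, matrix, regamma):
                              linear = [lut_lookup(degamma, c) for c in pixel]
                              mixed = [sum(m * v for m, v in zip(row, linear)) for row in matrix]
                              return [lut_lookup(regamma, max(0.0, min(1.0, c))) for c in mixed]

                          N = 256
                          degamma = [(i / (N - 1)) ** 2.2 for i in range(N)]        # gamma 2.2 -> linear
                          regamma = [(i / (N - 1)) ** (1 / 2.2) for i in range(N)]  # linear -> gamma 2.2
                          identity = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]              # placeholder CTM

                          print(run_pipeline([0.5, 0.25, 1.0], degamma, identity, regamma))  # ~[0.5, 0.25, 1.0]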

