AMD GPU Driver Developers Pursuing New HDR Display Work For Linux


  • #11
    Originally posted by billyswong View Post
    Now the question is, when one is viewing documents / images (moving images are also images) in daily-life lighting, why would someone want something many times brighter than paper-white shine and burn their eyes?
    Proper HDR mode implies being able to set how SDR content is mapped to HDR. Windows lets you set the maximum brightness of SDR content when in HDR mode. In other words, your documents will still show at the brightness level you want, even in HDR mode. It's a non-issue.
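    As a rough sketch of what that SDR-brightness setting does (illustrative only; the function name and the 200-nit default are my own assumptions, not Windows' actual pipeline):

```python
# Sketch: map SDR content into an HDR signal, with a user-set knob for
# how bright SDR white should be (like the Windows "SDR content
# brightness" slider mentioned above).

def sdr_to_nits(sdr_value, sdr_white_nits=200.0):
    """Map a normalized SDR value (0.0-1.0, gamma 2.2) to absolute nits."""
    linear = sdr_value ** 2.2          # undo the SDR gamma encoding
    return linear * sdr_white_nits     # scale so SDR white hits the chosen level

# With the slider at 200 nits, full SDR white lands at exactly 200 nits,
# even though the panel may peak at 1000+ nits for real HDR content.
print(sdr_to_nits(1.0))   # 200.0
print(sdr_to_nits(0.5))   # roughly 43.5
```

    So documents stay at whatever paper-white level you pick, and only genuinely HDR content uses the headroom above it.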

    Comment


    • #12
      Thanks for the explanation, guys. No, I have no interest in an explosion scene harming my eyes realistically. If HDR screens are to replace all SDR screens and become the new standard, they need a knob / setting for clamping the maximum brightness.

      Comment


      • #13
        Originally posted by billyswong View Post
        No, I have no interest in an explosion scene harming my eyes realistically.
        It isn't going to physically hurt your eyes. No display is that powerful.

        Comment


        • #14
          Photographers and cinematographers measure light in stops, with one stop being either a doubling or halving of brightness. When you are looking at a sunset with objects casting shadows, the difference in brightness from the deepest black to the brightness of the sun is about 20 stops. That is the dynamic range. An Arri Alexa camera can capture about 18 stops. A normal cinema camera, about 14 - 16. A fairly high-end DSLR/mirrorless camera, ~14. A Rec.709 monitor, like the vast majority of people are using, is 8 stops. A movie theatre has about 10 stops. Display technology totally sucks compared to camera technology.
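          To put those stop counts in ratio terms (a back-of-envelope sketch; one stop is a factor of two in light, so N stops span a 2**N contrast ratio):

```python
# Convert dynamic range in stops to a contrast ratio. The stop counts
# below are the rough figures quoted in the comment above.

def stops_to_contrast(stops):
    return 2 ** stops

for name, stops in [("Real sunset scene", 20),
                    ("High-end cinema camera", 16),
                    ("DSLR/mirrorless", 14),
                    ("Rec.709 monitor", 8)]:
    print(f"{name}: ~{stops_to_contrast(stops):,}:1")

# A 20-stop scene spans roughly 1,048,576:1, while an 8-stop display
# manages only 256:1 -- a gap of twelve stops, or a factor of ~4096.
```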

          When the scene you are capturing exceeds the dynamic range of the technology, the bright colors are clipped and become white, and the darkest colors are clipped and become black. Both lead to a loss of detail. A prime example you see everywhere is an interior shot where the windows are blown out and go white due to too much dynamic range in the shot. That washed-out look you get on a monitor in a brightly lit room isn't a loss of dynamic range, it is a loss of contrast. A monitor with a higher dynamic range can help counteract that by overpowering the room light. Yes, that will be hard on the eyes if you stare at it for 8 hours straight, but all monitors, HDR or SDR, allow you to control the brightness. That is needed for calibrating. And no, they are not going to burn out your eyes.

          HDR isn't just pixel values; it also includes metadata that the display uses to turn the backlight up or down to help make brights brighter and darks darker. I spend pretty well my entire day looking at HDR monitors, and on days when I am color grading I do it in a blacked-out room. If you are working on a text document it will just look like an SDR display.
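          A toy illustration of the clipping described above (the nit values are made-up examples, not real display specs):

```python
# Scene luminance values that fall outside what the display can
# represent get clamped at both ends, destroying detail in the
# highlights (blown-out windows) and the shadows.

def display_clip(scene_nits, black=0.5, peak=100.0):
    """Clamp scene luminance to an SDR-like display's range."""
    return [min(max(v, black), peak) for v in scene_nits]

# Deep shadow detail and the 5000-nit window both collapse to flat values.
scene = [0.01, 0.5, 50.0, 100.0, 5000.0]
print(display_clip(scene))   # [0.5, 0.5, 50.0, 100.0, 100.0]
```

          Raising the peak (as HDR displays do) widens that clamp window, which is why HDR recovers highlight detail that SDR throws away.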

          Professionals doing color grading these days will grade on an HDR monitor and then output different versions depending on where the material will be used: broadcast, movie theaters, etc. You can't color grade on an SDR monitor, then output HDR, and expect good results. Color grading in HDR, even if you're not planning on displaying it in HDR now, leaves it future-proofed. The fact that you can't use an HDR monitor with Linux means it is not useful for professionals.
          Last edited by MadeUpName; 30 April 2021, 02:19 PM.

          Comment


          • #15
            I am shining on the inside now

            Comment


            • #16
              Originally posted by billyswong View Post

              HDR display in my knowledge is about showing colors brighter than paper-white

              _____________________________

              Maybe if we can think up HDR usage outside gaming / movies, there will be more enthusiasm to implement HDR in the open source crowd? How about emulating fluorescence and metallic ink on paper? We would get syntax highlighting that is literally "highlighting"~
              About the first point: it's not apples to apples, but what can be said about bad HDR can be said about good SDR screens:
              https://www.youtube.com/watch?v=0srgPxK-WhQ

              If you don't need games and videos on screen or pro video editing, a 1080p 30 Hz LCD is enough for you, no?
              Most people who actively use computers do at least some of this on a daily basis. Heck, even YouTube supposedly supports HDR.

              Comment


              • #17
                Originally posted by billyswong View Post
                Thanks for the explanation, guys. No, I have no interest in an explosion scene harming my eyes realistically. If HDR screens are to replace all SDR screens and become the new standard, they need a knob / setting for clamping the maximum brightness.
                Nope, you can't really do that. Someone said something stupid and someone else repeated it.

                You do have light bulbs in your room, or have seen the sun, right?
                And even then, HDR monitors are NOWHERE near as bright as those two. Otherwise everyone would buy screens to light up giant spaces, like football stadiums. It's LCD, so it would save tonnes of money on power, right?

                Comment


                • #18
                  Originally posted by [email protected] View Post
                  True HDR displays cost a lot of money. Most of what you find out there are mediocre (by HDR standards) panels with only 300/400 nits, nowhere near the 1000/1400 nits of a real one. Hence the lack of enthusiasm in the Linux (developer) crowd for these things.
                  Chicken-egg situation.

                  Comment


                  • #19
                    Originally posted by MadeUpName View Post
                    Professionals doing color grading these days will grade on an HDR monitor and then output different versions depending on where the material will be used: broadcast, movie theaters, etc. You can't color grade on an SDR monitor, then output HDR, and expect good results. Color grading in HDR, even if you're not planning on displaying it in HDR now, leaves it future-proofed. The fact that you can't use an HDR monitor with Linux means it is not useful for professionals.
                    These days the monitor probably will be HDR. But back in 2012 there were professional 10-bit color monitors like the HP ZR30w I am using right now. They were used professionally. Worked very well too.

                    One interesting thing is that with the correct software and color profile this monitor can mostly reproduce HDR 400. Back before HDR was implemented on computer hardware, that is how cinema professionals at Pixar, Disney and other places did it.

                    Comment


                    • #20
                      Originally posted by Zan Lynx View Post

                      These days the monitor probably will be HDR. But back in 2012 there were professional 10-bit color monitors like the HP ZR30w I am using right now. They were used professionally. Worked very well too.

                      One interesting thing is that with the correct software and color profile this monitor can mostly reproduce HDR 400. Back before HDR was implemented on computer hardware, that is how cinema professionals at Pixar, Disney and other places did it.
                      Sure, and if they were selling it today it would have a big fat HDR sticker on it. The problem is you can't feed it an HDR signal from Linux, so it would work as a crazy expensive SDR monitor. Beyond that, no one was delivering in Rec.2020 because it didn't exist then, never mind the ACES color space, which is now the industry standard for intermediates.

                      Comment
