
Wayland Color Management Protocol Posted For Weston


  • #91
    Originally posted by mSparks View Post
    ahahahaahaaa,

    Quality

    HDR/SDR is a backlighting issue.

    Try again.

    Yes. If you display some of the REC2020 test images on a zoned monitor, you notice that points which should be the same color are not in fact the same color; adjusting the backlight to expand the range causes this problem.
    It is not backlighting as such: the hack that zoned monitors use to take an 8-bit LCD and attempt to make it produce something like REC2020 does not work.

    I think you need the triple facepalm.


    Note that the monitors which can in fact render REC2020 correctly have no zoned backlight. This is not a backlighting issue.

    The reason for using a zoned backlight in the first place is that the LCD part of the monitor cannot do the color space for HDR.

    Comment


    • #92
      Originally posted by oiaohm View Post

      Try again.

      It is not backlighting as such: the hack that zoned monitors use to take an 8-bit LCD and attempt to make it produce something like REC2020 does not work.

      I think you need the triple facepalm.

      Note that the monitors which can in fact render REC2020 correctly have no zoned backlight. This is not a backlighting issue.

      The reason for using a zoned backlight in the first place is that the LCD part of the monitor cannot do the color space for HDR.
      Dude, HDR is a completely different color space than SDR.
      the bit width of the data is different: it needs more than 0-255 per color (actually only 16 to 235 on most TVs), something Wayland is completely incapable of dealing with at the moment (actually you can, on NVIDIA, if you log in to X11 first and configure the display, then log out and log back in to Wayland), and trying to push 0-255 per colour to a display that only renders 16 to 235 looks horrific.
      It has absolutely nothing to do with dark blacks, backlighting, or any other tangentially related topic you feel like incorrecting everyone on.
      Last edited by mSparks; 05 October 2023, 12:36 PM.

      Comment


      • #93
        Originally posted by mSparks View Post
        Dude, HDR is a completely different color space than SDR.
        The HDR color space is meant to be REC2020, which is 10 to 12 bits per channel. And, to make it worse, REC2100 and REC2020 define where SDR sits inside the HDR color space. So by the standard, SDR is officially a subset of HDR.

        Yes, the standard document refers to SDR as REC709, and that is basically sRGB. The idea that SDR is a completely different color space is not right.

        SDR and HDR are by specification meant to use the same white point. By REC2020/REC2100, SDR is by specification a subset of HDR. So the HDR color space is meant to be able to reproduce SDR perfectly at the same time as HDR content, if the monitor actually meets the specification (meaning 90%+ REC2020/REC2100 coverage).

        The idea that they are completely different color spaces ignores the fact that the HDR color space defines where SDR should sit inside it.

        Originally posted by mSparks View Post
        the bitwidth of the data is different (needs more than 0-255 per color - actually only 16 to 235 on most TVs,
        Is a TV with 16 to 235 for SDR output in fact correct by REC2020? The answer is no, it is not.

        Most TVs cover less than 70% of REC2020/REC2100 in HDR when you check flat color generation, and they do worse when you start checking individual pixels. Yes, most HDR TVs and monitors truly do look horrific once you know what you are looking at: their HDR output is so far out of color space specification it is not funny. All the horrific ones have one thing in common: zoned backlighting.
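The "SDR is a subset of HDR" point above can be checked numerically. Here is a minimal sketch (not from this thread; it assumes the published BT.709/BT.2020 chromaticities and D65 white) that derives both RGB-to-XYZ matrices from their primaries and maps BT.709 colors into BT.2020:

```python
# Sketch: derive RGB->XYZ matrices for BT.709 and BT.2020 from their
# published primaries, then map BT.709 (SDR) colors into the BT.2020
# (HDR) container to show they land inside it.
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """primaries: [(x, y)] for R, G, B; white: (x, y). Standard derivation."""
    M = np.array([[x, y, 1 - x - y] for x, y in primaries]).T  # columns R,G,B
    wx, wy = white
    W = np.array([wx / wy, 1.0, (1 - wx - wy) / wy])  # white point at Y = 1
    return M * np.linalg.solve(M, W)                  # scale each channel

BT709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
D65    = (0.3127, 0.3290)

M_709_to_2020 = np.linalg.inv(rgb_to_xyz_matrix(BT2020, D65)) @ \
                rgb_to_xyz_matrix(BT709, D65)

# Pure BT.709 red/green/blue map to non-negative, in-gamut BT.2020
# triples, and white maps to white: the SDR gamut fits inside the container.
for rgb in np.eye(3):
    print(rgb, "->", (M_709_to_2020 @ rgb).round(4))
print("white ->", (M_709_to_2020 @ np.ones(3)).round(4))
```

If the derivation is right, white maps to (1, 1, 1) and every BT.709 primary lands strictly inside the BT.2020 unit cube, which is what "SDR is a subset of HDR" means in gamut terms.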

        Comment


        • #94
          Originally posted by oiaohm View Post
          Is a TV with 16 to 235 for SDR output in fact correct by REC2020? The answer is no, it is not.

          Most TVs cover less than 70% of REC2020/REC2100 in HDR when you check flat color generation, and they do worse when you start checking individual pixels. Yes, most HDR TVs and monitors truly do look horrific once you know what you are looking at: their HDR output is so far out of color space specification it is not funny. All the horrific ones have one thing in common: zoned backlighting.
          HDR is any display format that uses more than 8 bits per colour. But Wayland probably needs to get the basics down for the amateurs before it starts worrying about prosumer use.

          Last edited by mSparks; 05 October 2023, 01:59 PM.

          Comment


          • #95
            Originally posted by mSparks View Post
            HDR is any display format that uses more than 8 bits per colour. But Wayland probably needs to get the basics down for the amateurs before it starts worrying about prosumer use.
            In this post I’ll talk a bit about HDR and color management, and where we are with implementing them in KWin. Before jumping into the topic though, I need to add a disclaimer: I will be simplifying a lot of things significantly, leaving others out entirely and as I am by far not a color expert, almost certainly write a few things that are wrong. If you want more credible sources and dive into the details of how all the color stuff works, I recommend you have a look at the color-and-hdr repository instead of this post.


            Try again. HDR is more than 8 bits per color. SDR displayed on HDR appears washed out, with all kinds of problems, when the monitor itself does not meet the color space specification. That is the reality of it.

            The problem I have been talking about is with the more-than-8-bits-per-colour case, where you need to display HDR and SDR at the same time on the same screen.

            Comment


            • #96
              Originally posted by oiaohm View Post
              Try again. HDR is more than 8 bits per color.
              Try again to assist you learning to read?
              HDR is any display format that uses more than 8 bits per colour.

              aiui, the most common atm is HDR10, aka 10 bits per colour.

              To get that, the image needs to be in 10 bits per colour, as does the display.
              Originally posted by oiaohm View Post
              The problem I have been talking about is with the more-than-8-bits-per-colour case, where you need to display HDR and SDR at the same time on the same screen.
              SDR is 8 bits per colour.
              Most displays (not just monitors) actually use less than the full 8 bits: 16-235.

              It's really not that hard to scale between them, although adapting to the proper color-corrected space is a very specific knowledge area. Unfortunately, to date Wayland has not even started to try to get the very basics working for the 90% of SDR displays in the wild.

              I'm beginning to suspect that's because Wayland developers don't know important details, like the fact that 8 bits has a range of 0-255.
              Last edited by mSparks; 05 October 2023, 02:51 PM.
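The "scale between them" step is simple enough to sketch. A minimal example (an assumption on my part: plain linear scaling of 8-bit codes, ignoring any actual color management) converting between full range 0-255 and the limited range 16-235 discussed above:

```python
# Sketch: convert 8-bit full-range codes (0-255) to limited-range
# video levels (16-235) and back. Limited range spans 219 steps.

def full_to_limited(v: int) -> int:
    """Map a full-range 8-bit code (0-255) to limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Map a limited-range code back to full range, clamping out-of-range input."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255
```

The rounding means the round trip is not lossless: 256 full-range codes squeeze into 220 limited-range ones, which is one reason naive conversions introduce banding.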

              Comment


              • #97
                you guys are neither right. first and foremost, HDR is not a spec; there are multiple specs that provide HDR, like HDR10(+), Dolby Vision, and the various HDR400, HDR600, etc., and not every spec shares the same requirements.

                SDR and HDR without any qualifiers are not themselves specifications, and can only be taken literally as lower and higher dynamic range; these are terms with no definition unless you talk about a specific spec. I've alluded to it earlier, but the most common lowest-common-denominator test for whether something is "HDR" is whether it is graded and contains a transfer which allows for a large luminance range.

                As Troy Sobotka said on the Olive video editor Discord: "The absurdity of labeling something SDR vs HDR is that when one asks someone what they mean, they can't answer it in any useful manner. SDR vs HDR holds meaning in only two cases: 1. dot HDR the file encoding. 2. HDR as a display encoding. Beyond that, it is a meaningless term. The dynamic range of footage exists outside of a display encoding."
                HDR is more than 8 bits per color
                this is not true AT ALL. SOME HDR specs require more than 8-bit color depth, and you SHOULD have more than 8 bits or else you can get visible issues, but many monitors allow you to trigger "HDR" modes using 8-bit color.

                > SDR on HDR appears washed out and all kind of problems when the monitor itself is not to color space specification that is the reality of it.
                This is rarely ever the issue. Even if you had a perfect grading monitor, unless you are using a properly color-managed setup it will always look washed out. On the flip side, on a properly color-managed setup that can handle tonemapping, it never needs to look washed out unless the HDR content looks washed out too.
                > SDR is 8 bits per colour.
                There is no hard requirement on bit depth for SDR; RGB565 can produce a valid "SDR" picture no problem.

                so first and foremost, decide on a common terminology for what you are discussing, because the two of you are not talking about the same thing and think that you are.
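The RGB565 point is easy to make concrete. A small sketch (helper names are mine, not from any library) packing and unpacking the 16-bit format, which still yields a perfectly usable "SDR" picture at 5-6-5 bits per channel:

```python
# Sketch: pack an 8-bit-per-channel RGB triple into 16-bit RGB565
# (5 bits red, 6 bits green, 5 bits blue) and expand it back.

def pack_rgb565(r: int, g: int, b: int) -> int:
    """8-bit channels -> 16-bit RGB565 (truncating the low bits)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_rgb565(v: int) -> tuple:
    """RGB565 -> approximate 8-bit channels, using bit replication."""
    r = (v >> 11) & 0x1F
    g = (v >> 5) & 0x3F
    b = v & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

print(hex(pack_rgb565(255, 255, 255)))  # 0xffff
print(unpack_rgb565(0xFFFF))            # (255, 255, 255)
```

Bit replication on unpack keeps black at 0 and white at 255, so the reduced bit depth costs intermediate precision (banding), not range, which is exactly why bit depth alone says nothing about dynamic range.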

                Comment


                • #98
                  Originally posted by Quackdoc View Post
                  but many monitors allow you to trigger "HDR" modes using 8bit color
                  Dynamic range is 100% determined by how many bits per colour you have.
                  If you only have 8 bits per colour you are 100% constrained to 255 different reds, 255 different greens and 255 different blues.
                  Anything that claims to pack more than 255 steps range per colour into 8 bits is 100% marketing bullshit, of the perpetual motion machine variety.

                  Comment


                  • #99
                    Originally posted by mSparks View Post
                    Dynamic range is 100% determined by how many bits per colour you have.
                    If you only have 8 bits per colour you are 100% constrained to 255 different reds, 255 different greens and 255 different blues.
                    Anything that claims to pack more than 255 steps range per colour into 8 bits is 100% marketing bullshit, of the perpetual motion machine variety.
                    ok yeah sure you are right, you only have the 0-255 range of data to work with, but as I said, dynamic range has no meaning in isolation: you can still represent a good chunk of luminance variation with a mere rgb888. THAT can be considered high dynamic range. OR you could have a low luminance range but a large gamut range, and that would still be considered a large dynamic range.

                    you need to define what you are talking about; without that, it's just a bunch of mumbo jumbo that doesn't mean anything. you can apply an ST2084 transfer onto an 8-bit image; you are applying a "high dynamic range EOTF" onto the image. Does this make it an HDR image? no one knows! you can agree or disagree with it, but in the end it is opinion, because there is no specification for it!

                    google's UltraHDR image format is an 8-bit JPEG with a monochrome luminance enhancement image/layer. That does NOT make it a 10-bit image, but they claim it is an HDR image, because it can represent a greater range of luminance.
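For reference, the ST2084 transfer mentioned above is fully specified. A minimal sketch of its EOTF, with the constants as published in SMPTE ST 2084, maps a non-linear signal in [0, 1] to absolute luminance in nits:

```python
# Sketch of the ST 2084 (PQ) EOTF: non-linear signal E in [0, 1] ->
# absolute luminance in cd/m^2 (nits), up to 10,000.
# Constants are the ones published in SMPTE ST 2084.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(e: float) -> float:
    """ST 2084 EOTF: signal value [0, 1] -> luminance in nits."""
    ep = e ** (1 / m2)
    y = max(ep - c1, 0.0) / (c2 - c3 * ep)
    return 10000.0 * y ** (1 / m1)

print(pq_eotf(0.0), pq_eotf(0.5), pq_eotf(1.0))  # 0, ~92 nits, 10000
```

Note how half the code range maps to only ~92 nits: the curve allocates most code values to the dark end, which is why applying it to an 8-bit signal invites visible banding.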

                    Comment


                    • Originally posted by Quackdoc View Post

                      ok yeah sure you are right, you only have the 0-255 range of data to work with, but as I said, dynamic range has no meaning in isolation: you can still represent a good chunk of luminance variation with a mere rgb888. THAT can be considered high dynamic range. OR you could have a low luminance range but a large gamut range, and that would still be considered a large dynamic range.

                      you need to define what you are talking about; without that, it's just a bunch of mumbo jumbo that doesn't mean anything. you can apply an ST2084 transfer onto an 8-bit image; you are applying a "high dynamic range EOTF" onto the image. Does this make it an HDR image? no one knows! you can agree or disagree with it, but in the end it is opinion, because there is no specification for it!

                      google's UltraHDR image format is an 8-bit JPEG with a monochrome luminance enhancement image/layer. That does NOT make it a 10-bit image, but they claim it is an HDR image, because it can represent a greater range of luminance.
                      "Dynamic range (abbreviated DR, DNR,[1] or DYR[2]) is the ratio between the largest and smallest values that a certain quantity can assume."

                      8 bits, smallest value=0, largest value= 255

                      if you only have 8 bits, your dynamic range is 0:255.

                      calling any 0:255 dynamic range "high dynamic range" doesn't change the fact it is still exactly the same dynamic range as before. It's just marketing hyperbole, of the
                      [embedded video: Chris Ramsay YouTube channel]
                      variety.

                      Comment
