
AMD Sends In More "New Stuff" For Radeon Graphics With Linux 5.12


  • #11
    Originally posted by aspen View Post

    Many X apps just break if you try to use 10-bit color.
    Yeah, discovered this recently, quite sad. I'm not sure if it's a problem with GTK3 apps or X in general. I think Firefox is one of those apps that has issues. I should try 10-bit again with my new 6800 XT and see what happens (probably nothing good).



    • #12
      Originally posted by Zan Lynx View Post
      There is a difference between 10-bit color and HDR. It's the gamma curve, I believe it's called. With 10-bit SDR the brightness range is identical whether the scale runs 0-1023 or 0-255. But with HDR the brightness accelerates toward the edges of the color space.

      It may be even more confusing after reading https://en.wikipedia.org/wiki/High-dynamic-range_video

      But anyway, the 10-bit / 30-bit color available in X11 currently is 10-bit SDR.
      Yeah. The overwhelming majority of 10-bit content is not HDR.



      • #13
        Originally posted by flashmozzg View Post

        Yeah. The overwhelming majority of 10-bit content is not HDR.
        There are also very few true HDR monitors. Most are the HDR400/600 type, like my own.



        • #14
          Originally posted by pete910 View Post

          There are also very few true HDR monitors. Most are the HDR400/600 type, like my own.
          HDR400 monitors are a joke: they're basically SDR monitors that can accept an HDR signal, but the image usually looks worse than plain SDR. HDR600 is where it starts to make sense, and you can buy monitors that HDR looks really good on. 600 nits is already bright enough in a dark room, and HDR600 panels have to support multiple backlight dimming zones. My HDR600 monitor looks great for HDR gaming, but of course monitor quality varies, only some games have HDR, even fewer have a good implementation, and for some reason movies don't look as good as games.



          • #15
            Originally posted by Zan Lynx View Post
            There is a difference between 10-bit color and HDR. It's the gamma curve, I believe it's called. With 10-bit SDR the brightness range is identical whether the scale runs 0-1023 or 0-255. But with HDR the brightness accelerates toward the edges of the color space.

            It may be even more confusing after reading https://en.wikipedia.org/wiki/High-dynamic-range_video

            But anyway, the 10-bit / 30-bit color available in X11 currently is 10-bit SDR.
            You're kind of on the right track. I'm gonna back you up a bit. Each pixel has 5 channels:

            Alpha - How transparent it is.
            Gamma - Brightness
            R - Red
            G - Green
            B - Blue

            RGB make up the chroma channels.

            If you push up gamma and leave the chroma channels the same the picture will get brighter but washed out.
            If you push up the chroma channels but leave gamma the same the image will be more saturated but not brighter.
            If you push up both gamma and chroma then the picture retains its saturation while getting brighter.
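
            If you want to play with those two knobs, here's a toy Python sketch (my own illustration, treating the brightness channel as BT.709 luma; not how any real codec or compositor does it):

            ```python
            # Toy illustration: brightness vs. saturation as separate knobs.
            # Uses BT.709 luma coefficients; values are normalized 0.0-1.0.

            def rgb_to_ycbcr(r, g, b):
                y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
                cb = (b - y) / 1.8556      # chroma differences, roughly -0.5..0.5
                cr = (r - y) / 1.5748
                return y, cb, cr

            def ycbcr_to_rgb(y, cb, cr):
                r = y + 1.5748 * cr
                b = y + 1.8556 * cb
                g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
                return r, g, b

            def adjust(rgb, luma_gain=1.0, chroma_gain=1.0):
                y, cb, cr = rgb_to_ycbcr(*rgb)
                y  *= luma_gain            # "push up gamma": brighter, washed out
                cb *= chroma_gain          # "push up chroma": more saturated
                cr *= chroma_gain
                return ycbcr_to_rgb(y, cb, cr)

            pixel = (0.6, 0.3, 0.2)
            print(adjust(pixel, luma_gain=1.3))                   # brighter, looks washed out
            print(adjust(pixel, chroma_gain=1.3))                 # more saturated, same brightness
            print(adjust(pixel, luma_gain=1.3, chroma_gain=1.3))  # brighter, saturation preserved
            ```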

            Bit depth is how many steps there are in each of those channels: 8-bit is 256 and 10-bit is 1024. In the gamma channel, 0 is the darkest black and the highest number (255 or 1023) is the brightest light the display can produce. The benefit of greater bit depth is that it reduces banding. That doesn't mean some codec isn't going to come along and put banding in where there wasn't any before, but a good codec shouldn't. In any case, whether SDR or HDR, zero still means absolute black, or no saturation in a chroma channel; the highest number is pure white, or fully saturated if applied to one of the chroma channels.
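
            To put a number on the banding point, here's a quick sketch (mine, nothing to do with any particular codec) that quantizes a smooth ramp at 8 and 10 bits and counts the distinct levels:

            ```python
            # Quantize a smooth 0.0-1.0 gradient at different bit depths and see
            # how many distinct levels (potential "bands") each one produces.

            def quantize(value, bits):
                levels = (1 << bits) - 1        # 255 for 8-bit, 1023 for 10-bit
                return round(value * levels) / levels

            gradient = [i / 9999 for i in range(10000)]   # a smooth ramp

            for bits in (8, 10):
                steps = {quantize(v, bits) for v in gradient}
                print(f"{bits}-bit: {len(steps)} distinct levels, "
                      f"step size = {1 / ((1 << bits) - 1):.5f}")
            # 10-bit has 4x the levels of 8-bit, so each band is a quarter the size.
            ```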

            HDR is the range between the darkest black and the brightest white. For most displays, which follow the Rec.709 standard, the brightest is limited to 100 nits. When you master a video for HDR you have to specify how many nits your color grade is designed for: if your grading display can do 300 nits, you tell it 300 nits. That gets embedded in the metadata for the video, and the display uses that information to try to match the intention of the color grade to what the display can handle. If you master for 1000 nits and then send it to a 300-nit display, the display will try to remap the signal to make it fit into 300 nits.
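
            Here's a crude sketch of that remapping step. The soft-knee rolloff is my own simplification; real displays use their own proprietary tone-mapping curves driven by the mastering metadata:

            ```python
            # Crude tone mapping: content graded for 1000 nits shown on a 300-nit display.
            # Below a knee point the signal passes through; above it, highlights are
            # rolled off so they still fit under the display's peak.

            MASTER_PEAK  = 1000.0   # nits the grade was mastered for (from metadata)
            DISPLAY_PEAK = 300.0    # what the panel can actually do
            KNEE         = 0.75 * DISPLAY_PEAK   # start compressing here

            def tone_map(nits):
                if nits <= KNEE:
                    return nits                   # leave most of the image alone
                # Compress everything between KNEE and MASTER_PEAK into KNEE..DISPLAY_PEAK.
                t = (nits - KNEE) / (MASTER_PEAK - KNEE)
                return KNEE + (DISPLAY_PEAK - KNEE) * t

            for nits in (50, 200, 400, 800, 1000):
                print(f"{nits:>5} nits mastered -> {tone_map(nits):6.1f} nits displayed")
            ```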

            What you were describing was a log curve, which is used in cameras to let them capture more dynamic range by mapping values onto a longer log curve rather than a shorter linear one. Log footage looks flat and desaturated and the colors are jacked; they need to be remapped to be useful.
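
            If you're curious what a log curve actually does, here's a generic one (purely illustrative, not any vendor's real curve like S-Log or LogC). It spreads many stops of scene brightness over the 0..1 signal, which is why the footage looks flat until it's graded back:

            ```python
            import math

            # Generic log encode/decode: map scene-linear values spanning many stops
            # into a 0..1 signal. Real camera curves have their own constants and a
            # linear toe near black; this is just the general shape.

            STOPS = 14                      # dynamic range we want to squeeze in
            MAX_LINEAR = 2.0 ** STOPS       # brightest scene value relative to the floor

            def log_encode(linear):
                return math.log2(1.0 + linear) / math.log2(1.0 + MAX_LINEAR)

            def log_decode(signal):
                return (1.0 + MAX_LINEAR) ** signal - 1.0

            # Each doubling of scene brightness lands a similar distance apart in the
            # signal, instead of the top stop hogging half the code values.
            for stop in range(0, STOPS + 1, 2):
                linear = 2.0 ** stop
                print(f"scene {linear:9.1f}x -> signal {log_encode(linear):.3f}")
            ```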

            This guy has a channel that looks at various "HDR" movies to see how they were actually mastered, and he found a lot of "HDR" movies are what he calls SDR in an HDR container. Meaning: we took the same old shit we sold you last year and now we're selling it to you again this year, but we're calling it HDR because otherwise you wouldn't buy it again.
            (embedded YouTube link)


            One thing to keep in mind: if you have two displays with equal bit depth but one is SDR at 100 nits and the other is HDR at 1000 nits, the HDR display would actually show more banding even though it has more dynamic range, because the steps would be bigger.
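
            Rough numbers for that, assuming (as a simplification) the code values are spread linearly in nits; in reality HDR signals use the nonlinear PQ curve precisely to keep the visible step size down:

            ```python
            # Same bit depth, different peak brightness: how many nits per code value?
            # (Simplified: assumes linear spacing. Real HDR uses the PQ curve, which
            # spends more code values where the eye is most sensitive.)

            for label, peak_nits, bits in (("SDR", 100, 10), ("HDR", 1000, 10)):
                steps = (1 << bits) - 1
                print(f"{label}: {peak_nits} nits / {steps} steps = "
                      f"{peak_nits / steps:.2f} nits per step")
            # The 1000-nit panel's steps are 10x bigger, so with naive linear coding
            # it would band more, exactly as described above.
            ```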

            The final piece of the puzzle is the color range that a display can produce. This one I'm not even going to try to describe with words; a gamut graph shows what's at play. Suffice it to say that green needs a lot of work in most displays. The bulk of displays still struggle to get to sRGB.
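
            To put rough numbers behind that, here's a little sketch of mine comparing the area of a few common gamuts in CIE 1931 xy space, using the published primaries for each standard:

            ```python
            # Compare display gamuts by the area of their triangle in CIE 1931 xy space.

            GAMUTS = {
                "sRGB / Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
                "DCI-P3":         [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
                "Rec.2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
            }

            def triangle_area(points):
                (x1, y1), (x2, y2), (x3, y3) = points
                return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

            srgb = triangle_area(GAMUTS["sRGB / Rec.709"])
            for name, primaries in GAMUTS.items():
                area = triangle_area(primaries)
                print(f"{name:15s} area {area:.4f}  ({area / srgb:.2f}x sRGB)")
            # The wider gamuts gain most of their extra area out in the greens.
            ```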



            I hope that helps.

