
AMD Sends In More "New Stuff" For Radeon Graphics With Linux 5.12


  • MadeUpName
    replied
    Originally posted by Zan Lynx View Post
    There is a difference between 10-bit color and HDR. It's the gamma curve (the transfer function), I believe it is called. With 10-bit SDR the brightness curve over 0-1023 is the same as the one over 0-255; you just get finer steps. But with HDR the brightness accelerates toward the top of the range.

    It may be even more confusing after reading https://en.wikipedia.org/wiki/High-dynamic-range_video

    But anyway, the 10-bit / 30-bit color available in X11 currently is 10-bit SDR.
    You're kind of on the right track, but I'm going to back up a bit. Each pixel has five channels:

    Alpha - How transparent it is.
    Gamma - Brightness
    R - Red
    G - Green
    B - Blue

    RGB make up the chroma channels.

    If you push up gamma and leave the chroma channels the same, the picture gets brighter but washed out.
    If you push up the chroma channels but leave gamma the same, the image gets more saturated but not brighter.
    If you push up both gamma and chroma, the picture retains its saturation while getting brighter (see the sketch below).
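
    As a rough sketch of that idea (purely illustrative: BT.709 luma weights stand in for the "gamma" channel and two simple color-difference terms stand in for chroma; this is not how any particular video pipeline implements it):

    # Toy luma/chroma split, in Python, to illustrate the three cases above.
    # "Luma" plays the role of the gamma/brightness channel in the description;
    # the two color-difference terms play the role of the chroma channels.

    def rgb_to_ycc(r, g, b):
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma weights
        return y, b - y, r - y                      # luma, Cb-like, Cr-like

    def ycc_to_rgb(y, cb, cr):
        r = y + cr
        b = y + cb
        g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
        return r, g, b

    pixel = (0.6, 0.3, 0.2)                         # a muted orange
    y, cb, cr = rgb_to_ycc(*pixel)

    print("brighter, washed out:", ycc_to_rgb(y * 1.3, cb, cr))              # luma up only
    print("more saturated:      ", ycc_to_rgb(y, cb * 1.3, cr * 1.3))        # chroma up only
    print("brighter, same look: ", ycc_to_rgb(y * 1.3, cb * 1.3, cr * 1.3))  # both up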

    Bit depth is how many steps there are in each of those channels: 8-bit gives 256 and 10-bit gives 1024. In the gamma channel, 0 is the darkest black and the highest value (255 or 1023) is the brightest light the display can produce. The benefit of greater bit depth is that it reduces banding. That doesn't mean some codec won't come along and put banding in where there wasn't any before, but a good codec shouldn't. In any case, whether SDR or HDR, zero still means absolute black, or no saturation in a chroma channel, and the highest value means pure white, or full saturation when applied to one of the chroma channels.

    HDR is the range between the darkest black and the brightest white. For most displays, which follow the Rec. 709 standard, the brightest white is limited to 100 nits. When you master a video for HDR you have to specify how many nits your color grade is designed for: if your grading display can do 300 nits, you tell it 300 nits. That gets embedded in the video's metadata, and the display uses that information to try to match the intention of the color grade to what the display can handle. If you master for 1000 nits and then send it to a 300-nit display, the display will try to remap the signal to fit into 300 nits.
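
    To make that remapping concrete, here is a minimal Python sketch. It is purely illustrative (a simple "knee" roll-off, not any particular standard's tone-mapping operator, and the 1000/300-nit figures are just the example numbers above):

    # Remap luminance mastered for a 1000-nit grade onto a 300-nit display.
    # Values below a chosen knee pass through unchanged; everything above the
    # knee is compressed into the display's remaining headroom.

    def tone_map(nits, mastered_peak=1000.0, display_peak=300.0, knee=0.75):
        knee_nits = knee * display_peak            # start rolling off at 225 nits
        if nits <= knee_nits:
            return nits                            # shadows and midtones untouched
        span_in = mastered_peak - knee_nits        # range left in the grade
        span_out = display_peak - knee_nits        # headroom left on the display
        return knee_nits + (nits - knee_nits) / span_in * span_out

    for n in (50, 100, 300, 600, 1000):
        print(f"{n:5d} nits in the grade -> {tone_map(n):6.1f} nits on the display")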

    What you were describing is a log curve, which is used in cameras to let them capture more dynamic range by mapping values onto a longer log curve rather than a shorter linear one. Log footage looks flat and desaturated and the colors are off; it needs to be remapped before it is useful.

    This guy has a channel that looks at various "HDR" movies to see how they were actually mastered, and he found that a lot of "HDR" movies are what he calls SDR in an HDR container. Meaning: we took the same old shit we sold you last year and we are selling it to you again this year, but we are calling it HDR, because otherwise you wouldn't buy it again.


    One thing to keep in mind: if you have two displays with equal bit depth, but one is SDR at 100 nits and the other is HDR at 1000 nits, the HDR display would actually show more banding even though it has more dynamic range, because each step covers a bigger brightness range.
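
    A quick back-of-the-envelope calculation in Python makes that concrete. It assumes, unrealistically, that code values are spread linearly over the brightness range; real HDR signals use a perceptual curve (PQ), which spaces the steps unevenly on purpose, so treat this only as an illustration of the averages:

    # Average step size in nits for an SDR 100-nit panel and an HDR 1000-nit
    # panel at the same bit depth, assuming a linear spread of code values.

    def avg_step_nits(peak_nits, bits):
        codes = 2 ** bits               # 256 codes at 8-bit, 1024 at 10-bit
        return peak_nits / (codes - 1)  # nits covered by one code-value step

    for peak in (100, 1000):
        for bits in (8, 10):
            print(f"{peak:4d}-nit display, {bits:2d}-bit: "
                  f"{avg_step_nits(peak, bits):.3f} nits per step")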

    The final piece of the puzzle is the color gamut a display can produce. This one I am not even going to try to describe in words; a chromaticity diagram shows what is at play. Suffice it to say that green needs a lot of work in most displays, and the bulk of displays still struggle to cover sRGB.



    I hope that helps.



  • JacekJagosz
    replied
    Originally posted by pete910 View Post

    There are also very few true HDR monitors. Most are the HDR400/600 type like my own.
    HDR400 monitors are a joke; they are basically SDR monitors that can accept an HDR signal, but the image will look worse than plain SDR. HDR600 is where it starts to make sense, and you can buy monitors that HDR looks really good on: 600 nits is already bright enough in a dark room, and they have to support multiple backlight dimming zones. My HDR600 monitor looks great for HDR gaming, but of course monitor quality varies, only some games have HDR and even fewer have a good implementation, and for some reason movies don't look as good as games.



  • pete910
    replied
    Originally posted by flashmozzg View Post

    Yeah. The overwhelming majority of 10-bit content is not HDR.
    There are also very few true HDR monitors. Most are the HDR400/600 type like my own.



  • flashmozzg
    replied
    Originally posted by Zan Lynx View Post
    There is a difference between 10-bit color and HDR. It's the gamma curve (the transfer function), I believe it is called. With 10-bit SDR the brightness curve over 0-1023 is the same as the one over 0-255; you just get finer steps. But with HDR the brightness accelerates toward the top of the range.

    It may be even more confusing after reading https://en.wikipedia.org/wiki/High-dynamic-range_video

    But anyway, the 10-bit / 30-bit color available in X11 currently is 10-bit SDR.
    Yeah. The overwhelming majority of 10-bit content is not HDR.



  • theriddick
    replied
    Originally posted by aspen View Post

    Many X apps just break if you try to use 10-bit color.
    Yeah, I discovered this recently, quite sad. I'm not sure if it's a problem with GTK3 apps or X in general. I think Firefox is one of the apps that has issues. I should try 10-bit again with my new 6800 XT and see what happens (probably nothing good).



  • Zan Lynx
    replied
    There is a difference between 10-bit color and HDR. It's the gamma curve (the transfer function), I believe it is called. With 10-bit SDR the brightness curve over 0-1023 is the same as the one over 0-255; you just get finer steps. But with HDR the brightness accelerates toward the top of the range.

    It may be even more confusing after reading https://en.wikipedia.org/wiki/High-dynamic-range_video

    But anyway, the 10-bit / 30-bit color available in X11 currently is 10-bit SDR.
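
    For the curious, the difference being described here is the transfer function. Below is a rough Python sketch comparing a plain 2.4 power-law gamma scaled to 100 nits (a simplification of what SDR displays do) with the HDR PQ curve, using the published SMPTE ST 2084 constants:

    # How a normalized 10-bit code value maps to brightness (nits) under a
    # simple SDR gamma curve versus the HDR PQ (SMPTE ST 2084) curve.

    M1 = 2610 / 16384          # PQ constants from SMPTE ST 2084
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def sdr_nits(v, peak=100.0, gamma=2.4):
        # Simplified SDR: plain power-law gamma scaled to a 100-nit display.
        return peak * v ** gamma

    def pq_nits(v):
        # PQ EOTF: normalized code value -> absolute luminance, up to 10000 nits.
        p = v ** (1 / M2)
        return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

    for code in (0, 256, 512, 768, 1023):   # a few 10-bit code values
        v = code / 1023
        print(f"code {code:4d}: SDR {sdr_nits(v):8.2f} nits, PQ {pq_nits(v):8.2f} nits")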



  • stargeizer
    replied
    Unfortunately, driver support is only part of the puzzle. The desktop manager and the underlying GUI stack (e.g. xorg and GTK, or wayland and GTK) also need to be updated for HDR support. I believe Wayland support is in development; there is no formal xorg support for it, and at least one xorg developer has said they would prefer to work on HDR support in Wayland rather than add it to xorg.

    There is certain software that bypasses all of that and accesses the hardware directly to provide HDR (most notably Kodi), but results may vary.



  • Ademeion
    replied
    Originally posted by cb88 View Post

    HDR in X is an already-proposed X extension, and apparently there is already some GTK support for that... it is likely that support for X and Wayland will happen around the same time, since the underlying guts are the same.

    Also try this:

    Creating /etc/X11/xorg.conf.d/30-screensetup.conf with the following content will give you 10-bit color:

    Section "Screen"
        Identifier "Default Screen"
        Monitor "Configured Monitor"
        Device "Configured Video Device"
        # 24 for 8-bit or 30 for 10-bit
        DefaultDepth 30
    EndSection

    Note a lot of stuff is still broken with this.
    Thanks for the tip. I don't actually have an HDR monitor yet, because I have delayed the purchase until I know for sure that HDR works. Your comment suggests that we are at least almost there. I'll listen to other people's experiences, and perhaps dive in soon myself.
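
    For what it's worth, once a depth-30 configuration like the one quoted above is in place, one quick sanity check is to ask the running X server which depth it actually ended up with. A small sketch, assuming the third-party python-xlib package is installed:

    # Query the running X server's root window depth.
    # 24 means 8 bits per channel; 30 means 10 bits per channel.
    # Requires python-xlib (pip install python-xlib).
    from Xlib import display

    screen = display.Display().screen()
    print("Root window depth:", screen.root_depth)
    if screen.root_depth >= 30:
        print("Depth 30 is active, so 10-bit color is available.")
    else:
        print("Still running at 8 bits per channel (depth 24).")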

    I wonder if HDR working on Linux depends on the monitor model. Other comments suggest that it doesn't yet work for everyone, so perhaps some sort of technical compatibility (beyond the obvious general HDR requirements) is an issue. I want to be sure before buying a new monitor that HDR on Linux works with that specific model. I have limited resources, so unfortunately there's no room for trial and error with this purchase.



  • skeevy420
    replied
    Originally posted by cb88 View Post

    HDR in X is an already-proposed X extension, and apparently there is already some GTK support for that... it is likely that support for X and Wayland will happen around the same time, since the underlying guts are the same.

    Also try this:

    Creating /etc/X11/xorg.conf.d/30-screensetup.conf with the following content will give you 10-bit color:

    Section "Screen"
        Identifier "Default Screen"
        Monitor "Configured Monitor"
        Device "Configured Video Device"
        # 24 for 8-bit or 30 for 10-bit
        DefaultDepth 30
    EndSection

    Note a lot of stuff is still broken with this.
    I tried that on day one of getting my monitor and it didn't go very well.

    Thanks for the suggestion, though.



  • aspen
    replied
    Originally posted by cb88 View Post

    HDR in X is an already-proposed X extension, and apparently there is already some GTK support for that... it is likely that support for X and Wayland will happen around the same time, since the underlying guts are the same.

    Also try this:

    Creating /etc/X11/xorg.conf.d/30-screensetup.conf with the following content will give you 10-bit color:

    Section "Screen"
        Identifier "Default Screen"
        Monitor "Configured Monitor"
        Device "Configured Video Device"
        # 24 for 8-bit or 30 for 10-bit
        DefaultDepth 30
    EndSection
    Many X apps just break if you try to use 10-bit color.

