Ubuntu's Prolific GNOME Developer Is Looking To Tackle Deep Color Support


  • bug77
    replied
    Originally posted by FireBurn View Post

    It does at 8-bit, but not at 10-bit with 4:4:4; with 4:2:2 there is enough bandwidth (if you'd call that compression). I was hoping for a kernel parameter to force 4:2:2, to allow 4K@60Hz with 10-bit colour per channel from boot.
    Ok, that makes more sense.
    Setting 10bpc is done in the driver; you'd think that the moment the driver sets it, it would talk to the monitor to figure out whether it needs compression or not. Or, at the very least, apply whatever compression the user has set.
    Me, I'm just trying to figure out how to set 10bpc, but the monitor is out for service (came with dead pixels) at the moment.
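    For what it's worth, here's a minimal sketch of the usual approach under X (untested here; "Screen0" and HDMI-A-0 are placeholder names that will differ per setup):

        # xorg.conf (or an xorg.conf.d drop-in): request a 30-bit framebuffer.
        # The Identifier should match your existing Screen section.
        Section "Screen"
            Identifier   "Screen0"
            DefaultDepth 30
        EndSection

        # Then raise the connector's "max bpc" limit so the driver can
        # negotiate 10 bpc with the display ("max bpc" is the DRM connector
        # property that amdgpu/i915 expose; the output name is an example):
        xrandr --output HDMI-A-0 --set "max bpc" 10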

  • FireBurn
    replied
    Originally posted by bug77 View Post

    HDMI 2.0 doesn't have the bandwidth to handle 4k@60fps without compression. No kernel parameter will fix that.
    It does at 8-bit, but not at 10-bit with 4:4:4; with 4:2:2 there is enough bandwidth (if you'd call that compression). I was hoping for a kernel parameter to force 4:2:2, to allow 4K@60Hz with 10-bit colour per channel from boot.
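    For the curious, a rough sanity check of those numbers (a sketch, assuming the standard CTA-861 4K60 timing with a 594 MHz pixel clock, HDMI 2.0's 600 MHz TMDS ceiling, and HDMI's rule that 4:2:2 is carried in a fixed 24-bit container up to 12-bit, so its clock doesn't scale with bit depth):

        # Rough HDMI 2.0 bandwidth check for 4K@60Hz at various pixel formats.
        # Assumptions: CTA-861 4K60 timing (594 MHz pixel clock) and HDMI 2.0's
        # 600 MHz TMDS character-rate ceiling. RGB/4:4:4 deep color scales the
        # TMDS clock by bpc/8; YCbCr 4:2:2 uses a fixed 24-bit container.
        PIXEL_CLOCK_MHZ = 594
        TMDS_LIMIT_MHZ = 600

        def tmds_clock_mhz(bpc, subsampling):
            """Required TMDS character rate for the given bit depth and subsampling."""
            if subsampling == "4:2:2":
                return PIXEL_CLOCK_MHZ          # clock independent of bit depth
            return PIXEL_CLOCK_MHZ * bpc / 8    # RGB or YCbCr 4:4:4

        for bpc, fmt in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2")]:
            clock = tmds_clock_mhz(bpc, fmt)
            verdict = "fits" if clock <= TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
            print(f"{bpc}-bit {fmt}: {clock:.1f} MHz -> {verdict}")

    That lines up: 8-bit 4:4:4 and 10-bit 4:2:2 both land at 594 MHz, while 10-bit 4:4:4 needs 742.5 MHz, which is why the mode drops to 30Hz instead.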

  • bug77
    replied
    Originally posted by FireBurn View Post

    If my TV had DisplayPort I'd already be using it. I'm guessing there will eventually be a kernel parameter to control it.
    HDMI 2.0 doesn't have the bandwidth to handle 4k@60fps without compression. No kernel parameter will fix that.

  • FireBurn
    replied
    Originally posted by bug77 View Post

    Probably when you move off HDMI 2.0
    If my TV had DisplayPort I'd already be using it. I'm guessing there will eventually be a kernel parameter to control it.

  • bug77
    replied
    Originally posted by FireBurn View Post
    When AMDGPU first flipped on 30-bit I stopped being able to display 4K@60Hz; it dropped down to 30Hz - something to do with 4:4:4 vs 4:2:2 (or something)

    Any idea if they'll be fixing that?
    Probably when you move off HDMI 2.0

  • mroche
    replied
    Originally posted by Unklejoe View Post

    I agree, but it should be noted that those are three unrelated properties. You can have normal gamut SDR 10 bit video, wide gamut SDR 10 bit, wide gamut 8 bit, etc.

    That said, the actual HDR standards (HDR10, HDR10+, etc.) do imply a certain pixel bit depth, color gamut, and dynamic range, so that's the best way to describe it if you ask me.
    Yes, and I didn’t mean to imply that those terms are tied to bit depth. I was touching more on the terms you’d hear people using colloquially in this space.

    But I wholeheartedly agree: specifying the protocol would be more apt when discussing HDR-related topics, given its encompassing nature.

    Cheers,
    Mike

  • FireBurn
    replied
    When AMDGPU first flipped on 30-bit I stopped being able to display 4K@60Hz; it dropped down to 30Hz - something to do with 4:4:4 vs 4:2:2 (or something)

    Any idea if they'll be fixing that?

  • emblemparade
    replied
    Originally posted by bug77 View Post
    Edit: Also, one guy starting experimental work for one DE does not mean "we are getting very close". Not on Linux.
    I dunno, it seems a small leap from 10 BPC to HDR. If you allow for 10 BPC, you must also be allowing for color formats other than RGB 888, so at that point any program would be able to output anything compatible with HDR standards. For example, VLC or mpv could take a BT.2020-encoded HDR video and simply render it to a surface as YCbCr.

    I'm sure it would be very rocky at first, as you would need to switch between HDR video mode and SDR desktop mode for something like that. Windows 10 had the same problems early on with HDR. But at this point Windows 10 does let you switch to "HDR mode" where it can emulate SDR content. This lets you still use your desktop as usual (well, with some color weirdness due to emulation) while also enjoying HDR content. I'm sure we could come up with a similar solution in Linux.
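    As a rough illustration of the player side (hypothetical until the compositor actually passes such surfaces through untouched; the mpv options themselves exist today):

        # Ask mpv to output in the video's own HDR space instead of
        # tone-mapping down to SDR. Only meaningful once the rest of the
        # pipeline (compositor, driver, display) forwards it unchanged.
        mpv --vo=gpu --target-prim=bt.2020 --target-trc=pq video.mkv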

  • bug77
    replied
    Originally posted by emblemparade View Post

    That's simply wrong. You can have 10 BPC or even 12 BPC and not be HDR. One feature is not directly related to the other (8 BPC HDR would probably look awful, but that's a different issue).

    I do wonder, though, if this feature is only for 10 BPC RGB. Could GNOME/Mutter also support YCbCr encodings (4:4:4, 4:2:2, 4:2:0) for those channels? It seems we are getting very close to HDR support in Linux!
    Ha, you made the same mistake I did on my first reading. He's not arguing that HDR is the same as 10bpc, but rather that "HDR" and "10bpc" are the proper terms for "deep color" and "30-bit", respectively.
    Could have worded that better, apparently.

    Edit: Also, one guy starting experimental work for one DE does not mean "we are getting very close". Not on Linux.

  • emblemparade
    replied
    Originally posted by miabrahams View Post
    I believe the more commonplace terminology for this is “HDR” and “10-bit” color instead of “deep” and “30 bit.”
    That's simply wrong. You can have 10 BPC or even 12 BPC and not be HDR. One feature is not directly related to the other (8 BPC HDR would probably look awful, but that's a different issue).

    I do wonder, though, if this feature is only for 10 BPC RGB. Could GNOME/Mutter also support YCbCr encodings (4:4:4, 4:2:2, 4:2:0) for those channels? It seems we are getting very close to HDR support in Linux!
