Linux 5.3 To Enable HDR Metadata Support For AMDGPU Driver


  • #11
    Originally posted by jernej View Post

    Where did you get that? HDMI 2.0 supports 2160p@60Hz with YUV 4:4:4 or RGB using a 594 MHz base clock and a few other tricks like scrambling to lower the chance of data corruption during transfer. I have a cheap ARM board which supports that. Even the wiki says it's supported (note 24 bits per pixel).
    Not with HDR, which is what the context was. HDMI 2.0 is specced at a maximum of 18 Gbps, and 2160p/4K at 60 Hz with 4:4:4 requires more bandwidth than that for HDR.

    When you use 24 bits per pixel you only get 8 bits per color; for HDR you use 10 or sometimes 12.
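For what it's worth, here is a rough sanity check of that bandwidth argument as a small C sketch. It assumes the standard CTA-861 2160p60 timing (594 MHz pixel clock) and the HDMI 2.0 TMDS ceiling of 600 MHz; none of the numbers below come from the article itself.

```c
/* Back-of-the-envelope check: required TMDS rate for 2160p60 4:4:4/RGB at
 * different color depths, against the HDMI 2.0 limit. Assumes the standard
 * CTA-861 4K60 timing (594 MHz pixel clock, 4400x2250 total). For 4:4:4 the
 * TMDS character rate scales with bits per color: rate = pixel_clock * bpc/8.
 */
#include <stdio.h>

int main(void)
{
    const double pixel_clock_mhz = 594.0;  /* 3840x2160@60, CTA-861 timing */
    const double hdmi20_limit_mhz = 600.0; /* max TMDS character rate in HDMI 2.0 */
    const int depths[] = { 8, 10, 12 };

    for (int i = 0; i < 3; i++) {
        double tmds_mhz = pixel_clock_mhz * depths[i] / 8.0;
        /* 3 TMDS channels, 10 bits on the wire per 8 data bits */
        double gbps = tmds_mhz * 3.0 * 10.0 / 1000.0;
        printf("%2d bpc 4:4:4: TMDS %.1f MHz (%.2f Gbps raw) -> %s\n",
               depths[i], tmds_mhz, gbps,
               tmds_mhz <= hdmi20_limit_mhz ? "fits HDMI 2.0" : "exceeds HDMI 2.0");
    }
    return 0;
}
```

This prints 8 bpc just fitting (594 MHz, 17.82 Gbps) while 10 and 12 bpc blow past the 600 MHz / 18 Gbps ceiling, which is why HDMI 2.0 sinks typically drop to 4:2:2 or 4:2:0 for 10-bit 4K60.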



    • #12
      Originally posted by F.Ultra View Post
      When you use 24 bits per pixel you only get 8 bits per color; for HDR you use 10 or sometimes 12.
      Well, while it's weird not to use 10-bit (or more) for HDR, it's certainly not required. I'm working on VPU and DRM drivers and currently my TV correctly shows 8-bit 4K HDR videos.
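That also matches how the Linux 5.3 work is plumbed: the HDR signal travels in the CTA-861 Dynamic Range and Mastering InfoFrame, which userspace drives through the new HDR_OUTPUT_METADATA connector property, independently of whether the plane is scanning out 8- or 10-bit pixels. Below is a minimal, hypothetical libdrm sketch of that path; the device node, the connector id (42), the find_prop_id helper and the mastering-display numbers are placeholders of mine, not values from the article or this thread, and error handling is omitted.

```c
/* Hypothetical userspace sketch: hand static HDR metadata to the kernel via
 * the HDR_OUTPUT_METADATA connector property added in Linux 5.3. Needs DRM
 * uapi/libdrm headers new enough to carry struct hdr_output_metadata.
 * Connector id, device node and luminance values below are placeholders.
 */
#include <fcntl.h>
#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>   /* pulls in the DRM uapi drm_mode.h */

/* Helper (not a libdrm API): look up a connector property id by name. */
static uint32_t find_prop_id(int fd, uint32_t conn_id, const char *name)
{
    drmModeObjectProperties *props =
        drmModeObjectGetProperties(fd, conn_id, DRM_MODE_OBJECT_CONNECTOR);
    uint32_t id = 0;

    for (uint32_t i = 0; props && i < props->count_props; i++) {
        drmModePropertyRes *p = drmModeGetProperty(fd, props->props[i]);
        if (p && strcmp(p->name, name) == 0)
            id = p->prop_id;
        drmModeFreeProperty(p);
    }
    drmModeFreeObjectProperties(props);
    return id;
}

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);   /* placeholder device node */
    uint32_t conn_id = 42;                     /* placeholder connector id */
    struct hdr_output_metadata meta = { 0 };
    uint32_t blob_id = 0;

    drmSetClientCap(fd, DRM_CLIENT_CAP_ATOMIC, 1);

    meta.metadata_type = 0;                    /* HDMI static metadata type 1 */
    meta.hdmi_metadata_type1.eotf = 2;         /* SMPTE ST 2084 (PQ) */
    meta.hdmi_metadata_type1.metadata_type = 0;
    meta.hdmi_metadata_type1.max_display_mastering_luminance = 1000; /* cd/m2 */
    meta.hdmi_metadata_type1.min_display_mastering_luminance = 50;   /* 0.0001 cd/m2 units */
    meta.hdmi_metadata_type1.max_cll = 1000;
    meta.hdmi_metadata_type1.max_fall = 400;

    drmModeCreatePropertyBlob(fd, &meta, sizeof(meta), &blob_id);

    drmModeAtomicReq *req = drmModeAtomicAlloc();
    drmModeAtomicAddProperty(req, conn_id,
                             find_prop_id(fd, conn_id, "HDR_OUTPUT_METADATA"),
                             blob_id);
    /* The sink is told to enter HDR mode regardless of the scanout bit depth. */
    drmModeAtomicCommit(fd, req, DRM_MODE_ATOMIC_ALLOW_MODESET, NULL);
    drmModeAtomicFree(req);
    return 0;
}
```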



      • #13
        Originally posted by jernej View Post

        Well, while it's weird not to use 10-bit (or more) for HDR, it's certainly not required. I'm working on VPU and DRM drivers and currently my TV correctly shows 8-bit 4K HDR videos.
        Of course it's required (and it's required by every single HDR standard that I've ever seen), since the higher bit depth is exactly what enables the High Dynamic Range over the Standard Dynamic Range. How else do you propose the dynamic range gets higher than standard when it's the bits per color that define the dynamic range?

        Perhaps your TV set supports ITM (inverse tone mapping) or a similar technology where it simply upconverts SDR to HDR, like this: https://www.cnet.com/news/technicolo...ideo-into-hdr/
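For a concrete feel of what the extra bits buy in practice, here is a small sketch using the published SMPTE ST 2084 (PQ) constants: it prints the luminance jump between adjacent code values near 1000 cd/m2 for 8-bit versus 10-bit quantization. The much coarser 8-bit steps are where the visible banding in 8-bit HDR material comes from; the 1000 cd/m2 probe point is an arbitrary choice for illustration.

```c
/* Luminance step size between adjacent code values under the SMPTE ST 2084
 * (PQ) EOTF, comparing 8-bit and 10-bit quantization. Link with -lm. */
#include <stdio.h>
#include <math.h>

/* PQ EOTF: normalized code value in [0,1] -> absolute luminance in cd/m2 */
static double pq_eotf(double v)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    double p = pow(v, 1.0 / m2);
    return 10000.0 * pow(fmax(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
}

/* Print the luminance jump between the two code values straddling a target. */
static void step_around(int bits, double nits_target)
{
    int levels = (1 << bits) - 1;
    int code = 0;

    while (code < levels && pq_eotf((double)(code + 1) / levels) < nits_target)
        code++;
    double lo = pq_eotf((double)code / levels);
    double hi = pq_eotf((double)(code + 1) / levels);
    printf("%2d-bit: around %.0f cd/m2 the next code value jumps %.2f -> %.2f (step %.2f cd/m2)\n",
           bits, nits_target, lo, hi, hi - lo);
}

int main(void)
{
    step_around(8, 1000.0);
    step_around(10, 1000.0);
    return 0;
}
```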

