AMD Continues Working Toward HDR Display Support For The Linux Desktop


  • #11
    Originally posted by brucethemoose View Post
    I get why the Linux community has ignored HDR for so long, though. The 2 big applications are Windows games and video streams/Blu-rays with DRM. Though HDR mastering would be useful in stuff like Blender, I don't think workstation demand is huge.
    Wouldn't proper HDR support be useful for content creation (more than just Blender)? Isn't that an important market for AMD et al.?



    • #12
      Originally posted by Anux View Post
      I've been playing HDR games in a Win VM for years (search for VGA passthrough), and someone on Phoronix mentioned that mpv can play full-screen HDR under Linux (not sure about the hardware and software specifics).
      The base problem is that GPUs are unlike CPUs. The behavior of a CPU is well documented, it has virtual memory, and execution happens in small steps (instructions) - creating a CPU emulator is straightforward. On the other hand, creating a GPU emulator is nearly impossible. VGA passthrough is just a way of saying "Look, it is impossible to emulate a GPU, so the only choice people currently have is to give up completely on the idea of GPU emulation". If GPUs were as well-defined and as easily virtualizable as CPUs, the idea of VGA passthrough wouldn't even occur to people.
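      For anyone curious what the passthrough setup mentioned above actually looks like, here is a rough sketch of the usual VFIO recipe. The PCI IDs and addresses are placeholders (yours will differ; find them with `lspci -nn`), and distro specifics vary:

```shell
# 1. Enable the IOMMU on the kernel command line (e.g. in GRUB_CMDLINE_LINUX):
#      amd_iommu=on iommu=pt        # or intel_iommu=on on Intel hosts

# 2. Bind the guest GPU and its HDMI audio function to vfio-pci at boot,
#    e.g. in /etc/modprobe.d/vfio.conf (vendor:device IDs are placeholders):
#      options vfio-pci ids=1002:731f,1002:ab38

# 3. Hand the device to the VM with QEMU/KVM (PCI addresses are placeholders):
qemu-system-x86_64 \
  -enable-kvm -machine q35 -cpu host -m 8G \
  -device vfio-pci,host=0000:0b:00.0 \
  -device vfio-pci,host=0000:0b:00.1 \
  windows.img
```

      The guest then drives the card with the vendor's own Windows driver, which is why HDR "just works" there while the host never touches the GPU.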



      • #13
        Originally posted by atmartens View Post
        Wouldn't proper HDR support be useful for content creation (more than just Blender)? Isn't that an important market for AMD et al.?
        Honestly, it seems like most content creators don't really care, or don't have the tools to create/encode/test it. The only functional HDR content I see is from large studios.
        Last edited by brucethemoose; 06 October 2022, 03:21 PM.



        • #14
          I read The sad, misleading, and embarrassing state of HDR in PC gaming just this week, which explains that it's a mess even on Windows for a variety of reasons. I looked for this article because I was surprised to find that nearly every monitor review I read said the HDR support was terrible. I went for the ViewSonic XG270QC in the end, which is apparently better than most. I look forward to receiving it (in 2 weeks!) even though I won't be able to enjoy the HDR for a little while yet.



          • #15
            Originally posted by Chugworth View Post
            I can't really know for sure if I'm viewing HDR content unless its menu specifically says that it is.
            Huh? HDR is obvious to see; no need for a menu to tell you. It sounds more like your monitor doesn't have real HDR capabilities.

            Originally posted by atmartens View Post
            Wouldn't proper HDR support be useful for content creation (more than just Blender)? Isn't that an important market for AMD et al.?
            Certainly, everything graphics-related would profit from HDR: GIMP, RawTherapee, video stuff, ...

            Originally posted by atomsymbol View Post
            The base problem is that GPUs are unlike CPUs. The behavior of a CPU is well documented, it has virtual memory, and execution happens in small steps (instructions) - creating a CPU emulator is straightforward. On the other hand, creating a GPU emulator is nearly impossible. VGA passthrough is just a way of saying "Look, it is impossible to emulate a GPU, so the only choice people currently have is to give up completely on the idea of GPU emulation". If GPUs were as well-defined and as easily virtualizable as CPUs, the idea of VGA passthrough wouldn't even occur to people.
            No, KVM is CPU passthrough; compare it with QEMU's pure CPU emulation and there's a world of difference. Sadly, consumer GPUs don't support virtualization, so we have to pass the whole GPU through and need a second one for the host. GPU emulation might even work better if we look at stuff like DXVK, and I think there is a project for OpenGL emulation in QEMU (anyone remember the name?).
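            The KVM-versus-emulation contrast drawn here is visible directly in QEMU's accelerator flags; a minimal illustration (disk.img is a placeholder guest image):

```shell
# KVM: guest instructions run natively on the host CPU ("CPU passthrough"),
# near-native speed
qemu-system-x86_64 -accel kvm -cpu host -m 4G disk.img

# TCG: every guest instruction is translated in software -- the full-emulation
# path, an order of magnitude slower
qemu-system-x86_64 -accel tcg -m 4G disk.img
```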



            • #16
              Originally posted by Anux View Post
              GPU emulation might even work better if we look at stuff like DXVK, and I think there is a project for OpenGL emulation in QEMU (anyone remember the name?).
              VirGL is the OpenGL project IIRC, and Venus is the corresponding Vulkan project.

              https://www.collabora.com/news-and-b...vulkan-driver/
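              For reference, enabling VirGL in QEMU is roughly this (flags assume QEMU 6.1 or newer; older versions use -device virtio-vga,virgl=on instead, and disk.img is a placeholder guest image):

```shell
# Paravirtualized 3D: the guest's virtio-gpu driver forwards OpenGL calls to
# the host instead of QEMU emulating real GPU hardware
qemu-system-x86_64 \
  -enable-kvm -m 4G \
  -device virtio-vga-gl \
  -display gtk,gl=on \
  disk.img
```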



              • #17
                Originally posted by Anux View Post
                Huh? HDR is obvious to see; no need for a menu to tell you. It sounds more like your monitor doesn't have real HDR capabilities.
                Oh, it's a fairly recent TCL television that supports HDR and Dolby Vision. Like I said, its menu tells you whether the content is HDR or Dolby Vision. But right now I can open an HDR video and.... WOOOOW, look at the colors! Look at the brightness! Oh wait, I'm still in Linux.

                Sure there is a difference, but unless you have two identical displays sitting side-by-side with one playing in HDR and one not, it's not so apparent just how much you're missing or gaining. I have watched movies in HDR, then later watched the same movies without HDR and really didn't feel like I was missing much. Perhaps it's just that much of the current content that calls itself HDR content isn't really taking full advantage of it.



                • #18
                  Originally posted by Anux View Post
                  No, KVM is CPU passthrough, if you compare it with qemus CPU emulation there are worlds between it. Sadly consumer GPUs don't support virtualization and therefore we have to fully pass through the GPU and need a second one for the host. GPU emulation might even work better if we look at stuff like DXVK and I think there is a project for OpenGL emulation in qemu (anyone remember the name?).
                  It's technically not as difficult as most think. In fact, you would need to emulate a PCIe device (a GPU), which is relatively easy. But you would also need a Windows kernel driver and a Windows user-space driver for it, and I don't believe that M$ would sign an open-source driver, as kernel (graphics) drivers - at least one version of them - must be signed. Well, if Microsoft did sign it, you would then have to pass commands to the virtual GPU and translate them to Vulkan/OpenGL. With Linux guests we already have this.

                  The complexity of it on the technical side is nothing compared to quantum mechanics.
                  Last edited by SvenK; 06 October 2022, 06:38 PM.



                  • #19
                    If VA-API/ffmpeg let me stream over OBS with a better result than 1 frame every 6 seconds, I'd already consider that a win.
                    And if my screen didn't turn black as soon as I modified the xorg.conf files to run with VRR, that too...
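                    For comparison, a working VA-API hardware encode with plain ffmpeg looks like this (the render node path and supported codecs vary by GPU and driver; input.mkv is a placeholder):

```shell
# H.264 encode on the GPU via VA-API: upload frames to the hardware surface
# with the hwupload filter, then encode with h264_vaapi
ffmpeg -vaapi_device /dev/dri/renderD128 \
  -i input.mkv \
  -vf 'format=nv12,hwupload' \
  -c:v h264_vaapi -qp 24 \
  output.mp4
```

                    If this runs at full speed but OBS doesn't, the bottleneck is likely in OBS's capture/compositing path rather than the encoder itself.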



                    • #20
                      Originally posted by Chugworth View Post
                      The problem with HDR is that it's easy to be fooled. On my television, I can't really know for sure if I'm viewing HDR content unless its menu specifically says that it is. I imagine on a laptop with no display indicator for HDR, it would be quite easy to think you're running with HDR when you're really not. That would be one thing to look out for when HDR arrives in Linux. Hopefully there will be some log entries or something that shows HDR is actually active on the display.
                      Granted there's a lot of SDR content masquerading as HDR (it's encapsulated and not true HDR). Disney has been criticized for doing it with The Mandalorian, and they're not the only ones by far.

                      However, part of the problem is just really bad HDR monitors. The good ones are still out of reach for the average person. I have a $300 HDR10 4K monitor from LG with contrast so bad I can barely use it even in SDR without cringing. It'll end up being sold, or used for something where I can marginally tolerate its deficiencies. It's also the only HDR monitor I own, and I probably won't buy another any time soon. I see no reason to do so with media companies & creators faking HDR content to begin with.
                      Last edited by stormcrow; 06 October 2022, 08:32 PM. Reason: more grammar/spelling
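                      One way to spot the repackaged-SDR content described above is to inspect the stream's color metadata with ffprobe (movie.mkv is a placeholder). Genuine HDR10 streams report the smpte2084 (PQ) transfer function with bt2020 primaries; SDR masquerading as HDR typically still shows bt709:

```shell
ffprobe -v error -select_streams v:0 \
  -show_entries stream=color_transfer,color_primaries,color_space \
  -of default=noprint_wrappers=1 movie.mkv
```

                      Metadata alone can't prove the content was actually mastered with HDR's extra dynamic range, but bt709 tags on a "HDR" release are a clear giveaway.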
