
AMD Hiring For Another Open-Source GPU Driver Developer With Multimedia Expertise


  • #11
    Originally posted by Danny3 View Post

    So I really wish AMD would seriously tackle the HDR support issue on Linux, at least for video playback.
    My understanding is, it's not the drivers that hold things back. No apps can take advantage of the hardware: X11 won't ever support HDR, as it's legacy by now, and Wayland doesn't have it implemented yet.
    Some apps could potentially implement it, but they would have to carry a whole HDR implementation within their own code. There is no separate library for HDR on Linux or anything like that.
    And the AMD folks can't be blamed, because it is possible that Intel/Nvidia won't play ball here. Again, I understand that the drivers allow for full HDR content; it's just that programs don't know how to implement it on Linux.



    • #12
      But yeah, HDR is what's keeping me from upgrading my GPU/monitor. I don't want to buy something I can't take advantage of.



      • #13
        No matter what, multimedia or whatever, AMD should hire Lennart Poettering, if only he were happy to accept.



        • #14
          Originally posted by skeevy420 View Post
          Quackdoc FLAC, AptX, MP3, AAC, and other popular/highly used codecs ought to be in hardware, especially in systems where you want as many CPU/GPU cycles as possible going towards the program rather than something superfluous like audio.
          Audio is essentially free. It takes so little to decode and encode audio that putting it in hardware just needlessly complicates the hardware. Even something like a Pi Pico could be used for audio encoding and decoding if someone wrote up the bits for it.



          • #15
            Originally posted by dimko View Post
            My understanding is, it's not the drivers that hold things back. No apps can take advantage of the hardware: X11 won't ever support HDR, as it's legacy by now, and Wayland doesn't have it implemented yet.
            It might take 1-2 years for HDR support in Wayland, as there is steady progress in both Weston and wayland-protocols for HDR.



            • #16
              Originally posted by luno View Post

              It might take 1-2 years for HDR support in Wayland, as there is steady progress in both Weston and wayland-protocols for HDR.
              Steady but slow; 1-2 years seems optimistic at the rate it is going.

              Originally posted by agd5f View Post
              the challenge is coming up with a solution that everyone can agree on.
              I think we would have better luck finding a real unicorn.



              • #17
                Originally posted by agd5f View Post

                FWIW, we've been working with various upstream communities to enable HDR support for several years now. It's not just the driver that needs to expose HDR capabilities, you also have to make use of them in desktop environments. There are lots of ways to expose the underlying hardware capabilities and the challenge is coming up with a solution that everyone can agree on.
                But then how come GNOME and KDE Plasma, the most advanced desktop environments for Linux, which also have Wayland support, still could not add support for HDR?

                And how come on Linux we need HDR support in both the display server (X or Wayland) and the desktop environment (GNOME, KDE Plasma, etc.), when on Windows 7, which definitely doesn't have any kind of HDR support in its display server, MPC-HC + madVR can send the movie and its HDR metadata (over HDMI, from my tests) to an HDR-capable TV, and the TV recognizes the HDR properly?
                This part doesn't make any sense to me.
                Why on Windows can the player send HDR metadata over HDMI without any kind of HDR support from the OS, while on Linux we need multiple layers to have HDR support and to be in sync?
                Why can't a video player on Linux just hand the video stream and its HDR or 3D metadata to the GPU driver and have the driver push it over HDMI or DisplayPort to the monitor or TV?
                Something like a passthrough.
                I don't see why the display server or desktop environment must understand special features like HDR or 3D, or whatever special features will be invented in the future, when the video stream is not for them, and the HDR- or 3D-capable monitor or TV will surely understand and display it accordingly if it is sent in the proper format the device expects.



                • #18
                  Originally posted by Danny3 View Post
                  ...
                  You need the display server to talk to the app to see what HDR spec the app wants; Win7 needs apps to run in exclusive fullscreen mode for HDR. The closest thing to Windows' exclusive mode that we have on Linux when using X or Wayland would be swapping to a TTY and launching the app on said TTY (which already works with mpv + gpu-next and AMDVLK). When using Wayland or Xorg, the app is still going through Wayland or Xorg; you cannot easily bypass it (except, sort of, for DRM leasing, which may be a viable short-term solution?).

                  Neither KDE nor Ubuntu can support it because Wayland needs to support it first, as you need a way for an app to talk to the compositor so the compositor can figure out what on earth to do with it, since it's not just "HDR plox".



                  • #19
                    Originally posted by Danny3 View Post

                    But then how come GNOME and KDE Plasma, the most advanced desktop environments for Linux, which also have Wayland support, still could not add support for HDR?
                    You need to design a protocol to convey the information between apps and the kernel driver. The compositor has exclusive control of the KMS interface when it's running, so arbitrary apps can't just talk directly to the KMS interface. Additionally, HDR has a lot of implications for other things like color management, gamma/degamma, etc., so those need to be factored in as well so you don't end up making incompatible interfaces.

                    Next, how do you deal with a mix of SDR and HDR applications running on the same desktop? Different hardware also provides different ways of handling HDR, and you'd want a protocol that can do as much in hardware as possible on as many SoCs as possible. You don't want a protocol that ends up having to do a lot of stuff on the CPU or in shaders when you could use fixed-function hardware in the display blocks. The trade-off is that using that fixed-function hardware may limit the flexibility of the protocol.

                    Finally, Wayland is just a protocol, so even if the protocol is extended to support HDR, etc., it would still need to be implemented in the compositors that speak the protocol (like GNOME Shell and KWin), and then finally apps would have to be updated to make use of it.



                    • #20
                      Originally posted by agd5f View Post
                      The compositor has exclusive control of the KMS interface when it's running, so arbitrary apps can't just talk directly to the KMS interface.
                      ...
                      What is the viability of using DRM leasing for this? I'm not familiar with how it works, but it has been brought up here and there, and a yes/no answer from someone who knows would be quite beneficial.

                      For programs like mpv that already work with HDR when running directly from a TTY (using --vo=gpu-next --gpu-api=vulkan --gpu-context=displayvk --target-colorspace-hint), would they be able to use DRM leasing to gain HDR capabilities instead of needing to launch mpv from a TTY and use DRM output directly?
