
Wayland 1.21 Alpha Finally Introduces High-Resolution Scroll Wheel Support


  • #81
    Originally posted by oiaohm View Post

    Sounds about right. It took about 20 years to get somewhat functional color management on SDR screens; it looks like HDR is going to take even longer to get sorted out.

    There are a lot of issues people point to with Wayland where, when you look closer, there was no good implementation anywhere to copy. Those areas pretty much have to start from scratch, in the hope of getting it right this time. And starting from scratch on something as complex as color management is a path to years of arguments just to get the core design sorted out before any code is written.
    Honestly, I think we do need a short-term solution. A temporary solution that wouldn't fuck us over down the line would be DRM leasing, but it's unreasonable to ask apps to add DRM support. If possible, I think HDR with direct scanout would work fine and shouldn't get in the way of things down the line.



    • #82
      Originally posted by Quackdoc View Post
      Honestly, I think we do need a short-term solution. A temporary solution that wouldn't fuck us over down the line would be DRM leasing, but it's unreasonable to ask apps to add DRM support. If possible, I think HDR with direct scanout would work fine and shouldn't get in the way of things down the line.
      That's the problem: finding a partial solution that doesn't end up cursed in the future. DRM leasing is basically fullscreen-only; we don't have a windowed mode for it.

      DRM support would not be a big problem if Nvidia were on the same page as everyone else with its implementation. The reality is that if direct DRM output were nice, stable, and uniform, applications most likely would not have to pick up the pieces; the toolkits would.

      The graphics stack is a real headache.



      • #83
        Originally posted by oiaohm View Post

        That's the problem: finding a partial solution that doesn't end up cursed in the future. DRM leasing is basically fullscreen-only; we don't have a windowed mode for it.

        DRM support would not be a big problem if Nvidia were on the same page as everyone else with its implementation. The reality is that if direct DRM output were nice, stable, and uniform, applications most likely would not have to pick up the pieces; the toolkits would.

        The graphics stack is a real headache.
        Direct scanout is not DRM leasing. Here is a brief write-up from Zamunda (sorry if I spelt it wrong) from the gaming-on-Wayland blog post:

        application contents can be put directly on the screen with the display hardware (which is called “direct scanout”. You may have heard the X related term “unredirection” used for the same thing, too). This removes unnecessary copies and provides some nice efficiency and latency benefits - even in windowed mode, if the compositor and hardware support it
        The goal, if possible, would be to use this to let the app do its own color management, since many apps already do to some degree, and to use direct scanout, even if only when fullscreen; all the compositor would need to do is send the HDR metadata.
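To make the direct-scanout idea from the quoted blog post concrete, here is a minimal sketch of the eligibility check a compositor might make before putting an app's buffer straight on a hardware plane. All names here are hypothetical; real compositors also check buffer formats, modifiers, transforms, and plane capabilities.

```python
# Hypothetical sketch of a compositor's direct-scanout eligibility check.
# Real compositors check far more: formats, modifiers, transforms, planes.
from dataclasses import dataclass

@dataclass
class Surface:
    x: int
    y: int
    width: int
    height: int
    opaque: bool

@dataclass
class Output:
    width: int
    height: int

def can_direct_scanout(surface: Surface, output: Output,
                       others_visible: bool) -> bool:
    """A buffer can bypass composition only if nothing needs to be
    drawn on top of or behind it on this output."""
    covers_output = (surface.x == 0 and surface.y == 0 and
                     surface.width == output.width and
                     surface.height == output.height)
    return covers_output and surface.opaque and not others_visible

# A fullscreen opaque surface with nothing else visible qualifies:
print(can_direct_scanout(Surface(0, 0, 1920, 1080, True),
                         Output(1920, 1080), False))  # True
```

Note that nothing here is HDR-specific: the same check works in windowed mode if the hardware exposes overlay planes, which is why the blog post says direct scanout can help even outside fullscreen.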



        • #84
          Originally posted by Quackdoc View Post
          Direct scanout is not DRM leasing. Here is a brief write-up from Zamunda (sorry if I spelt it wrong) from the gaming-on-Wayland blog post:

          The goal, if possible, would be to use this to let the app do its own color management, since many apps already do to some degree, and to use direct scanout, even if only when fullscreen; all the compositor would need to do is send the HDR metadata.
          With direct scanout, the compositor needs to be HDR-aware and know when to switch back. DRM leasing could be built around the kernel doing the TTY switch, so the compositor would not need to know the HDR state at all; instead the kernel would manage the HDR metadata, which means extending the DRM layers in the kernel to handle it.

          Yes, of course, direct scanout could build on what you would need for DRM leasing of a single monitor.

          And yes, with direct scanout there are still questions about how to do it, and the different GPU vendors would need to agree.



          • #85
            Originally posted by oiaohm View Post

            With direct scanout, the compositor needs to be HDR-aware and know when to switch back. DRM leasing could be built around the kernel doing the TTY switch, so the compositor would not need to know the HDR state at all; instead the kernel would manage the HDR metadata, which means extending the DRM layers in the kernel to handle it.

            Yes, of course, direct scanout could build on what you would need for DRM leasing of a single monitor.

            And yes, with direct scanout there are still questions about how to do it, and the different GPU vendors would need to agree.
            We have a way for vendors to agree; it's called DRM, and it already exists. If a vendor doesn't follow it, don't support that vendor. Seems simple to me.

            And define "HDR-aware". If by HDR-aware you mean adding a pipe between a direct-scanout app and the backend, then sure, but that shouldn't really be all that much work. It really wouldn't be that hard: a simple "if HDR metadata is received from a fullscreen app, forward it to the appropriate DRM interface; when the app terminates or is no longer fullscreen, revert to SDR".
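That "forward, then revert" rule is essentially a two-state machine. A minimal sketch, with entirely hypothetical names (this is not any real compositor's API, and the DRM connector calls are stubbed out as returned actions):

```python
# Hypothetical sketch of the "forward HDR metadata, revert to SDR" rule.
# Names are invented; real code would set the DRM connector's
# HDR metadata property instead of returning action tuples.
class HdrForwarder:
    def __init__(self):
        # What is currently programmed on the connector; None means SDR.
        self.active_metadata = None

    def on_fullscreen_metadata(self, metadata):
        # Fullscreen app handed us HDR metadata: pass it straight through.
        self.active_metadata = metadata
        return ("set_connector_hdr_metadata", metadata)

    def on_app_gone_or_windowed(self):
        # App terminated or left fullscreen: revert the output to SDR.
        self.active_metadata = None
        return ("clear_connector_hdr_metadata",)

fwd = HdrForwarder()
print(fwd.on_fullscreen_metadata({"max_luminance_nits": 1000}))
print(fwd.on_app_gone_or_windowed())
print(fwd.active_metadata)  # None: output is back to SDR
```

The hard parts a real compositor faces are not captured here: deciding *when* the app counts as fullscreen, racing against page flips, and handling multiple outputs.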

            Wayland is designed around things like this; the issue is that it doesn't exist yet. Why? I'm sure part of it is that it's difficult (in fact you can see plenty of progress being made by tracking the Weston issue), but I can't help but feel it also has to do with how slow Wayland development is for things that matter to the average desktop user. We still don't have a fractional scaling protocol, despite it being a massive pain point and arguably simpler to implement, and it was only recently that we got a proposal.

            Wayland is about doing things correctly, for sure, but it is incredibly slow-moving. The Wayland tablet protocol has been unstable for ages, despite pretty much every compositor implementing it because it is quite important. Don't get me wrong, I know the devs are working hard on it, but I also know many of them are probably working hard on other, likely more important, things. I don't think enough hours are being put into HDR myself, but my development skills are a joke, so I certainly can't help.

            But for people trying to use Linux for HDR work, or for content consumption, none of that matters. It doesn't matter that HDR is a buggy shitshow on Windows and Mac; what matters is having something usable (even if barely), and at this rate I doubt we will get something useful for a long time.

            Personally I resent using pretty much anything but Linux when I can help it, so I am fine with running Kodi and mpv on another TTY. (A dual-seat setup could be ideal for this, but dual seat is a hassle unless you use VMs, which are their own can of worms imo; does anyone know of an "easy" solution for this aside from VMs?) But I certainly won't be telling anyone to migrate their living-room PCs to Linux anytime soon.



            • #86
              I disagree with the DRM direct scanout solution.
              That way every application needs to know how to do the color transform to the output display; control over how this is done would be per-application, a broken mess, inconsistent between applications.

              IMHO the best solution would be that, when registering a surface, in addition to asking for a specific pixel format like RGB888, the application tells the compositor about the colorspace and gamma curve the content is encoded in.
              The compositor should do the transformation to the output device's colorspace / gamma curves.

              That way an app can request a 10bpp surface with, say, the REC.2020 colorspace and an HLG or PQ gamma curve, and the compositor can decide how this content is rendered to the output device, which might have 12bpp and a different gamma curve.
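For concreteness, the kind of transform the compositor would own can be sketched numerically. The PQ constants below are from SMPTE ST 2084, and the 3x3 matrix is the commonly used BT.2020-to-BT.709 primary conversion; tone mapping and gamut clipping, which a real compositor must also do, are deliberately omitted.

```python
# SMPTE ST 2084 (PQ) EOTF constants.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32     # 18.8515625
c3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(e: float) -> float:
    """PQ EOTF: encoded value in [0, 1] -> luminance in cd/m^2."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# Linear-light RGB primary conversion, BT.2020 -> BT.709.
# Rows sum to ~1, so white maps to white.
BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def to_bt709(rgb):
    """Apply the 3x3 matrix to a linear-light BT.2020 RGB triple."""
    return [sum(row[i] * rgb[i] for i in range(3)) for row in BT2020_TO_BT709]

print(pq_eotf(1.0))            # 10000.0 (PQ peak is 10,000 cd/m^2)
print(to_bt709([1.0, 1.0, 1.0]))  # approximately [1, 1, 1]
```

Saturated BT.2020 colors come out of the matrix with negative or greater-than-one components; deciding what to do with those out-of-gamut values is exactly the per-output policy the compositor is being asked to centralize.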

              We will end up in a broken-mess situation like today if all the color and gamma transformations are implemented by individual applications/libraries, as they will all behave somewhat differently, support different sets of output devices, and so on. The right place to do it is in the compositor, and the way to signal the color/gamma information of an output surface is via a standard protocol like Wayland.

              Yes, it's moving slowly, but it should be done right. Implementing a tunnel around the compositor via direct DRM scanout will lead to a broken, fragmented mess that you have to care about forever, as there will always be that one old application which does not implement the correct Wayland protocol but uses the "old" DRM way.



              • #87
                Originally posted by Spacefish View Post
                I disagree with the DRM direct scanout solution.
                That way every application needs to know how to do the color transform to the output display; control over how this is done would be per-application, a broken mess, inconsistent between applications.

                IMHO the best solution would be that, when registering a surface, in addition to asking for a specific pixel format like RGB888, the application tells the compositor about the colorspace and gamma curve the content is encoded in.
                The compositor should do the transformation to the output device's colorspace / gamma curves.

                That way an app can request a 10bpp surface with, say, the REC.2020 colorspace and an HLG or PQ gamma curve, and the compositor can decide how this content is rendered to the output device, which might have 12bpp and a different gamma curve.

                We will end up in a broken-mess situation like today if all the color and gamma transformations are implemented by individual applications/libraries, as they will all behave somewhat differently, support different sets of output devices, and so on. The right place to do it is in the compositor, and the way to signal the color/gamma information of an output surface is via a standard protocol like Wayland.

                Yes, it's moving slowly, but it should be done right. Implementing a tunnel around the compositor via direct DRM scanout will lead to a broken, fragmented mess that you have to care about forever, as there will always be that one old application which does not implement the correct Wayland protocol but uses the "old" DRM way.
                Everything you wrote here could be implemented at the DMABUF level. Remember, you most likely want your colour conversions GPU-accelerated, so being able to attach color information to the output buffer would be the best outcome.

                You have to remember DRM is part of the Linux graphics stack; the DMABUF fd that everyone is basically using with Wayland sits lower down. DMABUF fd metadata is where you most likely want to store the colour-space information.

                DRM direct scanout means you would want the color-space processing interfaces to be GPU-side if possible. Remember, the compositor needs to function on top of DRM, so DRM needs all the right levels of functionality. DRM direct work would be imperfect at first, but it would lay the foundations for GPU-accelerated colour conversions and uniform colour when running compositors on two TTYs on the same monitor.

                This problem extends outside the Wayland protocol as well, if you want it done well.



                • #88
                  Originally posted by Spacefish View Post
                  I disagree with the DRM direct scanout solution.
                  That way every application needs to know how to do the color transform to the output display; control over how this is done would be per-application, a broken mess, inconsistent between applications.
                  IMO, when working with HDR or color-accurate stuff, applications SHOULD do the color management themselves. And it's not like this would hinder other applications down the line; the idea is to give us something we can actually use. Both DRM leasing and direct scanout capabilities would not conflict whatsoever with any proper implementations down the line if done right, and in fact they are necessary anyway, since a user doing colour-accurate work should NOT trust the compositor to do it for them.

                  IMHO the best solution would be that, when registering a surface, in addition to asking for a specific pixel format like RGB888, the application tells the compositor about the colorspace and gamma curve the content is encoded in.
                  The compositor should do the transformation to the output device's colorspace / gamma curves.

                  That way an app can request a 10bpp surface with, say, the REC.2020 colorspace and an HLG or PQ gamma curve, and the compositor can decide how this content is rendered to the output device, which might have 12bpp and a different gamma curve.

                  We will end up in a broken-mess situation like today if all the color and gamma transformations are implemented by individual applications/libraries, as they will all behave somewhat differently, support different sets of output devices, and so on. The right place to do it is in the compositor, and the way to signal the color/gamma information of an output surface is via a standard protocol like Wayland.

                  Yes, it's moving slowly, but it should be done right. Implementing a tunnel around the compositor via direct DRM scanout will lead to a broken, fragmented mess that you have to care about forever, as there will always be that one old application which does not implement the correct Wayland protocol but uses the "old" DRM way.
                  I want to be able to use the compositor as little as possible. And besides, a broken mess is better than having nothing, which is what we have now. I should be able to trust the app to do its own color management if it wants to, and I think people serious about colour work need the choice to do it themselves, since colour work is as much an art as it is a science.



                  • #89
                    Quackdoc How does the Kodi HDR passthrough work? I have installed AMDVLK and run Kodi in a TTY; what next?



                    • #90
                      Originally posted by s9209122222 View Post
                      Quackdoc How does the Kodi HDR passthrough work? I have installed AMDVLK and run Kodi in a TTY; what next?
                      I'd only used it on an ARM device, but apparently LibreELEC has support. I don't think most distros build Kodi with support for it: FFmpeg needs to be built with libdrm, then Kodi needs to be built against that. I think there is work being done to make it all easier, but when I went to re-verify it working I ran into build errors, because FFmpeg can't keep a stable API for the life of it -_-

                      But basically you need to make sure you have the DRMPRIME renderer.

                      I'd seen some people recommend building these against master, but I'm still running into build issues: https://github.com/lrusak/xbmc/commi...no-ffmpeg-bump

