Wayland 1.21 Alpha Finally Introduces High-Resolution Scroll Wheel Support
Originally posted by Quackdoc View Post
Honestly, I think we do need a short-term solution. A temporary solution that wouldn't fuck us over down the line would be DRM leasing, but it's unreasonable to ask apps to add DRM support. I think that, if possible, HDR with direct scanout would work fine and shouldn't get in the way of things down the line.
DRM support would not be a big problem if Nvidia was on the same page as everyone else with their implementation. The reality is that if direct DRM output were nice, stable, and uniform, applications most likely would not have to pick up the bits; it would be the toolkits.
The graphics stack is a true headache.
Originally posted by oiaohm View Post
That's the problem: trying to pick a partial solution that doesn't end up being cursed in the future. DRM leasing is basically fullscreen-only; we don't have a windowed mode of it.
DRM support would not be a big problem if Nvidia was on the same page as everyone else with their implementation. The reality is that if direct DRM output were nice, stable, and uniform, applications most likely would not have to pick up the bits; it would be the toolkits.
The graphics stack is a true headache.
application contents can be put directly on the screen with the display hardware (which is called “direct scanout”. You may have heard the X related term “unredirection” used for the same thing, too). This removes unnecessary copies and provides some nice efficiency and latency benefits - even in windowed mode, if the compositor and hardware support it
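The per-frame decision the quoted blog passage describes can be sketched roughly like this. All names (`Surface`, `Output`, `present`) are illustrative, not a real compositor API; the point is only that direct scanout is a conditional fast path that skips the composition copy:

```python
from dataclasses import dataclass

@dataclass
class Surface:
    rect: tuple           # (x, y, w, h) in output coordinates
    opaque: bool
    buffer_format: str    # e.g. "XRGB8888"
    buffer: object = None # the client's buffer handle

@dataclass
class Output:
    rect: tuple
    primary_plane_formats: tuple  # formats the display plane accepts

def can_direct_scanout(surface, output):
    # Direct scanout works when one opaque surface exactly covers the
    # output and the display plane accepts its buffer format.
    return (surface.rect == output.rect
            and surface.opaque
            and surface.buffer_format in output.primary_plane_formats)

def present(surfaces, output):
    # Per-frame choice: attach the client's buffer to the plane directly
    # (no copy, lower latency), or fall back to GPU composition.
    if len(surfaces) == 1 and can_direct_scanout(surfaces[0], output):
        return ("scanout", surfaces[0].buffer)
    return ("composite", None)  # placeholder for the composited buffer
```

With hardware overlay planes a real compositor can also scan out a window that does not cover the whole output, which is why the blog notes this works "even in windowed mode, if the compositor and hardware support it".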
Originally posted by Quackdoc View Post
Direct scanout is not DRM leasing. Here is a brief write-up from Zamunda (sorry if I spelt it wrong) from the gaming on Wayland blog post.
The goal would be, if possible, to use this to allow the app to do its own color management, since many apps already do to some degree, and to use direct scanout even if fullscreen; all the compositor would need to do is send HDR metadata.
Yes, of course direct scanout could be based on what you would need for DRM leasing on a single monitor.
Yes, with direct scanout there are still questions of how, and the different GPU vendors would need to agree.
Originally posted by oiaohm View Post
With direct scanout the compositor needs to be HDR-aware and know to switch it back. DRM leasing could be done around the kernel performing a TTY switch, so the compositor would not need to know what the HDR state is; that way the kernel manages the HDR metadata, which means extending the DRM layers in the kernel to handle it.
Yes, of course direct scanout could be based on what you would need for DRM leasing on a single monitor.
Yes, with direct scanout there are still questions of how, and the different GPU vendors would need to agree.
And define "HDR aware". If by HDR-aware you mean adding a pipe between a direct-scanout app and the backend, then sure, but that shouldn't really be all that much work. It really wouldn't be that hard; a simple "if HDR metadata is received from a fullscreen app, forward it to the appropriate DRM interface; when the app is terminated or no longer fullscreen, revert to SDR".
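The forward-then-revert rule above is small enough to sketch as a state machine. This is a hypothetical illustration, not a real compositor interface; `set_metadata` stands in for programming the connector's `HDR_OUTPUT_METADATA` property in DRM (with `None` meaning "back to SDR"):

```python
class HdrPassthrough:
    """Tracks whether a fullscreen HDR client currently owns the output
    and reverts the display to SDR when that client goes away."""

    def __init__(self, set_metadata):
        # set_metadata(payload): hypothetical hook that programs the DRM
        # connector's HDR metadata; set_metadata(None) reverts to SDR.
        self._set = set_metadata
        self.active = False

    def on_fullscreen_metadata(self, metadata):
        # Forward the app's HDR metadata to the display unmodified.
        self._set(metadata)
        self.active = True

    def on_app_gone_or_windowed(self):
        # App terminated or left fullscreen: revert the output to SDR.
        if self.active:
            self._set(None)
            self.active = False
```

The compositor-side complexity Quackdoc is glossing over lives mostly in what happens around this: deciding when a surface qualifies, and what to do when a second HDR client appears.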
Wayland is designed around things like this; the issue is that it doesn't exist yet. Why? I'm sure it has to do with it being difficult; in fact you can see plenty of progress being made by tracking the Weston issue. But I can't help but feel it also has to do with how slow Wayland development is for things that matter to the average desktop user. We still don't have a fractional scaling protocol despite it being a massive pain point and arguably simpler to implement, and it was only recently that we got a proposal.
Wayland is about doing things correctly, for sure, but it is incredibly slow-moving. The Wayland tablet protocol has been in unstable for ages, despite pretty much every compositor implementing it because it is something quite important. Don't get me wrong, I know the devs are working hard on it, but I also know that many of them are probably also working hard on other, probably more important things. I don't think enough hours are being put into HDR myself, but my development skills are a joke, so I certainly can't help.
But for people who are trying to use Linux for working with HDR, or for content consumption, none of that matters. It doesn't matter that HDR is a buggy shitshow on Windows and Mac; what matters is having something usable (even if barely). And at this rate I doubt we will get something usable for a long time.
Personally I resent using pretty much anything but Linux when I can help it, so I am fine with running Kodi and mpv on another TTY. (A dual-seat setup could be ideal for this, but dual seat is a hassle unless you use VMs, which is its own can of worms imo; does anyone know of an "easy" solution for this aside from VMs?) But I certainly won't be telling anyone to migrate their living-room PCs to Linux anytime soon.
I disagree on the DRM direct scanout solution.
That way every application needs to know how to do the color transform for the output display; control over how this is done would be per application / a broken mess / inconsistent between applications.
IMHO the best solution would be, when registering a surface, to tell the compositor the colorspace and gamma curve the content is encoded in, in addition to asking for a specific pixel format like RGB888.
The compositor should do the transformation to the output device's colorspace / gamma curves.
That way an app can request a 10bpp surface with, say, a REC.2020 colorspace and an HLG or PQ gamma curve, and the compositor can decide how this content is rendered to the output device, which might have 12bpp and another gamma curve.
We will end up in a broken-mess situation like the one we are in today if all the color and gamma transformations are implemented by individual applications / libraries, as they will all behave somewhat differently, support different sets of output devices, and so on. The right place to do it is in the compositor, and the way to signal the color / gamma information of an output surface is via a standard protocol like Wayland.
Yes, it's moving slowly, but it should be done right. Implementing a tunnel around the compositor via direct DRM scanout will lead to a broken, fragmented mess which you have to care about forever, as there will always be that one old application which does not implement the correct Wayland protocol but uses the "old" DRM way.
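The surface-registration idea described above can be sketched as a small data model. All names here are illustrative, not any real Wayland protocol; the point is that the client only declares how its content is encoded, and the compositor alone decides whether a conversion to the output encoding is needed:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SurfaceDescription:
    # Declared by the client when registering the surface.
    pixel_format: str   # e.g. "RGB101010" (10bpp) or "RGB888"
    colorspace: str     # e.g. "REC2020" or "REC709"
    transfer: str       # e.g. "PQ", "HLG", or "sRGB"

@dataclass(frozen=True)
class OutputDescription:
    # Known only to the compositor, from the display's capabilities.
    bits_per_channel: int  # e.g. 12
    colorspace: str
    transfer: str

def needs_conversion(surface: SurfaceDescription,
                     out: OutputDescription) -> bool:
    """The compositor converts whenever surface and output encodings
    differ; the application never needs to know what the display wants."""
    return (surface.colorspace, surface.transfer) != (out.colorspace,
                                                      out.transfer)
```

Centralizing the decision this way is exactly what keeps the behavior consistent across applications: two apps submitting the same REC.2020/PQ content get the same rendering on a given display, because only one piece of code does the transform.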
Originally posted by Spacefish View Post
I disagree on the DRM direct scanout solution.
That way every application needs to know how to do the color transform for the output display; control over how this is done would be per application / a broken mess / inconsistent between applications.
IMHO the best solution would be, when registering a surface, to tell the compositor the colorspace and gamma curve the content is encoded in, in addition to asking for a specific pixel format like RGB888.
The compositor should do the transformation to the output device's colorspace / gamma curves.
That way an app can request a 10bpp surface with, say, a REC.2020 colorspace and an HLG or PQ gamma curve, and the compositor can decide how this content is rendered to the output device, which might have 12bpp and another gamma curve.
We will end up in a broken-mess situation like the one we are in today if all the color and gamma transformations are implemented by individual applications / libraries, as they will all behave somewhat differently, support different sets of output devices, and so on. The right place to do it is in the compositor, and the way to signal the color / gamma information of an output surface is via a standard protocol like Wayland.
Yes, it's moving slowly, but it should be done right. Implementing a tunnel around the compositor via direct DRM scanout will lead to a broken, fragmented mess which you have to care about forever, as there will always be that one old application which does not implement the correct Wayland protocol but uses the "old" DRM way.
You have to remember DRM is part of the Linux graphics stack; the DMABUF fd that everyone is basically using with Wayland sits lower down. DMABUF fd metadata is where you would most likely want to store the colour space information.
DRM direct scanout means you would want the processing interfaces for colour space to be GPU-side if possible. Remember the compositor needs to function on top of DRM, so DRM needs all the correct levels of functionality. DRM-direct work would be imperfect at first, but it would lay the foundations for GPU-accelerated colour conversions and uniform colour when running two TTYs of compositors on the same monitor.
This problem extends outside the Wayland protocol as well if you want it done well.
But basically you need to make sure you have the DRMPRIME renderer.
I'd seen some people recommend building these against master, but I'm still running into build issues: https://github.com/lrusak/xbmc/commi...no-ffmpeg-bump