Wayland 1.21 Alpha Finally Introduces High-Resolution Scroll Wheel Support


  • oiaohm
    replied
    Originally posted by Quackdoc View Post
    I see DRM leasing as the nuclear option that devs implement to get around Wayland's... slow development, and that is IF DRM leasing supports metadata passthrough, which I have no idea if it does
    DRM leasing gives you total control of the output, so you are able to use all the interfaces as if you were the compositor. Yes, when you return the lease the compositor/X11 server is meant to restore everything to the way it was before. Yes, it's the nuclear option, but it's the generic option that works whether you are running X11 or Wayland.

    Basically, DRM leasing gives you everything direct scanout does, but with a polite way to ask the compositor/X11 server to let the device go so the application can use it.

    Leave a comment:


  • Quackdoc
    replied
    Originally posted by tildearrow View Post

    DRM leasing is an extreme solution to a simple problem...
    We have direct scanout already.

    Doing the former means giving developers an extra burden.
    I see DRM leasing as the nuclear option that devs implement to get around Wayland's... slow development, and that is IF DRM leasing supports metadata passthrough, which I have no idea if it does

    Leave a comment:


  • tildearrow
    replied
    Originally posted by oiaohm View Post

    That's the problem: trying to pick a partial solution that doesn't end up being cursed in the future. DRM leasing is basically fullscreen only; we don't have a windowed mode of it.

    DRM support would not be a big problem if Nvidia were on the same page as everyone else with their implementation. The reality is that if direct DRM output were nice, stable, and uniform, applications most likely would not have to pick up the pieces; the toolkits would.

    The graphics stack is a true headache.
    DRM leasing is an extreme solution to a simple problem...
    We have direct scanout already.

    Doing the former means giving developers an extra burden.

    Leave a comment:


  • Quackdoc
    replied
    Originally posted by s9209122222 View Post
    Quackdoc How does the Kodi HDR passthrough work? I have installed AMDVLK and run Kodi in a TTY; what next?
    I'd only used it on an ARM device, but apparently LibreELEC has support. I don't think most distros build Kodi with support for it. FFmpeg needs to be built with libdrm, then Kodi needs to be built against that. I think there is work being done to make it all easier to do, but when I went to re-verify it working, I ran into build errors, because FFmpeg can't keep a stable API for the life of it -_-

    But basically you need to make sure you have the DRMPRIME renderer.

    I'd seen some people recommend building these against master, but I'm still running into build issues: https://github.com/lrusak/xbmc/commi...no-ffmpeg-bump
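    For reference, a rough sketch of the build steps involved. Treat this as a build-configuration fragment, not a recipe: the flags shown are the standard FFmpeg configure option and Kodi CMake options for a GBM build, but exact paths, dependencies, and versions vary per distro.

```shell
# Sketch: enable DRM support in FFmpeg, then build Kodi against it for TTY/GBM use.
# Verify flags against each project's docs for your versions.

# 1. Build FFmpeg with libdrm so DRMPRIME frames are available
cd ffmpeg
./configure --enable-libdrm
make -j"$(nproc)"
sudo make install

# 2. Build Kodi with the GBM windowing backend (runs directly on DRM/KMS in a TTY)
cd ../xbmc
cmake -B build -DCORE_PLATFORM_NAME=gbm -DAPP_RENDER_SYSTEM=gles
cmake --build build -j"$(nproc)"
```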

    Leave a comment:


  • s9209122222
    replied
    Quackdoc How does the KODI HDR passthrough work? I have installed AMDVLK, and run KODI in TTY, what next?

    Leave a comment:


  • Quackdoc
    replied
    Originally posted by Spacefish View Post
    I disagree on the DRM direct scanout solution.
    That way every application needs to know how to do the color transform for the output display; controls on how this is done would be per application / a broken mess / inconsistent between applications.
    IMO when working with HDR or color-accurate stuff, applications SHOULD do the color management themselves. And it's not like this would hinder other applications down the line; the idea is to give us something we can actually use. Both DRM leasing and direct scanout capabilities would not conflict whatsoever with any proper implementations down the line if done right, and in fact are something necessary anyway, since a user doing colour-accurate work should NOT trust the compositor to do it for them.

    IMHO the best solution would be, when registering a surface, in addition to asking for a specific pixel format like RGB888, to also tell the compositor the colorspace and gamma curve the content is encoded in.
    The compositor should do the transformation to the output device's colorspace / gamma curves.

    That way an app can request a 10bpp surface with, say, the Rec.2020 colorspace and an HLG or PQ gamma curve, and the compositor can decide how this content is rendered to the output device, which might have 12bpp and another gamma curve.

    We will end up in a broken-mess situation like the one we are in today if all the color and gamma transformations are implemented by individual applications / libraries, as they will all behave somewhat differently, support different sets of output devices, and so on. The right place to do it is in the compositor, and the way to signal the color / gamma information of an output surface is via a standard protocol like Wayland.

    Yes, it's moving slowly, but it should be done right; implementing a tunnel around the compositor via direct DRM scanout will lead to a broken, fragmented mess which you have to care about forever, as there will always be that one old application which does not implement the correct Wayland protocol but uses the "old" DRM way.
    I want to be able to use the compositor as little as possible. And besides, a broken mess is better than having nothing, which is what we have now. I should be able to trust the app to do its own color management if it wants to, and I think it is a necessity for people serious about colour work to have the choice to do it, since colour work is as much an art as it is a science.

    Leave a comment:


  • oiaohm
    replied
    Originally posted by Spacefish View Post
    I disagree on the DRM direct scanout solution.
    That way every application needs to know how to do the color transform for the output display; controls on how this is done would be per application / a broken mess / inconsistent between applications.

    IMHO the best solution would be, when registering a surface, in addition to asking for a specific pixel format like RGB888, to also tell the compositor the colorspace and gamma curve the content is encoded in.
    The compositor should do the transformation to the output device's colorspace / gamma curves.

    That way an app can request a 10bpp surface with, say, the Rec.2020 colorspace and an HLG or PQ gamma curve, and the compositor can decide how this content is rendered to the output device, which might have 12bpp and another gamma curve.

    We will end up in a broken-mess situation like the one we are in today if all the color and gamma transformations are implemented by individual applications / libraries, as they will all behave somewhat differently, support different sets of output devices, and so on. The right place to do it is in the compositor, and the way to signal the color / gamma information of an output surface is via a standard protocol like Wayland.

    Yes, it's moving slowly, but it should be done right; implementing a tunnel around the compositor via direct DRM scanout will lead to a broken, fragmented mess which you have to care about forever, as there will always be that one old application which does not implement the correct Wayland protocol but uses the "old" DRM way.
    Everything you wrote here could be implemented at the DMA direct layer. Remember, you most likely want your colour conversions to be GPU-accelerated, so having color information attachable to the output buffer would be the best outcome.

    You have to remember DRM is part of the Linux graphics stack; the DMABUF fd that everyone is basically using with Wayland sits lower down. DMABUF fd metadata is where you most likely want to store the colour-space information.

    DRM direct scanout means you would want the processing interfaces for color space to be GPU-side if possible. Remember, the compositor needs to function on top of DRM, so DRM needs all the correct levels of function. DRM direct work would be imperfect at first, but it would lay the foundations for GPU-accelerated colour conversions and uniform colour when running two TTYs of compositors on the same monitor.

    This problem extends outside the Wayland protocol as well if you want it done well.
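    As a concrete illustration of the kind of conversion being argued over (wherever it ends up running, compositor or GPU shader), here is a small sketch applying the commonly published BT.2020-to-BT.709 linear-light gamut matrix (cf. ITU-R BT.2087, coefficients rounded). The function names are mine; a real pipeline would also handle transfer functions and gamut mapping of out-of-range values.

```python
# Sketch: linear-light gamut conversion from BT.2020 primaries to BT.709.
# Coefficients are the widely published BT.2087 values (rounded); real
# pipelines also apply EOTF/OETF curves and clip or map out-of-gamut colours.

BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def convert_2020_to_709(rgb):
    """Apply the 3x3 gamut matrix to one linear RGB triple."""
    return tuple(
        sum(row[i] * rgb[i] for i in range(3))
        for row in BT2020_TO_BT709
    )

# White stays white (each row sums to ~1); a saturated BT.2020 red lands
# outside the BT.709 gamut, i.e. above 1.0 in the red channel.
white = convert_2020_to_709((1.0, 1.0, 1.0))
red = convert_2020_to_709((1.0, 0.0, 0.0))
```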

    Leave a comment:


  • Spacefish
    replied
    I disagree on the DRM direct scanout solution.
    That way every application needs to know how to do the color transform for the output display; controls on how this is done would be per application / a broken mess / inconsistent between applications.

    IMHO the best solution would be, when registering a surface, in addition to asking for a specific pixel format like RGB888, to also tell the compositor the colorspace and gamma curve the content is encoded in.
    The compositor should do the transformation to the output device's colorspace / gamma curves.

    That way an app can request a 10bpp surface with, say, the Rec.2020 colorspace and an HLG or PQ gamma curve, and the compositor can decide how this content is rendered to the output device, which might have 12bpp and another gamma curve.

    We will end up in a broken-mess situation like the one we are in today if all the color and gamma transformations are implemented by individual applications / libraries, as they will all behave somewhat differently, support different sets of output devices, and so on. The right place to do it is in the compositor, and the way to signal the color / gamma information of an output surface is via a standard protocol like Wayland.

    Yes, it's moving slowly, but it should be done right; implementing a tunnel around the compositor via direct DRM scanout will lead to a broken, fragmented mess which you have to care about forever, as there will always be that one old application which does not implement the correct Wayland protocol but uses the "old" DRM way.
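    For context on what a "PQ gamma curve" means here: the SMPTE ST 2084 (PQ) EOTF maps a normalized 0-1 code value to absolute luminance up to 10,000 nits, which is exactly the kind of transfer function the compositor (or app) must decode before converting between surfaces. A minimal sketch, using the constants published in the spec:

```python
# Sketch: SMPTE ST 2084 (PQ) EOTF, mapping a normalized code value in [0, 1]
# to absolute display luminance in cd/m^2 (nits). Constants per ST 2084.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code: float) -> float:
    """Decode a PQ-encoded value to luminance in nits."""
    e = code ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

# pq_eotf(0.0) -> 0.0 nits; pq_eotf(1.0) -> 10000.0 nits.
# Note how non-linear it is: code 0.5 decodes to only ~92 nits.
```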

    Leave a comment:


  • Quackdoc
    replied
    Originally posted by oiaohm View Post

    With direct scanout the compositor needs to be HDR-aware and know to switch it back. DRM leasing could be handled around the kernel doing a TTY switch, so the compositor does not need to know what the HDR state is; instead it becomes the kernel that manages the HDR metadata, by extending the DRM layers in the kernel to handle it.

    Yes, of course direct scanout could be based on what you would need for DRM leasing on a single monitor.

    Yes, with direct scanout there are still questions of how, and a need for the different GPU vendors to agree.
    We have a way for vendors to agree; it's called DRM. It already exists, and if a vendor doesn't follow it, don't support said vendor. Seems simple to me.

    And define "HDR aware". If by HDR-aware you mean adding a pipe between a direct-scanout app and the backend, then sure, but that shouldn't really be all that much work. It really wouldn't be that hard: a simple "if we receive HDR metadata from a fullscreen app, forward the HDR metadata to the appropriate DRM interface; when the app is terminated or no longer fullscreen, revert to SDR".
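    To make "forward the HDR metadata to the appropriate DRM interface" concrete: the kernel exposes an HDR_OUTPUT_METADATA connector property that takes a blob of static metadata (EOTF, mastering-display primaries and white point, luminance range, MaxCLL/MaxFALL). A sketch of packing such a blob follows; the field order is based on my reading of the hdr_output_metadata / hdr_metadata_infoframe uAPI structs, so treat the exact layout as an assumption to verify against your kernel headers.

```python
# Sketch: pack a static HDR metadata blob of the shape the DRM
# HDR_OUTPUT_METADATA connector property expects (metadata type 1).
# Field order follows my reading of the kernel uAPI structs; verify
# against <drm/drm_mode.h> / your kernel before relying on it.
import struct

EOTF_SMPTE_ST2084 = 2  # PQ, per the CTA-861-G EOTF enumeration

def pack_hdr_metadata(primaries, white_point, max_lum, min_lum, max_cll, max_fall):
    """primaries: three (x, y) pairs; chromaticities in units of 0.00002."""
    blob = struct.pack("<I", 0)                       # metadata_type: type 1
    blob += struct.pack("<BB", EOTF_SMPTE_ST2084, 0)  # eotf, metadata_type
    for x, y in primaries:                            # display primaries
        blob += struct.pack("<HH", x, y)
    blob += struct.pack("<HH", *white_point)          # white point
    blob += struct.pack("<HHHH", max_lum, min_lum, max_cll, max_fall)
    return blob

# Rec.2020 primaries and D65 white point, for a 1000-nit mastering display:
blob = pack_hdr_metadata(
    primaries=[(35400, 14600), (8500, 39850), (6550, 2300)],
    white_point=(15635, 16450),
    max_lum=1000, min_lum=1, max_cll=1000, max_fall=400,
)
```

    A compositor (or the kernel-side path oiaohm describes) would hand such a blob to the display via a property blob and set it on the connector; on lease return or app exit, clearing the property reverts the sink to SDR.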

    Wayland is designed around things like this; the issue is that it doesn't exist yet. Why? I'm sure it has to do with it being difficult; in fact you can see plenty of progress being made by tracking the Weston issue. But I can't help but feel it also has to do with how slow Wayland development is for things that matter to the average desktop user. We still don't have a fractional scaling protocol despite it being a massive pain point and arguably simpler to implement, and it was only recently that we got a proposal.

    Wayland is about doing things correctly, for sure, but it is incredibly slow-moving. The Wayland tablet protocol has been in unstable for ages, despite pretty much every compositor implementing it because it is something quite important. Don't get me wrong, I know the devs are working hard on it, but I also know that many of the devs are probably also working hard on other, probably more important things. I don't think enough hours are being put into HDR myself, but my development skills are a joke, so I certainly can't help.

    But for people who are trying to use Linux for working with HDR, or for content consumption, none of that matters. It doesn't matter that HDR is a buggy shitshow on Windows and Mac; what matters is having something usable (even if barely). And at this rate I doubt we will get something useful for a long time.

    Personally I resent using pretty much anything but Linux when I can help it, so I am fine with running Kodi and mpv on another TTY. (A dual-seat setup could be ideal for this, but dual seat is a hassle unless you use VMs, which is its own can of worms IMO; does anyone know of an "easy" solution for this aside from VMs?) But I certainly won't be telling anyone to migrate their living-room PCs to Linux anytime soon.

    Leave a comment:


  • oiaohm
    replied
    Originally posted by Quackdoc View Post
    Direct scanout is not DRM leasing. Here is a brief write-up from Zamunda (sorry if I spelt it wrong) from the Gaming on Wayland blog post.

    The goal would be, if possible, to use this to allow the app to do its own color management, since many apps already do to some degree, and to use direct scanout even if fullscreen; all the compositor would need to do is send HDR metadata.
    With direct scanout the compositor needs to be HDR-aware and know to switch it back. DRM leasing could be handled around the kernel doing a TTY switch, so the compositor does not need to know what the HDR state is; instead it becomes the kernel that manages the HDR metadata, by extending the DRM layers in the kernel to handle it.

    Yes, of course direct scanout could be based on what you would need for DRM leasing on a single monitor.

    Yes, with direct scanout there are still questions of how, and a need for the different GPU vendors to agree.

    Leave a comment:
