KDE Plasma 5.19 Rolls Out In Beta Form With Many Improvements, Better Wayland Support


  • #51
    Originally posted by zexelon View Post

    Still not sure why Wayland is important... other than stripping out the client/server aspect of Xorg, it just seems like a rewrite for the sake of a rewrite... but in the interest of full disclosure, I have only looked at it from the outside, mostly because KDE and Nvidia were not supported on it.
    Do you know why companies demolish condemned buildings instead of trying to restore them? The analogy is pretty much clear.

    Comment


    • #52
      Originally posted by mdedetrich View Post

      This is crap, and there are technical reasons why...
      Everybody knows you must support your own crap, or open it up (from the bottom up), if you want to dance out of step with the open source community. This is beyond good or bad; it is the way a community can work. It is not a world of infinitely many developers who don't need to eat and like to work for free. So whether it was standardized earlier or later, or whether it is technically better, is too sentimental a question in a world about survival.
      Last edited by RomuloP; 16 May 2020, 03:57 AM.

      Comment


      • #53
        Originally posted by ssokolow View Post

        Please read the posts which came before yours, which cover how GBM shows up in Mesa years before either it or EGLStreams show up in Khronos documents and how using EGLStreams for Wayland wasn't proposed until a year after it was submitted to Khronos.
        Of course GBM existed before Wayland; I never said it didn't. I am responding to claims that NVidia created EGLStreams after Wayland was designed because they didn't want to use GBM.

        Both GBM and EGLStreams existed before Wayland; that's the point. EGLStreams was always an open Khronos standard, whereas GBM is a Linux-oriented technology that they cooked up. Of course Linux developers picked GBM: it was Linux-based and they were familiar with it. The problem with GBM arises with drivers that are designed to be cross-platform (which NVidia's blob is). Basically, if you want to use GBM with any reasonable level of performance, you have to implement your driver in a Linux-specific way.

        Originally posted by ssokolow View Post
        Again, it came after there was already a lot of GBM buy-in from Mesa.
        It didn't; read the reference. NVidia mentioned this but was ignored: everyone wanted to force NVidia to use GBM, even though it's technically impossible for them to do so without a massive performance hit.

        Originally posted by ssokolow View Post
        If Wayland were "antagonistic" toward GBM, we wouldn't be in this situation. I'm pretty sure the word you meant is "agnostic" (has no stance on).
        No, I really did mean technologically agnostic. Do you know what Wayland is? (Hint: it's just a display protocol.) How you display the buffers that you receive from the protocol is completely up to you; you can do it with GBM or EGLStreams or any other method. You can read about how Wayland works at a high level on the Wikipedia page https://en.wikipedia.org/wiki/Waylan...rver_protocol)

        FreeBSD, for example, was exploring Wayland, and there is no such thing as GBM in FreeBSD.

        Originally posted by ssokolow View Post
        You're downplaying KDE's experience, which has basically been "nVidia threw some code over the wall, expects us to maintain it going forward, and we keep tripping over bugs that are hard to fix without being able to look at the driver source to see why it's behaving the way it is."
        GBM and its associated display drivers also have bugs, and no one complains about them; people just fix them and move along. The reason people are complaining here is not rational: it's because it's NVidia.


        Originally posted by RomuloP View Post

        Everybody knows you must support your own crap, or open it up (from the bottom up), if you want to dance out of step with the open source community. This is beyond good or bad; it is the way a community can work. It is not a world of infinitely many developers who don't need to eat and like to work for free. So whether it was standardized earlier or later, or whether it is technically better, is too sentimental a question in a world about survival.
        You do realize I did add a reference?
        Last edited by mdedetrich; 16 May 2020, 10:37 AM.

        Comment


        • #54
          Originally posted by mdedetrich View Post
          Of course GBM existed before Wayland; I never said it didn't. I am responding to claims that NVidia created EGLStreams after Wayland was designed because they didn't want to use GBM.

          Both GBM and EGLStreams existed before Wayland; that's the point. EGLStreams was always an open Khronos standard, whereas GBM is a Linux-oriented technology that they cooked up. Of course Linux developers picked GBM: it was Linux-based and they were familiar with it. The problem with GBM arises with drivers that are designed to be cross-platform (which NVidia's blob is). Basically, if you want to use GBM with any reasonable level of performance, you have to implement your driver in a Linux-specific way.

          It didn't; read the reference. NVidia mentioned this but was ignored: everyone wanted to force NVidia to use GBM, even though it's technically impossible for them to do so without a massive performance hit.
          I don't feel like becoming the one responsible for digging up citations to either confirm or refute that, so I'm just going to move on.

          Originally posted by mdedetrich View Post
          No, I really did mean technologically agnostic. Do you know what Wayland is? (Hint: it's just a display protocol.) How you display the buffers that you receive from the protocol is completely up to you; you can do it with GBM or EGLStreams or any other method. You can read about how Wayland works at a high level on the Wikipedia page https://en.wikipedia.org/wiki/Waylan...rver_protocol)
          You wrote "antagonistic" the first time and I corrected you. What you just said boils down to: "You're wrong. Now I'm going to argue for a fleshed-out version of the point you argued for."

          Originally posted by mdedetrich View Post
          GBM and its associated display drivers also have bugs, and no one complains about them; people just fix them and move along. The reason people are complaining here is not rational: it's because it's NVidia.
          Aside from moments like this...

          But also adjustments for one driver are problematic. The latest NVIDIA driver caused a regression in KWin. On Quadro hardware (other hardware seems to be not affected) our shader self test fails which results in compositing disabled. If one removes the shader self test everything works fine, though. I assume that there is a bug in KWin’s rendering of the self test which is triggered only with this driver. But as I don’t have such hardware I cannot verify. Yes, I did pass multiple patches for investigating and trying to fix it to a colleague with such hardware. No, please don’t donate me hardware.

          In the end, after spending more than half a day on it, we had to do the worst option which is to add a driver and hardware specific check to disable the self test and ship it with the 5.7.5 release. It’s super problematic for the code maintainability to add such checks. We are hiding a bug and we cannot investigate it. We are now stuck with an implementation where we will never be able to say “we can remove that again”. Driver specific workarounds tend to stick around.
          -- https://blog.martin-graesslin.com/bl...stream-or-not/
          ...I remember that KDE had some specific technical complaints about EGLStreams. Something about the API falling short on the guarantees it makes in some way.

          I'll try to dig it up, but it wouldn't surprise me if it traces back to the same design decision in nVidia's driver that gave X11 "Performance or no tearing. Pick one."

          That said, do read that blog post I quoted. It goes into detail on why nVidia had to throw code over the wall to get KDE to support EGLStreams. (TL;DR: It required a lot of internal re-architecting busywork to support choosing between GBM and EGLStreams at runtime.)

          EDIT: Also, remember that the Mesa people were receptive to nVidia's idea to produce a successor to GBM and EGLStreams which both parties would use, similar to how D-Bus replaced KDE's use of DCOP and GNOME's use of CORBA... but nothing has yet materialized.
          Last edited by ssokolow; 16 May 2020, 04:06 PM.

          Comment


          • #55
            Originally posted by ssokolow View Post
            I'll try to dig it up, but it wouldn't surprise me if it traces back to the same design decision in nVidia's driver that gave X11 "Performance or no tearing. Pick one."
            Not what I remember reading, but a good overview of some of the problems with EGLStreams from KDE's perspective. I'll keep looking for what I read.

            11:59 daniels: the tl;dr is that EGLStreams really is a very fully-encapsulated stream, and this breaks some parts of the protocol (subsurfaces in particular) which rely on frame-by-frame visibility
            11:59 romangg: pq: Ok, can I read a summary somewhere. We are currently discussing the EGLStreams patches to KWin and I'm not so happy about some details, in particular that stuff is hidden. I fear we can't do future improvements to our DRM backend afterwards in regards to multi gpu, framerate sync and so on.
            11:59 daniels: fully tying everything into EGL also seems like a bad idea when we have Vulkan on the horizon, not to mention everything else, like how do you handle interop with PipeWire ... ?
            11:59 romangg: Also I would like at one point overlay scanout.
            11:59 daniels: hah
            12:00 daniels: the answer in that thread is that it would require several as-yet-unwritten extensions
            12:00 daniels: a couple to implement dynamic consumer switching (between GL texture and KMS overlay), since right now the producer -> consumer relationship is fixed for the lifetime of the stream
            12:01 daniels: a couple more to support atomic
            12:02 romangg: Erik said the Nvidia driver supports atomic mode setting internally. How we in KWin can then interact with it I don't know yet.
            12:02 pq: romangg, I'd say your fears of a technical dead-end are well founded, IMO.
            12:03 daniels: well, you'd need new extensions to allow EGL to build up a configuration, then apply it all at once - not sure what those would even look like tbh. plus the interop thing was a big one for me: how does that work with multiple GPUs? or media codecs which give you dmabuf? or PipeWire or any other kind of external streaming?
            12:04 ascent12: Is the plan to implement every single other API in EGL?
            12:04 daniels: essentially all the infrastructure we've built for years has been based around surfacing as much knowledge and visibility into the pipeline as possible, whereas Streams fully encapsulates that into a closed abstraction, which either perfectly supports your usecase or does not support your usecase at all, nothing in between
            12:05 ascent12: (that was supposed to be facetious, it may not have come across in text form)
            12:05 daniels: heh
            12:05 romangg: daniels: Yea, when looking at the code I felt the same. While libdrm gives me ample control, EGLStreams interfaces hide central parts of the pipeline.
            12:06 romangg: Since GNOME has support for EGLStreams afaik how do they cope with these limitations?
            12:07 daniels: gnome doesn't do overlays since it's too hard to pull them out from their scene graph atm
            12:07 emersion: do they use atomic?
            12:07 daniels: no
            12:07 daniels: (not yet)
            12:07 emersion: they do direct scan-out though, right?
            12:07 daniels: i'm not sure how it works with pipewire - either there's some kind of vdpau interop, or perhaps more likely, they just ReadPixels and then that will be fast enough
            -- https://dri.freedesktop.org/~cbrill/...9-02-21#t-1155
            (Found in a list of links at https://news.ycombinator.com/item?id=19213950)

            EDIT: More examples from the Wayland mailing list:

            So I guess the top level issue with eglstreams+kms that at least I see is that if we really want to do this, we would need to terminate the eglstream in the kernel. Since with kms really only the kernel knows why exactly a buffer isn't the right one, and what the producer should change to get to a more optimal setup. But the problem is that KMS is ABI and vendor-neutral, which means all that fancy metadata that you want to attach would need to be standardized in some way. And we'd need to have in-kernel eglstreams. So you'd face both the problem of getting a new primitive into upstream (dma-buf took massive efforts, same for fences going on now). And you'd lose the benefit of eglstreams being able to encapsulate vendor metadata. And we need to figure out how to standardize this a bit better even without eglstreams, so that's why I don't really understand why eglstreams has benefits. It's clearly a nice concept if your in a world of one-vendor-only, but that's not what KMS is aiming for really.
            -- https://lists.freedesktop.org/archiv...ch/027598.html
            > The nice thing about EGLStreams here is that if the consumer (the Wayland
            > compositor) wants to use the content in a different way, the producer
            > must be notified first, in order to produce something suitable for the
            > new consumer.
            that's the problem... the compositor (consumer) makes this decision LATER, not BEFORE. things have to work, efficiently or not, regardless of the compositor (consumer) decisions. adapting to become more efficient is far more than a stream of 1 surface and a stream of buffers.
            -- https://lists.freedesktop.org/archiv...ch/027647.html
            Last edited by ssokolow; 16 May 2020, 07:58 PM.

            Comment


            • #56
              Originally posted by mdedetrich
              Of course Linux developers picked GBM: it was Linux-based and they were familiar with it
              You mean because it fits well with their architecture?

              Comment


              • #57
                Originally posted by ssokolow View Post

                I don't feel like becoming the one responsible for digging up citations to either confirm or refute that, so I'm just going to move on.
                Originally posted by ssokolow View Post
                You wrote "antagonistic" the first time and I corrected you. What you just said boils down to: "You're wrong. Now I'm going to argue for a fleshed-out version of the point you argued for."
                My bad, I meant "agnostic", not "antagonistic". I think it might have been the result of a bad autocorrect that I missed.


                Originally posted by ssokolow View Post
                Aside from moments like this...



                ...I remember that KDE had some specific technical complaints about EGLStreams. Something about the API falling short on the guarantees it makes in some way.

                I'll try to dig it up, but it wouldn't surprise me if it traces back to the same design decision in nVidia's driver that gave X11 "Performance or no tearing. Pick one."

                That said, do read that blog post I quoted. It goes into detail on why nVidia had to throw code over the wall to get KDE to support EGLStreams. (TL;DR: It required a lot of internal re-architecting busywork to support choosing between GBM and EGLStreams at runtime.)

                EDIT: Also, remember that the Mesa people were receptive to nVidia's idea to produce a successor to GBM and EGLStreams which both parties would use, similar to how D-Bus replaced KDE's use of DCOP and GNOME's use of CORBA... but nothing has yet materialized.
                Of course it takes effort for KDE to implement EGLStreams: like most of the Linux ecosystem, they built their systems to take only Linux into account (i.e. they coded assuming everything is based around GBM).

                I think the important thing here is to see the forest for the trees. There are two attitudes here: one is to take into account only how Linux views things; the other is to take into account how things are done in general (i.e. cross-platform standards). NVidia deliberately designed their proprietary driver to be cross-platform, for obvious reasons: they hold themselves to a high performance standard on every platform they support. This means their driver had to cater to the lowest common denominator interface available on all OSes (Mac/Intel/Linux/Windows/FreeBSD), and because of this NVidia's blob prefers generic graphics driver interfaces that don't make many assumptions about the underlying architecture. (Speaking purely technically: EGLStreams is purposely designed to be generic, which relates to the same problems the KDE people describe in what you quoted earlier; GBM is not as generic and makes stricter assumptions about driver architecture, assumptions NVidia's blob doesn't fit.) People in the Linux ecosystem are aware of this; they just made a conscious choice to ignore it and assumed their way is the only/best way (intentionally or not).

                The result of this is simple: NVidia was going to lose no matter what when it came to Wayland, because the Linux people had already decided on GBM and were never honestly open to changing their minds. No matter what NVidia said, the Linux community would have ignored them, even when there were real technical problems with GBM, which NVidia pointed out many times. NVidia is also partly responsible for this: there is so much bad blood that most Linux driver developers just ignore NVidia, whether or not they make valid points, and NVidia helped create this situation by deliberately revealing nothing about their driver (for that, NVidia is completely responsible).

                This quote from my reference earlier states this in a different way,

                Quote from ddevault (creator of Sway)
                I think they may have had more of a point if they had been pushing EGLStreams back when these APIs were initially under discussion, before everyone else had gone the GBM route. And I question the technical merit of EGLStreams - it’s not like the Linux graphics community is full of ignorant people who are pushing for the wrong technology because… well, I don’t know what reason you think we have.
                And the response

                Well, the FOSS graphics “community” (as in loosely coupled tiny incestuous fractions that just barely tolerate each-other) pretty much represents the quintessential definition of an underdog in what is arguably the largest, most lock-in happy, control surface there is. The lack of information, hardware and human resources at every stage is sufficient explanation for the current state of affairs without claiming ignorance or incompetence. I don’t doubt the competence of nvidia driver teams in this regard either and they hardly just act out of spite or malice - their higher level managers otoh - that’s a different story with about as broken a plot as the corresponding one had been at AMD or Intel in other areas where they are not getting their collective asses handed to them.
                TL;DR: because of circumstances, the Linux FOSS graphics community never understood the broad picture of how graphics drivers are done (specifically, the interfaces designed to work with drivers). They also, perhaps unintentionally, never really cared about other organizations' standardization efforts addressing cross-platform concerns (i.e. EGLStreams); instead, Linux preferred to do things "its own way".

                Honestly, have a read of https://lobste.rs/s/cfatri/nvidia_sucks_i_m_sick_it; it is really enlightening.
                Last edited by mdedetrich; 18 May 2020, 02:58 PM.

                Comment
