NVIDIA Working On An EGLStreams Back-End For KDE On Wayland


  • #41
    Originally posted by FireBurn View Post

    Again what has he broken?
    It's a fact of life that regressions happen; go over to the KDE Bugzilla and look for unintentional regressions in KWin. I won't do the legwork for you, as I assume you are a grown-up and are more than capable of finding examples yourself.



    • #42
      Originally posted by kaprikawn View Post
      Nobody cares
      about your opinion.



      • #43
        Originally posted by shmerl View Post

        A Vulkan compositor won't need EGLStreams or GBM. But someone needs to implement it first: https://cgit.kde.org/kwin.git/log/?h=fredrik/vulkan
        The Vulkan approach would offer many benefits to all. I think Nvidia should focus on that as well; it would be something where they would be taken seriously.



        • #44
          Originally posted by R41N3R View Post

          The Vulkan approach would offer many benefits to all. I think Nvidia should focus on that as well; it would be something where they would be taken seriously.
          At the same time, KDE developers don't seem very eager to work on it, or see potential in such work.



          • #45
            The comments by KDE developers about the lack of news on the proposed allocator library are on point. Is NVidia still working on it, so that both GBM and EGLStreams can be abandoned in the foreseeable future, or will both code paths need to be maintained indefinitely?

            Originally posted by Slartifartblast View Post
            Originally posted by FireBurn View Post
            Any evidence of this? Aside from Martin being stubborn (and he's no longer the lead maintainer), he has, as far as I'm aware, never deliberately broken things.
            Or to put it another way, never ascribe maliciousness to that which can be explained by incompetence.
            I think neither malice nor incompetence needs to be involved; it can actually be a smart decision to keep interfaces unstable, because this allows you to avoid accumulating technical debt in your codebase.

            This is also the stance that the Linux kernel takes: some interfaces are stable and some are not.

            Originally posted by Britoid View Post
            There was no "standard", which is why there's now a mess. Each EGL implementation made its own: Mesa created GBM and nVidia EGLStreams.
            Well, in the beginning there was GBM and nothing else. So I suppose you could call it a de facto standard, especially as other graphics vendors like ARM (Mali) and Qualcomm (Adreno) started making their proprietary drivers support GBM too.

            Then NVidia came up with EGLStreams.
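
            To make the contrast concrete, here is a rough pseudocode sketch of the two buffer paths. The entry-point names follow the real libgbm and EGL extension APIs, but error handling, rendering, and modesetting details are omitted, so treat this as illustrative only:

            ```
            /* GBM path (Mesa drivers): the compositor allocates and owns the buffers. */
            fd   = open("/dev/dri/card0", O_RDWR);
            gbm  = gbm_create_device(fd);
            surf = gbm_surface_create(gbm, width, height, GBM_FORMAT_XRGB8888,
                                      GBM_BO_USE_SCANOUT | GBM_BO_USE_RENDERING);
            dpy  = eglGetPlatformDisplay(EGL_PLATFORM_GBM_KHR, gbm, NULL);
            /* ... create an EGL window surface on surf, render, eglSwapBuffers ... */
            bo = gbm_surface_lock_front_buffer(surf);  /* compositor holds the buffer */
            /* hand bo to KMS via drmModeAddFB2 + drmModePageFlip */

            /* EGLStreams path (NVIDIA): the driver owns the buffers. */
            dpy    = eglGetPlatformDisplay(EGL_PLATFORM_DEVICE_EXT, egl_device, NULL);
            stream = eglCreateStreamKHR(dpy, stream_attribs);
            eglGetOutputLayersEXT(dpy, layer_attribs, &layer, 1, &count); /* a KMS plane */
            eglStreamConsumerOutputEXT(dpy, stream, layer);  /* driver consumes frames */
            surf = eglCreateStreamProducerSurfaceKHR(dpy, config, stream, surf_attribs);
            /* ... render to surf, eglSwapBuffers; frames flow through the stream,
               and the compositor never handles the individual buffers ... */
            ```

            The practical difference is buffer ownership: with GBM the compositor decides when and where each buffer is scanned out, while a stream is an opaque producer-to-consumer pipe managed by the driver.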

            Originally posted by Britoid View Post
            For x reasons nVidia apparently can't use GBM, which is why they suggested making a new universal method that supports all use cases.
            If you look at their first two XDC presentations on the topic they state that they could have used GBM but it would not fit very well with their current driver design.

            Originally posted by Slartifartblast View Post
            It's a fact of life that regressions happen; go over to the KDE Bugzilla and look for unintentional regressions in KWin. I won't do the legwork for you, as I assume you are a grown-up and are more than capable of finding examples yourself.
            What's that, a new "it's not my job to educate you"?

            Yes, regressions in general do happen. But the claim was very specific: namely, that the NVidia EGLStreams code breaks in a way that requires updates every release (animosity of KDE developers towards NVidia was later suggested as one of the reasons). Substantiation was requested from those who make this claim; none has been offered so far.



            • #46
              Originally posted by duby229 View Post

              What are you saying? Nvidia is writing this code because -they- caused this problem, and it -does not- fix it, it only makes the problem more complicated.
              I am saying that they are more cooperative than before and this is good to see. Granted, it would be nicer to see proper GBM support but this is still much better than the big nothing they provided before.

              If they become a good enough citizen in the OSS world, I may buy from them again. Currently, I only buy AMD graphics cards, since they are the vendor that best supports its hardware with OSS drivers and tooling on Linux. They were also a driving force behind Vulkan, so they deserve our purchase.

              I think I will need an AMD level of OSS contributions from Nvidia before I consider them again.



              • #47
                Originally posted by LinAGKar View Post

                I fail to see the benefit of high latency, severe CPU overhead, and poor OpenGL/Vulkan support.
                I think we have yet to see Wayland actually perform rendering faster than X11.



                • #48
                  Originally posted by bearoso View Post
                  EGLStreams doesn’t maintain buffer contents after composition, which is kind of weird, and I highly suspect it’s a workaround for a serious deficiency in nvidia hardware. Because of this, I don’t think that nvidia can implement GBM or any proposed universal buffer format without taking a more constrained approach than GBM.
                  I would say it might be down to nVidia sharing a large portion of driver code between their Windows and Linux drivers.

                  Originally posted by reavertm View Post

                  I think we have yet to see Wayland actually perform rendering faster than X11.
                  Resizing windows on GNOME is much smoother under Wayland than under X11.



                  • #49
                    Originally posted by reavertm View Post

                    I think we have yet to see Wayland actually perform rendering faster than X11.
                    I'm talking about direct vs indirect rendering.



                    • #50
                      Originally posted by chithanh View Post
                      I think neither malice nor incompetence needs to be involved; it can actually be a smart decision to keep interfaces unstable, because this allows you to avoid accumulating technical debt in your codebase.
                      Yeah, very smart, that's why Linux is still such a niche OS.

