NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop


  • #21
    Originally posted by bug77 View Post

    I don't think it works like that. I have a script that exports the variable in ~/.config/plasma-workspace/env.
    Kwin may not pick it up in time if you export it like a regular user variable.
    Yeah, I noticed after posting that sudo didn't pick up some variables that were set in ~/.profile and sourced by .bashrc, and I ended up hacking them into /etc/environment. It would probably make sense to put variables like these into an early-sourced file like /etc/environment as well. I assume something that affects both a graphics driver and a window manager should be set pretty early.
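    In case it helps anyone, here's a minimal sketch of the early-export approach. The file name nvidia.sh and the variable/value shown are just examples picked from this thread, not something mandated anywhere:

    Code:
    # ~/.config/plasma-workspace/env/nvidia.sh
    # Plasma sources every *.sh file in this directory before the session
    # starts, so KWin should see the variable early enough.
    export __GL_MaxFramesAllowed=1

    The system-wide alternative is /etc/environment, which takes plain KEY=value pairs (no "export" keyword there):

    Code:
    # /etc/environment
    __GL_MaxFramesAllowed=1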



    • #22
      Originally posted by sarmad View Post
      If you are using KDE with nVidia then the best approach is to simply use triple buffering. That's what KDE developers recommend.
      It incurs extra input lag, so it's not a good solution.



      • #23
        RealNC - It's a small amount of lag; it will be unnoticeable during average use (browsing, document editing, etc.). It's what Android does by default, and it works great there without much noticeable input lag.



        • #24
          I don’t buy this. __GL_MaxFramesAllowed has never been there before as an option; it’s brand new. And the KWIN_TRIPLE_BUFFER option does not cause kwin to use triple buffering. No, enabling it tells kwin “the nvidia OpenGL implementation is mandating triple buffering, so work around it.” Up until this point, the only methods nvidia made available to stop the driver from queueing extra frames (which increases input lag significantly) had the side effect of a busy wait and 100% CPU.

          This mandatory triple buffering is a bad driver optimization trick to increase FPS numbers at the cost of latency. The nvidia drivers on Windows do it, too. If __GL_MaxFramesAllowed works like it’s supposed to, 1 should be the default.

          I’ve used only nvidia cards for 2 decades, and I always thought something strange was up with this. It was only after buying a cheap AMD card to try the new open source drivers that I finally confirmed that NVIDIA cards had more input lag. I suspect this is a side effect of the way the hardware operates, and that deficiency is also the reason they can only do a Wayland implementation with non-coherent buffers.
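          To make the distinction concrete, this is roughly how the two knobs differ when testing by hand; the values are illustrative, and whether 1 is the right default is exactly what's being argued here:

          Code:
          # Old workaround: tell KWin the driver is triple buffering so KWin
          # adapts its frame timing; the driver's behaviour itself doesn't change.
          KWIN_TRIPLE_BUFFER=1 kwin_x11 --replace &

          # New option: cap how many frames the nvidia driver may queue ahead.
          # 1 means no extra queued frames, i.e. lower latency.
          __GL_MaxFramesAllowed=1 kwin_x11 --replace &

          Launching kwin_x11 by hand like this is only good for a quick test; for normal use you'd export the variable in the session environment as discussed in #21.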



          • #25
            Afaik the frame output delay isn't higher with Nvidia on Gnome (both Xorg and Wayland). It's "just" extremely stuttery on Xorg as soon as more than one window has something going on, which is no problem at all with Mesa.



            • #26
              Originally posted by aufkrawall View Post
              Afaik the frame output delay isn't higher with Nvidia on Gnome (both Xorg and Wayland). It's "just" extremely stuttery on Xorg as soon as more than one window has something going on, which is no problem at all with Mesa.
              That's a more recent addition, if you can call it that. A change made in GNOME a year and a half ago made Mutter with nvidia pretty responsive. Unfortunately, some distros like to revert that patch, and forks like Cinnamon's Muffin still don't have the changes. And yeah, anything with OpenGL ruins compositor performance on NVIDIA. The driver can't handle more than one instance at a time or encapsulation. I still hope they change their minds and just implement GEM, despite any hardware deficiencies that would make it slightly slower.



              • #27
                Originally posted by skeevy420 View Post

                Yeah, I noticed after posting that sudo didn't pick up some variables that were set in ~/.profile and sourced by .bashrc, and I ended up hacking them into /etc/environment. It would probably make sense to put variables like these into an early-sourced file like /etc/environment as well. I assume something that affects both a graphics driver and a window manager should be set pretty early.
                Well, it would be a pretty serious bug if sudo picked up any of your user's profile variables, wouldn't it?



                • #28
                  Originally posted by bearoso View Post
                  I don’t buy this. __GL_MaxFramesAllowed has never been there before as an option; it’s brand new. And the KWIN_TRIPLE_BUFFER option does not cause kwin to use triple buffering. No, enabling it tells kwin “the nvidia OpenGL implementation is mandating triple buffering, so work around it.” Up until this point, the only methods nvidia made available to stop the driver from queueing extra frames (which increases input lag significantly) had the side effect of a busy wait and 100% CPU.

                  This mandatory triple buffering is a bad driver optimization trick to increase FPS numbers at the cost of latency. The nvidia drivers on Windows do it, too. If __GL_MaxFramesAllowed works like it’s supposed to, 1 should be the default.

                  I’ve used only nvidia cards for 2 decades, and I always thought something strange was up with this. It was only after buying a cheap AMD card to try the new open source drivers that I finally confirmed that NVIDIA cards had more input lag. I suspect this is a side effect of the way the hardware operates, and that deficiency is also the reason they can only do a Wayland implementation with non-coherent buffers.
                  Dunno, the kwin devs seem pretty on board with it (and the existing call was working to spec).



                  • #29
                    Originally posted by bearoso View Post
                    I don’t buy this. __GL_MaxFramesAllowed has never been there before as an option, it’s brand new.
                    What do we have here? https://hardforum.com/threads/how-vs...ost-1035923030
                    A nine-year-old mention of MaxFramesAllowed?



                    • #30
                      Originally posted by RealNC View Post
                      It incurs extra input lag, so it's not a good solution.
                      No, NVidia is _already_ triple buffering. That is the underlying issue. Telling KDE to use triple buffering just makes it work around the fact that the NVidia driver is triple buffering. So it doesn't change the lag.

