NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop


  • bearoso
    replied
    Originally posted by aufkrawall View Post
    Afaik the frame output delay isn't higher with Nvidia on Gnome (both Xorg and Wayland). It's "just" extremely stuttery on Xorg as soon as anything is going on in more than one window, while this is no problem at all with Mesa.
    That's a more recent addition, if you can call it that. A change made in GNOME a year and a half ago made Mutter with nvidia pretty responsive. Unfortunately, some distros like to revert that patch, and forks like Cinnamon's Muffin still don't have the changes. And yeah, anything with OpenGL ruins compositor performance on NVIDIA; the driver can't handle more than one instance at a time, or encapsulation. I still hope they change their minds and just implement GEM, despite any hardware deficiencies that would make it slightly slower.

  • aufkrawall
    replied
    Afaik the frame output delay isn't higher with Nvidia on Gnome (both Xorg and Wayland). It's "just" extremely stuttery on Xorg as soon as anything is going on in more than one window, while this is no problem at all with Mesa.

  • bearoso
    replied
    I don’t buy this. __GL_MaxFramesAllowed has never been available as an option before; it’s brand new. And the KWIN_TRIPLE_BUFFER option does not cause kwin to use triple buffering. No, enabling it tells kwin “the nvidia OpenGL implementation is mandating triple buffering, so work around it.” Up until this point, the only methods nvidia provided to keep the driver from queuing extra frames (which significantly increases input lag) had the side effect of a busy wait and 100% CPU usage.

    This mandatory triple buffering is a bad driver optimization trick to increase FPS numbers at the cost of latency. The nvidia drivers in Windows do it, too. If __GL_MaxFramesAllowed works like it’s supposed to, 1 should be the default.

    I’ve used only nvidia cards for two decades, and I always thought something strange was up with this. It was only after buying a cheap AMD card to try the new open source drivers that I finally confirmed that NVIDIA cards have more input lag. I suspect this is a side effect of the way the hardware operates, and that deficiency is also the reason they can only do a Wayland implementation with non-coherent buffers.
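    The per-application override described above can be sketched as a one-liner (a sketch, assuming the driver reads the variable at process startup; glxgears is just a stand-in for any OpenGL application):

    ```shell
    # Cap the NVIDIA driver's render-ahead queue at one frame for this
    # launch only; a lower cap trades a little throughput for lower
    # input latency. 'glxgears' is a placeholder GL application.
    __GL_MaxFramesAllowed=1 glxgears
    ```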

  • Guest
    Guest replied
    RealNC - It's a small amount of lag; it will be unnoticeable during average use (browsing, document editing, etc.). It's what Android does by default, and it works great there without much noticeable input lag.

  • RealNC
    replied
    Originally posted by sarmad View Post
    If you are using KDE with nVidia then the best approach is to simply use triple buffering. That's what KDE developers recommend.
    It incurs extra input lag, so it's not a good solution.

  • skeevy420
    replied
    Originally posted by bug77 View Post

    I don't think it works like that. I have a script that exports the variable in ~/.config/plasma-workspace/env.
    Kwin may not pick it up in time if you export it like a regular user variable.
    Yeah, after I posted that, I noticed that sudo didn't pick up some variables that were set in ~/.profile (sourced by .bashrc), and I ended up hacking them into /etc/environment. It would probably make sense to put variables like these into an early-sourced file like /etc/environment as well. I assume something that affects a graphics driver and a window manager should be set pretty early.
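    The early-sourcing approach can be sketched like this (a sketch: the script name nvidia-latency.sh is made up, and ~/.config/plasma-workspace/env/ is the directory Plasma sources before the session starts, as mentioned above):

    ```shell
    # Drop a script into Plasma's per-user env directory so the variable
    # is exported before KWin launches (unlike ~/.profile, which can be
    # sourced too late for the compositor to see it).
    mkdir -p ~/.config/plasma-workspace/env
    cat > ~/.config/plasma-workspace/env/nvidia-latency.sh <<'EOF'
    #!/bin/sh
    export __GL_MaxFramesAllowed=1
    EOF
    chmod +x ~/.config/plasma-workspace/env/nvidia-latency.sh
    ```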

  • xorbe
    replied
    Originally posted by birdie View Post
    Multiple people have already tried it and it does nothing.
    Look at the patch. It sets the env var, but it also removes the undesirable workaround code that can inflate CPU time. Did those people make both changes?

  • sarmad
    replied
    If you are using KDE with nVidia then the best approach is to simply use triple buffering. That's what KDE developers recommend.
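    The triple-buffering recommendation above boils down to one environment variable that has to be visible before KWin starts (a sketch; in practice you would put this in /etc/environment or a plasma-workspace env script rather than a login shell):

    ```shell
    # Tell KWin that the NVIDIA driver is triple buffering so it adapts
    # its frame-timing logic; this only takes effect if it is exported
    # before KWin launches.
    export KWIN_TRIPLE_BUFFER=1
    ```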

  • JeansenVaars
    replied
    I actually moved away from KDE because of the bad NVIDIA support (whoever's fault it is). There were just so many things to tweak to get it working fine, and even a black screen in the session manager when logging out. I had to move to Cinnamon/GNOME, and things work better over there.

    Someday I'll come back to KDE to try again.

  • tildearrow
    replied
    Originally posted by remenic View Post
    I have this gut feeling that this wasn't the only wrong assumption made by KWin's timing logic. Even when using Intel HD graphics, it's not able to render at a steady 60fps for long. Stutters are the norm. Not sure what to think of that, but it makes me sad that this is how the kwin maintainers deal with it.
    Hey, I sent you a private message since you requested some patches from me, but apparently you have notifications turned off.

    Edit: Enough waiting, here is the repo: https://github.com/tildearrow/kwin-lowlatency

    (this may require tinkering if you have multiple graphics cards, and is not guaranteed to work, especially on low-end systems)
    Last edited by tildearrow; 27 March 2019, 03:23 AM.
