NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop

Originally posted by aufkrawall:
As far as I know, the frame output delay isn't higher with NVIDIA on GNOME (both Xorg and Wayland). It's "just" extremely stuttery on Xorg as soon as more than one window has activity, while this is no problem at all with Mesa.
-
I don't buy this. __GL_MaxFramesAllowed has never been available as an option before; it's brand new. And the KWIN_TRIPLE_BUFFER option does not cause KWin to use triple buffering. No, enabling it tells KWin "the NVIDIA OpenGL implementation is mandating triple buffering, so work around it." Until now, the only methods NVIDIA made available to stop the driver from queuing extra frames (which significantly increases input lag) had the side effect of a busy-wait and 100% CPU usage.

This mandatory triple buffering is a bad driver optimization trick that inflates FPS numbers at the cost of latency. The NVIDIA drivers on Windows do it too. If __GL_MaxFramesAllowed works like it's supposed to, 1 should be the default.

I've used only NVIDIA cards for two decades, and I always thought something strange was going on. It was only after buying a cheap AMD card to try the new open-source drivers that I finally knew NVIDIA cards had more input lag. I suspect this is a side effect of how the hardware operates, and that deficiency is also the reason they can only do a Wayland implementation with non-coherent buffers.
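For anyone wanting to experiment with the frame-queue limit described above, the variable can be set in the environment of the process you want to affect; a minimal sketch, assuming a driver version recent enough to honour __GL_MaxFramesAllowed:

```shell
# Cap the NVIDIA driver's render-ahead queue at one frame for this
# shell and any OpenGL programs launched from it. Lower values trade
# throughput (FPS) for lower input latency.
export __GL_MaxFramesAllowed=1

# Example: relaunch the X11 compositor with the limit applied.
# kwin_x11 --replace &
```

Whether the driver respects the value for a given application still has to be verified case by case, e.g. by watching CPU usage and perceived latency.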
-
Guest replied to RealNC: It's a small amount of lag; it will be unnoticeable during average use (browsing, document editing, etc.). It's what Android does by default, and it works great there without much noticeable input lag.
-
Originally posted by bug77:
I don't think it works like that. I have a script that exports the variable in ~/.config/plasma-workspace/env.
KWin may not pick the variable up in time if you export it like a regular user variable.
-
If you are using KDE with NVIDIA, the best approach is simply to use triple buffering; that's what the KDE developers recommend.
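If you go this route, the flag has to be visible to KWin before the session starts, which is what the ~/.config/plasma-workspace/env script mentioned earlier achieves: Plasma sources shell scripts in that directory during session startup. A sketch (the filename is an arbitrary choice):

```shell
# Save as ~/.config/plasma-workspace/env/kwin-triple-buffer.sh
# Plasma sources scripts in this directory early in session startup,
# so the variable is already set by the time KWin launches.
export KWIN_TRIPLE_BUFFER=1
```

Exporting the same variable from ~/.bashrc or a terminal instead is usually too late, since KWin is started by the session, not by your shell.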
-
I actually moved away from KDE because of the bad NVIDIA support (whoever's fault that is). There were just so many things to tweak to get it working properly, and I even got a black screen in the session manager when logging out. I had to switch to Cinnamon/GNOME, and things work better over there.

Someday I'll come back to KDE and try again.
-
Originally posted by remenic:
I have this gut feeling that this wasn't the only wrong assumption made by KWin's timing logic. Even when using Intel HD graphics, it can't render at a steady 60 fps for long. Stutters are the norm. Not sure what to think of that, but it makes me sad that this is how the KWin maintainers deal with it.
Edit: Enough waiting, here is the repo: https://github.com/tildearrow/kwin-lowlatency
(This may require tinkering if you have multiple graphics cards, and is not guaranteed to work, especially on low-end systems.)
Last edited by tildearrow; 27 March 2019, 03:23 AM.