NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop


  • skeevy420
    replied
    Originally posted by bug77 View Post

    Well, it would be a pretty serious bug if sudo picked up any of your user's profile variables, wouldn't it?
    Not if that is the goal you want to achieve.

    But yeah, it actually would be a pretty serious bug.
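
    (For reference, the system-wide file mentioned in this exchange, /etc/environment, is parsed by pam_env as plain KEY=value lines with no shell syntax, which is why variables placed there survive where ~/.profile ones don't. A minimal sketch using the variable from this thread; the value is only an example:)

        # /etc/environment -- plain KEY=value lines, no "export", no expansion
        __GL_MaxFramesAllowed=1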



  • skeevy420
    replied
    Originally posted by discordian View Post
    That's a good one. I completely regret buying an RX 570 (replacing a GTX 760), and now I'm regularly dual-booting to Windows for anything that uses the GPU.
    Electrical noise on the line out, the open-source drivers are way worse than Nvidia's, and the closed-source ones aren't available for newer kernels. It does boot faster; other than that, eF it.
    And I have nothing but great things to say about my RX 580. The only Linux-specific issue I have is having to try three different Vulkan implementations when I go to play games: AMDVLK, AMDVLK-Pro, and RADV. The only GPU-specific issue is that MSI set the damn voltages a little too high, so my GPU needs an undervolt to run at full performance. Seriously: stock voltages trigger thermal throttling, and that results in horrible performance.

    FWIW, outside of AMDVLK-Pro, you really aren't missing anything important from AMD's closed-source driver when it comes to playing games. A rolling-release distribution like openSUSE Tumbleweed or Manjaro is also a good idea (I'm a Manjaro user myself), since Polaris (and Vega) cards really like rolling releases.
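
    In case anyone wonders what that Vulkan-driver juggling looks like in practice: the Vulkan loader picks its driver from the VK_ICD_FILENAMES environment variable, so switching per-game is just a prefix. A sketch; the JSON paths below are typical install locations and vary by distro:

        # Mesa's RADV
        VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/radeon_icd.x86_64.json ./game
        # AMDVLK
        VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/amd_icd64.json ./game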

    If you actually do need the AMDGPU-Pro driver for something like OpenCL rather than gaming, you're better off running a distribution it directly supports.

    Electrical noise could be a sign of bad grounding, a failing power supply, etc. On my PC before last, the electrical noise started a few months before the PSU went out. Replacing it mostly fixed things, but I could tell there were still grounding issues; I ended up having to run new 110V wiring to fully fix it. I had to run new wire in my current house to fix grounding issues, too. You'd be surprised how many places still have old two-wire runs with fake grounds.

    Anyhoo, anecdotally, I've noticed Linux exacerbates these issues more than Windows does: in each case above, I only heard the grounding noise when running Linux. It wasn't until things got really bad that Windows acted up as well.



  • bug77
    replied
    Originally posted by bearoso View Post
    That's for Windows. I definitely played with every hidden option I could find in the nvidia driver. This would've been more helpful sooner; it would have saved me from writing all those wrappers to insert glFinish into things.
    Hidden option? The setting is right there in the header files. I find it highly unlikely (though not impossible) that Nvidia neglected to expose it on Linux for all these years.



  • discordian
    replied
    Originally posted by Charlie68 View Post
    I can personally say that getting rid of Nvidia a year ago was the best thing I've done on this computer. I've advised everyone I know who uses Linux against buying PCs with Nvidia cards, and they've thanked me. No more problems.
    That's a good one. I completely regret buying an RX 570 (replacing a GTX 760), and now I'm regularly dual-booting to Windows for anything that uses the GPU.
    Electrical noise on the line out, the open-source drivers are way worse than Nvidia's, and the closed-source ones aren't available for newer kernels. It does boot faster; other than that, eF it.



  • Charlie68
    replied
    I can personally say that getting rid of Nvidia a year ago was the best thing I've done on this computer. I've advised everyone I know who uses Linux against buying PCs with Nvidia cards, and they've thanked me. No more problems.
    I have nothing against Nvidia; it's a company and does what it wants. But using Nvidia on GNU/Linux is a contradiction.



  • bearoso
    replied
    Originally posted by bosjc View Post

    Dunno, the kwin devs seem pretty on board with it (and the existing call was working to spec).
    glXSwapBuffers was in spec (glFlush only guarantees command-queue completion, not buffer completion), but there's no triple buffering in the OpenGL spec. kwin was requesting double buffering from GLX and EGL, so that's what it should have gotten. Plus, when glXSwapBuffers is hit the second time, glFlush is supposed to make sure the previous swap, still sitting in the command queue, is complete before proceeding. And silently doing unremovable triple buffering while shipping a driver option called "triplebuffer" that actually makes it quad buffering is kind of odd.
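
    To make the queue-depth question concrete, here's a minimal probe (my sketch, standard GLX/X11 calls only, not kwin code). With strict double buffering and vsync at 60 Hz, almost every swap should cost about one refresh interval (~16.7 ms); if the driver queues frames ahead, you'll see runs of near-instant swaps instead. Where exactly the stall surfaces is driver-dependent, so treat it as a probe, not a proof:

        /* swap_probe.c -- build: gcc swap_probe.c -o swap_probe -lGL -lX11 */
        #include <GL/gl.h>
        #include <GL/glx.h>
        #include <X11/Xlib.h>
        #include <stdio.h>
        #include <time.h>

        static double now_ms(void)
        {
            struct timespec ts;
            clock_gettime(CLOCK_MONOTONIC, &ts);
            return ts.tv_sec * 1000.0 + ts.tv_nsec / 1.0e6;
        }

        int main(void)
        {
            Display *dpy = XOpenDisplay(NULL);
            if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

            /* Ask GLX for a plain double-buffered RGBA visual. */
            int attrs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
            XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attrs);
            if (!vi) { fprintf(stderr, "no double-buffered visual\n"); return 1; }

            Window root = RootWindow(dpy, vi->screen);
            XSetWindowAttributes swa;
            swa.colormap = XCreateColormap(dpy, root, vi->visual, AllocNone);
            swa.event_mask = ExposureMask;
            Window win = XCreateWindow(dpy, root, 0, 0, 320, 240, 0, vi->depth,
                                       InputOutput, vi->visual,
                                       CWColormap | CWEventMask, &swa);
            XMapWindow(dpy, win);

            GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
            glXMakeCurrent(dpy, win, ctx);

            for (int i = 0; i < 120; i++) {
                /* Trivial frame: alternate the clear color so something is drawn. */
                glClearColor((i & 1) ? 1.0f : 0.0f, 0.0f, (i & 1) ? 0.0f : 1.0f, 1.0f);
                glClear(GL_COLOR_BUFFER_BIT);
                double t0 = now_ms();
                glXSwapBuffers(dpy, win);
                printf("swap %3d: %6.2f ms\n", i, now_ms() - t0);
            }

            glXMakeCurrent(dpy, None, NULL);
            glXDestroyContext(dpy, ctx);
            XCloseDisplay(dpy);
            return 0;
        }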

    Originally posted by bug77 View Post

    What do we have here? https://hardforum.com/threads/how-vs...ost-1035923030
    A 9-year-old mention of MaxFramesAllowed?
    That's for Windows. I definitely played with every hidden option I could find in the nvidia driver. This would've been more helpful sooner; it would have saved me from writing all those wrappers to insert glFinish into things.
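
    For the curious, that wrapper trick looks roughly like this. This is a sketch of the generic LD_PRELOAD interposer technique, not bearoso's actual code, and apps that resolve the symbol through glXGetProcAddress will bypass it:

        /* swapfinish.c
         * build: gcc -shared -fPIC swapfinish.c -o swapfinish.so -ldl
         * use:   LD_PRELOAD=./swapfinish.so ./some_gl_app
         * Interposes glXSwapBuffers and forces a glFinish after every swap so
         * the driver cannot run several frames ahead of the display. */
        #define _GNU_SOURCE
        #include <dlfcn.h>
        #include <GL/gl.h>
        #include <GL/glx.h>

        void glXSwapBuffers(Display *dpy, GLXDrawable drawable)
        {
            static void (*real_swap)(Display *, GLXDrawable);
            if (!real_swap)
                real_swap = (void (*)(Display *, GLXDrawable))
                                dlsym(RTLD_NEXT, "glXSwapBuffers");
            real_swap(dpy, drawable);
            glFinish(); /* block until the GPU has fully finished this frame */
        }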



  • carewolf
    replied
    Originally posted by RealNC View Post
    It incurs extra input lag, so it's not a good solution.
    No, NVidia is _already_ triple buffering. That is the underlying issue. Telling KDE to assume triple buffering just makes it work around the fact that the NVidia driver is triple buffering, so it doesn't change the lag either way.
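
    For scale, a back-of-envelope number (mine, not from the thread), assuming vsync at 60 Hz: every frame the driver is allowed to queue ahead of the display adds up to one refresh interval of input lag.

        1 / 60 Hz ≈ 16.7 ms per queued frame
        2 extra queued frames ≈ up to ~33 ms on top of the baseline ~17 ms

    That is what capping the queue with __GL_MaxFramesAllowed=1 is about: it bounds the depth of that queue, wherever the buffering happens.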



  • bug77
    replied
    Originally posted by bearoso View Post
    I don't buy this. __GL_MaxFramesAllowed has never been there before as an option; it's brand new.
    What do we have here? https://hardforum.com/threads/how-vs...ost-1035923030
    A 9-year-old mention of MaxFramesAllowed?



  • bosjc
    replied
    Originally posted by bearoso View Post
    I don't buy this. __GL_MaxFramesAllowed has never been there before as an option; it's brand new. And the KWIN_TRIPLE_BUFFER option does not cause kwin to use triple buffering. No, enabling it tells kwin "the nvidia OpenGL implementation is mandating triple buffering, so work around it." Up until this point, the only methods nvidia made available to stop the driver from queuing extra frames (and significantly increasing input lag) had the side effect of a busy wait and 100% CPU usage.

    This mandatory triple buffering is a bad driver optimization trick to increase FPS numbers at the cost of latency. The nvidia drivers in Windows do it, too. If __GL_MaxFramesAllowed works like it’s supposed to, 1 should be the default.

    I've used only nvidia cards for two decades, and I always thought something strange was going on here. It was only after buying a cheap AMD card to try the new open-source drivers that I finally confirmed NVIDIA cards have more input lag. I suspect this is a side effect of the way the hardware operates, and that the same deficiency is the reason they can only do a Wayland implementation with non-coherent buffers.
    Dunno, the kwin devs seem pretty on board with it (and the existing call was working to spec).



  • bug77
    replied
    Originally posted by skeevy420 View Post

    Yeah, after I posted that I noticed sudo didn't pick up some variables that were set in ~/.profile (sourced by .bashrc), and I ended up hacking them into /etc/environment. It would probably make sense to put variables like these into an early-sourced file like environment anyway; I assume something that affects both a graphics driver and a window manager should be set pretty early.
    Well, it would be a pretty serious bug if sudo picked up any of your user's profile variables, wouldn't it?

