NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop


  • Phoronix: NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop

    For nearly six years there has been a bug report about the NVIDIA proprietary driver causing high CPU load when running the KDE desktop with double buffering enabled. This issue has finally been resolved...

    http://www.phoronix.com/scan.php?pag...E-High-CPU-Fix

  • HenryM
    replied
    piotrj3, was your CPU made before or after the Ryzen Linux segfault bug fix? (Production week 25 of 2017, I think?)
    https://www.phoronix.com/scan.php?pa...en-fixed&num=1


    BTW, I just noticed tildearrow's repo is in the AUR.
    Last edited by HenryM; 03-31-2019, 11:35 PM. Reason: clarity



  • piotrj3
    replied
    Thing is, AMD seems to have lower quality control recently, both in CPUs and GPUs. Recently I was diagnosing a problem where a Ryzen 1700X without any OC was unstable ... on 2 different motherboards, with 2 different sets of RAM and a power supply way over the top. And honestly, do I care more about setting up the NVIDIA driver right once, or about an ultra-hot GPU and its fan noise? Well, I prefer the green team there.



  • RealNC
    replied
    Originally posted by carewolf View Post
    This entire issue is that it will cache more than one frame... Read the summary again.
    I did read it. It would cache more than one frame because kwin would allow it to cache more than one frame. The patch is a quick workaround that is supposed to fix that without requiring major kwin changes.
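
    For anyone who wants to experiment with this independently of the kwin patch, the proprietary driver exposes an environment variable that caps how many frames it will queue ahead of display. A minimal sketch, assuming the driver honors `__GL_MaxFramesAllowed` for the compositor process (how the landed patch itself works is not detailed here):

    ```shell
    # Assumption: the NVIDIA driver's GL frame-queue limit variable.
    # Setting it to 1 before the compositor starts approximates the fix
    # by preventing the driver from caching more than one frame.
    export __GL_MaxFramesAllowed=1
    ```
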



  • carewolf
    replied
    Originally posted by RealNC View Post
    Not sure what you mean. If you set:

    Option "TripleBuffer" "True"

    in the nvidia driver, then it triple buffers. Otherwise, it doesn't.
    This entire issue is that it will cache more than one frame... Read the summary again.



  • bearoso
    replied
    Originally posted by bug77 View Post
    Hidden option? The setting is in the header files. I find it highly unlikely (though not impossible) Nvidia neglected to expose that for Linux for so many years.
    It's possible it was exposed, but it certainly wasn't documented, and it still isn't. I refer to settings that aren't directly exposed to users as "hidden," e.g. the stuff Mesa hides behind driconf is "hidden." I knew Windows had a registry setting for it you could set, and certainly looked around for something similar.



  • RealNC
    replied
    Originally posted by sandy8925 View Post
    RealNC - It's a small amount of lag, it will be unnoticeable during average use (browsing, document editing etc.).
    It is noticeable when playing games and when moving windows around. And that's at 120Hz. At 60Hz, it is clearly noticeable pretty much everywhere.
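
    The lag being argued about here is easy to quantify: every frame the driver is allowed to queue ahead of scanout adds one full refresh interval of input-to-display latency. A minimal sketch of the arithmetic (the function name is mine, not from the thread):

    ```python
    def queue_latency_ms(refresh_hz: float, queued_frames: int) -> float:
        """Extra input-to-display latency added by frames buffered ahead of scanout."""
        return queued_frames * 1000.0 / refresh_hz

    # One extra queued frame adds ~8.3 ms at 120 Hz, but ~16.7 ms at 60 Hz,
    # which is why the lag is far more noticeable on a 60 Hz display.
    ```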



  • RealNC
    replied
    Originally posted by carewolf View Post
    No, NVIDIA is _already_ triple buffering.
    Not sure what you mean. If you set:

    Option "TripleBuffer" "True"

    in the nvidia driver, then it triple buffers. Otherwise, it doesn't.
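
    For context, that option belongs in the Device section of xorg.conf; a minimal sketch (the identifier is illustrative):

    ```
    Section "Device"
        Identifier "NVIDIA Card"
        Driver     "nvidia"
        # Opt in to triple buffering; it is off by default in the proprietary driver.
        Option     "TripleBuffer" "True"
    EndSection
    ```
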



  • skeevy420
    replied
    Originally posted by discordian View Post
    You are lucky, mine crashes in games unless I limit the threshold (heck it sometimes crashes still). You are aware that AMD have bad efficiency because a portion of their chips need that high voltage?
    The only actual crashes I've had were related to undervolting too much. Yep, I'm aware, which is why the first thing I did was undervolt to fix thermal issues. Guess I got lucky in the silicon lottery... but my standard UV is to lower states 3 & 4 to 1100 mV and 5-7 to 1125 mV; everything else stays the same, nothing major or drastic at all. Give that a try... it's basically the first step you'd take if you followed a random guide (also, use WattmanGTK launched from the terminal and read its output). I could probably push the UV further, but it's good enough to fix all of my thermal issues, so that's where I'm going to stop.

    Originally posted by discordian View Post
    Well, with the OS driver I am not playing games...
    Without knowing specifics, best I can say is update your kernel and graphics stack. My 580 runs damn well on 5.0.2 with Mesa 18.3.4 using AMDVLK by default and RADV as needed on a per-game basis. KDE has never been better than it is for me now.

    Originally posted by discordian View Post
    Could be, but it is not for me; it's a result of poorly matched voltage regulators (unlikely, as MSI is not known for that; the GeForce was an MSI as well) or a chip that does a horrible job of limiting EMI.
    My last card was an MSI R7 260x, and it's also the one that had the noises from both bad wiring and a bad power supply. It lasted 6 years and three different PCs... the power supply lasted until I retired that PC two years ago and got the one I have now. The 260x ran very hot... Linux undervolting came too late to save it...

    Originally posted by discordian View Post
    Fine, but not in my flat.
    You'd be surprised. 15 years ago, at the place I lived, I'd get PC electrical noise if the wind blew too hard. After 4 years and an ice storm, the power lines running to that house were fixed, and that fixed my PC electrical noise. All I'm saying is don't rule anything out. That PC had an MSI GeForce 8400 (eventually SLI... it sucked). A few months after upgrading it to the R7 260x is when I discovered the power supply and then the wiring issues at my next house.

    Originally posted by discordian View Post
    For me the noise is a matter of RX570-or-not, while the crashes are RX570+Linux-or-not.
    To me it sounds like your power supply is starting to go... possibly combined with an LTS distribution whose older graphics stack exacerbates issues on the Linux side. All my GPUs acted like that when a power supply was starting to go, and my 260x did not like LTS distributions with AMDGPU after 2016. I've never bothered with anything less than Tumbleweed or Manjaro with the 580 since I bought it almost two months ago.

    Like you said, AMD cards are power hungry. Using something like that is where you'll discover power supply problems. That's how it was for me back in the day.
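
    The undervolt described above (lowering the voltage on the upper SCLK states) can also be done without WattmanGTK through amdgpu's sysfs overdrive interface. A hedged sketch, assuming overdrive is unlocked (e.g. via `amdgpu.ppfeaturemask` on the kernel command line) and that `card0` is the Polaris card; the clock values below are illustrative, keep your card's stock clocks from the table the first command prints:

    ```shell
    # Run as root. Assumes overdrive is enabled via amdgpu.ppfeaturemask.
    GPU=/sys/class/drm/card0/device

    cat "$GPU/pp_od_clk_voltage"          # inspect the current state table first

    # "s <state> <MHz> <mV>": keep each state's stock clock, lower its voltage
    echo "s 3 1145 1100" > "$GPU/pp_od_clk_voltage"
    echo "s 4 1215 1100" > "$GPU/pp_od_clk_voltage"
    echo "s 5 1257 1125" > "$GPU/pp_od_clk_voltage"
    echo "s 6 1300 1125" > "$GPU/pp_od_clk_voltage"
    echo "s 7 1340 1125" > "$GPU/pp_od_clk_voltage"
    echo "c" > "$GPU/pp_od_clk_voltage"   # commit the new table
    ```
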



  • Charlie68
    replied
    Originally posted by discordian View Post
    That's a good one. I completely regret buying an RX570 (replacing a GTX 760), and now I am regularly dual-booting to Windows for anything that uses the GPU.
    Electrical noise on the line out, the OS drivers are way worse than NVIDIA's, and the closed-source ones aren't available for newer kernels. It does boot faster; other than that: eF it.
    I talked about my experience: I don't have the problems you're referring to, and I didn't buy an RX570. The fact that there are no proprietary drivers for AMD is the reason that prompted me to buy an AMD card; the drivers are developed by AMD with the help of the community, and this is much better than proprietary shit.

