NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop

  • #11
    I've found that, at least with X.org, compositing hurts performance across the board: all hardware, all drivers, all software. I generally disable it and don't use anything that requires it.



    • #12
      Before I respond to that, I'd just like to say I'm glad some of the Nvidia/KDE stuff is being resolved.

      Originally posted by linner View Post
      I've found that, at least with X.org, compositing hurts performance across the board: all hardware, all drivers, all software. I generally disable it and don't use anything that requires it.
      In my experience with my past two AMD GPUs over seven years of using KDE, compositing really didn't matter much for most activities other than games, and it helped with applications like Firefox and SMPlayer by preventing tearing without my having to manually create an xorg.conf to enable vsync. For the times compositing did affect things, which for me was only games, I preferred disabling it as needed using KWin rules.
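
      For reference, such a rule can also be written by hand in ~/.config/kwinrulesrc. A minimal sketch, assuming a made-up window class of "somegame"; the exact key names are from memory and may vary between Plasma versions:

        [General]
        count=1

        [1]
        # Force compositing off while this window exists
        Description=Disable compositing for a game
        wmclass=somegame
        wmclassmatch=1
        blockcompositing=true
        blockcompositingrule=2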

      These days I launch all of my games with Lutris and have Lutris disable compositing when a game launches and re-enable it when it closes; I haven't had to bother with KWin rules since I started using Lutris. I also build Wine with patches that disable compositing for full-screen applications, and I use programs like libstrangle to control vsync and limit framerates. That way I get the best of both worlds: a nice pretty desktop with awesome transparency effects, and games that run at full performance.
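
      What Lutris does there can be approximated with a small wrapper script around KWin's D-Bus interface. A rough sketch, assuming qdbus is available and the game command is passed as arguments:

        #!/bin/sh
        # Suspend KWin compositing before the game starts
        qdbus org.kde.KWin /Compositor suspend
        # Run the game (placeholder: whatever command was passed in)
        "$@"
        # Resume compositing once the game exits
        qdbus org.kde.KWin /Compositor resume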



      • #13
        Multiple people have already tried it and it does nothing.



        • #14
          Where should I put this line so it takes effect?



          • #15
            Originally posted by smartalgorithm View Post
            Where should I put this line so it takes effect?
            ~/.profile
            /etc/environment
            ~/.bashrc
            ~/.zshrc
            KDE Settings>Startup And Shutdown>Environment Variables

            EDIT: I use ~/.profile and have my shell rc source that.
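
            For example, assuming the line in question is the __GL_MaxFramesAllowed export that the article's patch deals with (substitute whatever yours actually is), the ~/.profile entry would just be:

              # ~/.profile is read at login, so the variable reaches
              # the whole session, including KWin
              export __GL_MaxFramesAllowed=1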



            • #16
              Originally posted by skeevy420 View Post

              ~/.profile
              /etc/environment
              ~/.bashrc
              ~/.zshrc
              KDE Settings>Startup And Shutdown>Environment Variables

              EDIT: I use ~/.profile and have my shell rc source that.
              I don't think it works like that. I have a script that exports the variable in ~/.config/plasma-workspace/env.
              KWin may not pick it up in time if you export it like a regular user variable.
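
              Concretely, that is a small script like the one below. The filename is arbitrary; Plasma sources every *.sh file in that directory early in session startup, before KWin launches, and the variable name is assumed to be the same __GL_MaxFramesAllowed one as above:

                # ~/.config/plasma-workspace/env/nvidia.sh
                # Sourced by the Plasma session before KWin starts,
                # so KWin inherits the variable
                export __GL_MaxFramesAllowed=1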



              • #17
                Originally posted by remenic View Post
                I have this gut feeling that this wasn't the only wrong assumption made by KWin's timing logic. Even when using Intel HD graphics, it's not able to render at a steady 60 fps for very long. Stutters are the norm. Not sure what to think of that, but it makes me sad that this is how the KWin maintainers deal with it.
                Hey, I have sent you a private message since you asked me for some patches, but apparently you have notifications turned off.

                Edit: Enough waiting, here is the repo: https://github.com/tildearrow/kwin-lowlatency

                (this may require tinkering if you have multiple graphics cards, and is not guaranteed to work, especially on low-end systems)
                Last edited by tildearrow; 27 March 2019, 03:23 AM.



                • #18
                  I actually moved away from KDE because of the bad NVIDIA support (whoever's fault that is). There were just so many things to tweak to make it work right, and even then I got a black screen in the session manager when logging out. I had to move to Cinnamon/GNOME, and things work better over there.

                  Someday I'll come back to KDE to try again.



                  • #19
                    If you are using KDE with NVIDIA, then the best approach is to simply use triple buffering. That's what the KDE developers recommend.
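
                    For reference, that usually means two pieces: enabling triple buffering in the NVIDIA driver, and then telling KWin about it, since KWin cannot detect it on its own. A sketch using the commonly cited option names (not taken from this thread):

                      # /etc/X11/xorg.conf.d/20-nvidia.conf (snippet)
                      Section "Device"
                          Identifier "NVIDIA Card"
                          Driver     "nvidia"
                          Option     "TripleBuffer" "True"
                      EndSection

                      # and in the environment KWin starts with
                      # (see the file locations discussed above):
                      export KWIN_TRIPLE_BUFFER=1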



                    • #20
                      Originally posted by birdie View Post
                      Multiple people have already tried it and it does nothing.
                      Look at the patch: it sets the environment variable, but it also removes the undesirable workaround code that can inflate CPU time. Did those people make both changes?

