NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop

  • Phoronix: NVIDIA Lands Fix To Avoid High CPU Usage When Using The KDE Desktop

    For nearly six years there has been an open bug report about the NVIDIA proprietary driver causing high CPU load when running the KDE desktop with double buffering in use. This issue has finally been resolved...


  • #2
    Is there something like __GL_MaxFramesAllowed for Mesa?
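
    A hedged aside: Mesa has no direct counterpart to __GL_MaxFramesAllowed as far as I know, but its driconf option vblank_mode controls vsync behaviour on many Mesa drivers, which is the closest knob. A minimal sketch:

    ```shell
    # Sketch, assuming a Mesa driver that honours driconf options.
    # vblank_mode: 0 = never sync to vblank, 3 = always sync.
    export vblank_mode=0
    # or per-application: vblank_mode=0 glxgears
    ```

    The option can also be set persistently in ~/.drirc rather than through the environment.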

    Comment


    • #3
      So, either set __GL_YIELD or set __GL_MaxFramesAllowed. Any insights on why the latter is better? Because from a user's or packager's point of view, the solutions look pretty much the same.

      Edit: Apparently, if you don't set __GL_YIELD to USLEEP, KWin will disable vsync altogether, so until that's fixed the new fix will probably still leave you with no vsync.
      Last edited by bug77; 26 March 2019, 06:59 AM.
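
      Assuming you are on the proprietary NVIDIA driver, the two workarounds discussed above boil down to exporting one of these variables (the names are taken from the thread; both are NVIDIA driver environment variables, not KWin settings):

      ```shell
      # Option 1: make the driver sleep instead of busy-waiting
      export __GL_YIELD=USLEEP
      # Option 2: cap how many frames the driver may queue ahead
      # (the approach the new KWin fix automates)
      export __GL_MaxFramesAllowed=1
      ```

      Setting either before KWin starts is what matters; exporting it in an already-running session won't affect the compositor.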

      Comment


      • #4
        Hip hip... alright...

        Comment


        • #5
          That still doesn't sound like a bug in KWin, it's more like NVIDIA does something custom and they're adding an environment variable for KWin, to tell NVIDIA's closed-source OpenGL driver to do something different.

          Comment


          • #6
            Originally posted by sandy8925 View Post
            That still doesn't sound like a bug in KWin, it's more like NVIDIA does something custom and they're adding an environment variable for KWin, to tell NVIDIA's closed-source OpenGL driver to do something different.
            This assumption isn't valid, as glXSwapBuffers is specified as being an implicit glFlush, not an implicit glFinish, and so it isn't required to block. When this assumption is violated, KWin's frame timing logic will break.


            The GLX documentation states: "glXSwapBuffers performs an implicit glFlush before it returns."

            Comment


            • #7
              This appears to be excellent news. Will it be included in the next Plasma release?

              Comment


              • #8
                I have this gut feeling that this wasn't the only wrong assumption made by KWin's timing logic. Even when using Intel HD graphics, it's not able to render at a steady 60 fps for long. Stutters are the norm. Not sure what to think of that, but it makes me sad that this is how the KWin maintainers deal with it.

                Comment


                • #9
                  Originally posted by remenic View Post
                  I have this gut feeling that this wasn't the only wrong assumption made by KWin's timing logic. Even when using Intel HD graphics, it's not able to render at a steady 60 fps for long. Stutters are the norm. Not sure what to think of that, but it makes me sad that this is how the KWin maintainers deal with it.
                  Full ack. It's impossible to use KWin's Xorg compositing on Mesa without it introducing stutter into various applications, even Firefox. That's really unfriendly ignorance towards the users.
                  Luckily, yshui has continued Compton development and is achieving greatness.

                  Comment


                  • #10
                    Originally posted by Azrael5 View Post
                    This appears to be excellent news. Will it be included in the next Plasma release?
                    Why would you care about that? Just set the variable yourself and be done with it today.
                    I don't remember off the top of my head where you need to set it; I can check when I get back home if you want.
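
                    For anyone wondering where: one common place on Plasma is a script under ~/.config/plasma-workspace/env/, which Plasma sources at login. A sketch (the file name nvidia-kwin.sh is made up; any *.sh name in that directory works):

                    ```shell
                    # Sketch: drop an env script where Plasma 5 sources them at login.
                    mkdir -p "$HOME/.config/plasma-workspace/env"
                    cat > "$HOME/.config/plasma-workspace/env/nvidia-kwin.sh" <<'EOF'
                    export __GL_YIELD=USLEEP
                    EOF
                    ```

                    Log out and back in for the variable to reach KWin.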

                    Comment
