Triple Buffering Likely Not Landing Until GNOME 42


  • Triple Buffering Likely Not Landing Until GNOME 42

    Phoronix: Triple Buffering Likely Not Landing Until GNOME 42

    In the works over the past year for the GNOME desktop environment is dynamic triple buffering when the GPU is running behind in rendering the desktop. In doing so, the GPU utilization should increase and the GPU clock frequencies in turn should ramp up to meet the demand - thereby ideally getting the rendering back on track if prior frames were running late. That triple buffering support has been re-based to the GNOME 40 code-base but still is unlikely to land until the next cycle with GNOME 42...
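    The idea described above can be sketched roughly as follows. This is a hypothetical, simplified model — not Mutter's actual code — where the scheduler normally allows two frames in flight (double buffering) and temporarily allows a third when the previous frame missed its deadline, so the GPU stays busy and its clocks ramp up:

    ```python
    # Hypothetical sketch of dynamic triple buffering (not Mutter's actual
    # implementation). With double buffering, the compositor waits for the
    # previous frame before starting the next; when frames run late, allowing
    # one extra in-flight frame keeps the GPU loaded so its clocks ramp up.

    FRAME_BUDGET_MS = 1000 / 60  # 60 Hz refresh


    class FrameScheduler:
        def __init__(self):
            self.max_in_flight = 2   # double buffering by default
            self.in_flight = 0

        def frame_finished(self, render_time_ms):
            self.in_flight -= 1
            if render_time_ms > FRAME_BUDGET_MS:
                # Previous frame missed its deadline: allow a third buffer.
                self.max_in_flight = 3
            else:
                # Rendering caught up: fall back to double buffering.
                self.max_in_flight = 2

        def can_start_frame(self):
            return self.in_flight < self.max_in_flight

        def start_frame(self):
            assert self.can_start_frame()
            self.in_flight += 1
    ```

    The key property is that the extra buffer is only allocated while rendering is behind, which is why the approach is called *dynamic* triple buffering rather than unconditionally adding a frame of latency.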


  • #2
    It will never land as the entire MR is bogus.

  • #3
    Originally posted by karolherbst
    It will never land as the entire MR is bogus.
    Yeah, this sounds like a horrible workaround to a much deeper problem.

  • #4
    If my GPU is lagging behind and cannot keep up with rendering GNOME, then maybe something is wrong with GNOME?

  • #5
    GNOME news is like next-gen AAA game titles... so what kind of GPU do I need to run it at 30+ fps?

  • #6
    Originally posted by 144Hz
    Triple buffering to circumvent driver problems? No thanks. Upstream wouldn’t take it anyway.
    Driver problems? Maybe the problem is a DE that cannot draw itself efficiently? For example, Windows 7's fancy blurry animated effects run at a constant 30 fps on a GMA 950 with a 32-bit 1.6 GHz Celeron, while GNOME cannot manage 60 fps on a UHD 630 in Coffee Lake.

  • #7
    Originally posted by smartalgorithm
    GNOME news is like next-gen AAA game titles... so what kind of GPU do I need to run it at 30+ fps?
    Welcome to 2021.

  • #8
    They tried addressing it as a driver issue, but found the driver was not to blame. The Shell frame scheduler tries to be too smart for its own good and ends up holding back frames instead of ramping up the GPU. My UHD 620 iGPU is constantly clocked at 300 MHz because, for some reason, GNOME prefers to drop frames rather than bother it. I don't think the GNOME devs would have agreed to any changes in the scheduler, so Van Vugt went the triple-buffering route. Ubuntu will ship it even if it's not accepted upstream, and Arch will deliver it via the AUR. Everyone and their dog will be using the MR eventually, except the GNOME devs.

  • #9
    Originally posted by ciupenhauer
    They tried addressing it as a driver issue, but found the driver was not to blame. The Shell frame scheduler tries to be too smart for its own good and ends up holding back frames instead of ramping up the GPU. My UHD 620 iGPU is constantly clocked at 300 MHz because, for some reason, GNOME prefers to drop frames rather than bother it. I don't think the GNOME devs would have agreed to any changes in the scheduler, so Van Vugt went the triple-buffering route. Ubuntu will ship it even if it's not accepted upstream, and Arch will deliver it via the AUR. Everyone and their dog will be using the MR eventually, except the GNOME devs.
    It honestly wouldn't surprise me. GNOME devs have a history of digging in their heels on ideas other people don't want.

  • #10
    I find this solution pretty hackish. Apparently, the GPU is perfectly capable of rendering the desktop within the required frame timings. However, artificially increasing the workload so that the GPU realizes it has to clock up seems more like avoiding the real problem than a proper fix. I mean, why doesn't the GPU realize that it is lagging behind? It seems to me the performance profiles aren't quite right.
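    The failure mode being complained about here can be illustrated with a toy model. This is an assumed, heavily simplified utilization-based governor (not any real driver's code): short render jobs separated by idle waits keep measured utilization low, so the governor never raises the clock, even though each individual frame is late at that low clock.

    ```python
    # Hypothetical sketch of a utilization-based GPU frequency governor
    # (assumed, simplified; not real driver code). It only sees busy time
    # over a sampling window, not whether deadlines were met.

    def governor_step(busy_ms, window_ms, current_mhz,
                      min_mhz=300, max_mhz=1100):
        """Pick the next GPU clock from utilization over one window."""
        utilization = busy_ms / window_ms
        if utilization > 0.85:            # GPU looks busy: ramp up
            return min(max_mhz, current_mhz * 2)
        if utilization < 0.50:            # GPU looks idle: ramp down
            return max(min_mhz, current_mhz // 2)
        return current_mhz                # in between: hold steady

    # A compositor submitting one 6 ms job per 16.7 ms frame shows ~36%
    # utilization, so the clock stays pinned at the minimum even if 6 ms
    # is too slow to meet the compositor's internal deadline.
    ```

    Under this model, triple buffering "fixes" things by raising busy time per window until the governor's ramp-up threshold is crossed — which is exactly why it reads as a workaround rather than a fix for the governor itself.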
