Ubuntu Talks Up Its GNOME Dynamic Triple Buffering Support In 22.04/22.10


  • #31
    Originally posted by lumks View Post
    Lots and lots of people. Especially people with bad eyesight tend to use lower resolutions.
    Things change over time. In the past, the only effective way to scale for performance was in the monitor. We are not in the past any more: we have tools like gamescope and technologies like FSR that use the GPU as the scaler.

    For those with bad eyesight, it is better to either have the application scale itself up or use something like FSR running on the GPU via gamescope or similar. The problem with the scaler in the monitor is that it has to be generic. FSR and other GPU-side scalers can apply custom rules per application to get better-quality output.

    The reasons to run at a lower resolution are quickly shrinking to working around a defective monitor cable, a defective monitor mode, or possibly saving power. The reality is that most of the time the monitor's scaler will produce far worse output, by a large margin, than the GPU-based alternatives.
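As an illustration, GPU-side upscaling of the kind described above can be done with gamescope. This is a sketch: the flag names are from recent gamescope releases and may differ in older builds, and the game binary is a placeholder.

```shell
# Render the game at 1280x720 and let the GPU upscale to 1920x1080 with FSR.
# "-F fsr" selects the FSR filter in current gamescope; older builds used "-U".
# "./mygame" is a placeholder for the actual game binary.
gamescope -w 1280 -h 720 -W 1920 -H 1080 -F fsr -- ./mygame
```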

    Comment


    • #32
      Originally posted by horizonbrave View Post
      Is it possible to disable animation from gnome preferences GUI or tweak tool?
      Yes via Settings -> Accessibility
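The same switch is also exposed on the command line through gsettings; the key below is the standard GNOME one backing that toggle.

```shell
# Disable GNOME interface animations (equivalent to the Accessibility toggle)
gsettings set org.gnome.desktop.interface enable-animations false

# Re-enable them later:
gsettings set org.gnome.desktop.interface enable-animations true
```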

      Comment


      • #33
        Originally posted by lumks View Post
        Lots and lots of people. Especially people with bad eyesight tend to use lower resolutions.
        That hasn't been true since the CRT era, and even then it wasn't something that "lots and lots of people" did.

        LCD panels look like ass at anything other than native resolution, and always have. Increasing the font size provides far better results than screwing around with picture size does, and even Linux DEs have handled that competently for years now. There's no reason to opt for the vastly inferior solution unless there's some secondary factor involved.

        Comment


        • #34
          lumks
          It shouldn't increase battery usage unless the GPU can't keep up with the monitor refresh rate. If it can't, it would drop to, say, 30 FPS on a 60 Hz display, which might reduce power usage compared to rendering at 50 FPS with a triple buffer, but that's usually not what you want. If you wanted that, limiting FPS or setting the refresh rate to 30 Hz is the better option.

          karolherbst
          No, I'm not understanding this wrong. I know about the clocking issue on Intel. This patch also helps other underpowered hardware like ARM SBCs; it helps anywhere the GPU can't keep up with the monitor refresh rate. With double buffering you drop to an integer fraction of the refresh rate, like 1/2 (30 FPS), 1/3 (20 FPS), 1/4 (15 FPS), etc. With a triple buffer you can render at any rate in between, like 50 FPS or 47 FPS or whatever. That looks much better, improves responsiveness, and is less jarring when the frame rate drops.

          The reason it's bad to have triple buffering on all the time is that it adds a frame of latency when the GPU *does* keep up with the monitor refresh. It leads to a bufferbloat-like filling of the render queue: there's going to be a frame sitting in the queue waiting to be scanned out. On the other hand, this isn't the case when the GPU renders slower than the refresh rate; in that case there is no additional latency. It's lower, in fact.

          The reason I suggested putting this in Mesa directly (or possibly in an LD_PRELOAD shim) is that this is also what you'd want for games, for example: lower latency, but still able to render as fast as the GPU can, even if it's slower than the monitor, without dropping all the way to 1/2 or 1/3 of the refresh rate like with double buffering. Maybe some games already do this, I don't know.

          Also, this only applies with vsync on, obviously.
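The integer-fraction drop described above is easy to see with a small model. This is a sketch of an idealized vsync'd pipeline, not tied to any particular driver:

```python
def double_buffer_fps(refresh_hz: float, render_fps: float) -> float:
    """With vsync'd double buffering, a frame that misses a vblank waits
    for the next one, so the displayed rate snaps down to refresh/n for
    the smallest integer n that the render rate can sustain."""
    if render_fps >= refresh_hz:
        return refresh_hz
    n = 2
    while refresh_hz / n > render_fps:
        n += 1
    return refresh_hz / n


def triple_buffer_fps(refresh_hz: float, render_fps: float) -> float:
    """With a third buffer the GPU keeps rendering while one frame waits
    for scanout, so the displayed rate tracks the render rate directly."""
    return min(refresh_hz, render_fps)


for fps in (70, 50, 47, 25):
    print(fps, double_buffer_fps(60, fps), triple_buffer_fps(60, fps))
```

At 60 Hz, a GPU managing 50 FPS displays at 30 FPS with double buffering but at the full 50 FPS with a triple buffer, which is the gap the patch is closing.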

          Comment


          • #35
            Originally posted by binarybanana View Post
            The reason I suggested putting this in Mesa directly (or possibly in an LD_PRELOAD shim) is that this is also what you'd want for games, for example. Lower latency, but still able to render as fast as the GPU can, even if it's slower than the monitor, without dropping all the way to 1/2, 1/3 of the refresh like with double buffering. Maybe some games already do this, I don't know.
            Also, this only applies with vsync on, obviously.
            You can actually easily do this with DXVK by setting the number of framebuffers to 3.
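For reference, the DXVK side of this lives in its configuration file; the option below is from dxvk.conf (a value of 0 leaves it at the application's default):

```ini
# dxvk.conf — placed next to the game's executable,
# or pointed at via the DXVK_CONFIG_FILE environment variable.
# Three back buffers behave like triple buffering under vsync.
dxvk.numBackBuffers = 3
```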

            Note that this also has the side-effect of letting your GPU always work with 100% utilization, even with active Vsync.

            At least on my old nVidia 750 Ti, crammed into a cube case as an HTPC, it would regularly lead to throttling because of bad airflow.

            And when your GPU runs at 100% load all the time while gaming, you will most likely lose the boost clocks eventually during that session, so expect at least some amount of downclocking.

            And if you are really unlucky: coil whine...

            Comment


            • #36
              Linuxxx
              You can also get something similar with TearFree and vsync off on Intel/AMD(/nouveau), or ForceCompositionPipeline on Nvidia. But it's not optimal, even if you add an FPS limiter into the mix.
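For anyone who wants to try it anyway, TearFree is set in the X server's device section. This is a sketch for the amdgpu DDX; the intel driver accepts the same option:

```
# /etc/X11/xorg.conf.d/20-amdgpu.conf
Section "Device"
    Identifier "AMD Graphics"
    Driver     "amdgpu"
    Option     "TearFree" "true"
EndSection
```

On Nvidia, ForceCompositionPipeline is typically assigned through nvidia-settings as part of the CurrentMetaMode string rather than in xorg.conf.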

              Comment


              • #37
                Originally posted by binarybanana View Post
                Linuxxx
                You can also get something similar with TearFree and vsync off on Intel/AMD(/nouveau), or ForceCompositionPipeline on Nvidia. But it's not optimal, even if you add an FPS limiter into the mix.
                Yeah, I remember trying out ForceCompositionPipeline, and it always led to horrible frame-pacing issues, even with an FPS limiter, like you said.

                I figure the same is also true of the TearFree option with Mesa drivers; either way, this is not an adequate solution to the problem.

                The objectively best solution is of course a VRR screen, but if that is not available, then Valve's GameScope is the next best thing for a rather smooth experience, as evidenced on the Steam Deck.

                Comment


                • #38
                  Originally posted by Mario Junior View Post

                  Why do you think a bunch of people hates Gnome developers and Gnome itself? 😂
                  Makes me wonder: why didn't Canonical go with another DE instead of Gnome?

                  I mean, those devs have been ignoring their userbase for a very long time now.

                  Comment


                  • #39
                    The best part of Ubuntu 22.04.1 is that it makes a fantastic OS for KDE 5.24.6.

                    Comment


                    • #40
                      I am so glad I went with KDE when it first came out. 5.24.6 is as good as it gets.

                      Comment
