Proposed GNOME Patches Would Switch To Triple Buffering When The GPU Is Running Behind

  • #21
    Originally posted by caligula View Post
    How do I explain this to you - this is NOT about games; desktop != games. You don't need to simulate anything. For example, when playing a video, or even in real-time video conferencing, somewhat higher latency is acceptable, and playing back existing video files tolerates as much latency as you want. As a result, the rendering does not need to predict anything.
    You fail to see the big picture and to draw parallels. I'm done with you.
    Last edited by gens; 28 July 2020, 10:24 AM.

    • #22
      Maybe they need to talk to Microsoft or Apple developers and ask how to improve GNOME's performance.

      • #23
        Originally posted by caligula View Post
        How do I explain this to you - this is NOT about games; desktop != games. You don't need to simulate anything. For example, when playing a video, or even in real-time video conferencing, somewhat higher latency is acceptable, and playing back existing video files tolerates as much latency as you want. As a result, the rendering does not need to predict anything.
        Rendering does not tolerate an arbitrary amount of latency. In your example above, the latency would have to be compensated for to keep the video in sync with its audio, and a desktop compositor won't do that. This is about the compositor only. You are the one who seems to be assuming this software has complete control over all other parts of the system, like a game engine does. Reality is the opposite of what you said: a game engine can compensate for latency and make some of it acceptable; a desktop compositor can't.

        Adding latency in the compositor is not an acceptable solution; it's a last-resort one, and it will be detrimental to the user experience.

        (I'm not talking about the MR from the article; I'm addressing caligula's comments only.)

        • #24
          It does seem insane that a GPU, integrated or not, struggles at all with moving some rectangles around a display.

          • #25
            Originally posted by gens View Post

            Triple buffering is basically rendering all the time and showing only the (theoretically) latest drawn frame. It increases the load on the system as much as turning off vsync does. While, in the case of the GPU being slow, it will provide more fps, the result will be jittery and the latency is likely to be higher. Think about the timings; it should become clearer after a while.

            https://www.youtube.com/watch?v=seyAzw9zEoY Tech Focus - V-Sync: What Is It - And Should You Use It?

            https://twitter.com/ID_AA_Carmack/st...11153509249025
            Triple buffering in the classic sense means a 3-buffer swap queue: one buffer is being scanned out, one is complete and waiting for the next vblank to set the page-flip address, and one is being drawn to by the application. This is what Carmack is talking about. A third buffer in the queue means an extra frame before an action is reflected on the display. The jitter comes from situations where the queue oscillates between full and not full, or where a game's animations or physics run at a non-fixed time step, making the render timestamps uneven.
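
            To make that concrete, here’s a minimal C sketch of a classic FIFO swap queue (purely illustrative and my own invention; submit_frame, on_vblank, and the queue layout are made up for this example, not Mutter or driver code):

            /* Toy model of a classic triple-buffered FIFO swap queue.
             * Frames are shown strictly in submission order, so when the
             * queue is full a newly finished frame waits an extra refresh
             * before scanout: that slot is the extra frame of latency. */
            #include <stdio.h>

            #define QUEUE_DEPTH 3          /* scanout + queued + drawing */

            static int queue[QUEUE_DEPTH]; /* completed frames, oldest first */
            static int queued = 0;

            /* The renderer finished a frame and wants to present it. */
            static int submit_frame(int frame)
            {
                if (queued == QUEUE_DEPTH)
                    return 0;              /* queue full: with vsync the
                                              renderer blocks instead */
                queue[queued++] = frame;
                return 1;
            }

            /* Called once per display refresh. */
            static void on_vblank(void)
            {
                if (queued == 0) {
                    printf("repeat previous frame\n"); /* the not-full side of
                                                          the full/not-full
                                                          oscillation = jitter */
                    return;
                }
                printf("scanout frame %d\n", queue[0]); /* strictly oldest first */
                for (int i = 1; i < queued; i++)        /* shift the FIFO forward */
                    queue[i - 1] = queue[i];
                queued--;
            }

            int main(void)
            {
                int frame = 0;
                for (int refresh = 0; refresh < 5; refresh++) {
                    submit_frame(++frame); /* renderer produces one frame... */
                    on_vblank();           /* ...the display consumes one */
                }
                return 0;
            }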

            What you describe in your first sentence is called fast sync or enhanced sync in drivers. The point of that method is strictly to reduce latency, so it would obviously not have a negative effect on it. The disadvantages, as you suggest, are jitter, just as above, and wasted GPU work.
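
            That mailbox-style policy could be sketched the same way (again, the names are invented for illustration, not real driver code): the renderer never blocks, and each vblank flips only the newest completed frame, discarding stale ones:

            /* Toy model of mailbox-style presentation (fast sync /
             * enhanced sync): unthrottled rendering, newest frame wins. */
            #include <stdio.h>

            static int latest_ready = -1;  /* newest completed frame; -1 = none */

            /* The renderer finished a frame; it never blocks. */
            static void submit_frame(int frame)
            {
                if (latest_ready != -1)
                    printf("drop frame %d\n", latest_ready); /* the wasted GPU work */
                latest_ready = frame;      /* overwriting stale frames is what
                                              keeps the latency low */
            }

            /* Called once per display refresh. */
            static void on_vblank(void)
            {
                if (latest_ready == -1) {
                    printf("repeat previous frame\n");
                    return;
                }
                printf("scanout frame %d\n", latest_ready);
                latest_ready = -1;
            }

            int main(void)
            {
                int frame = 0;
                for (int refresh = 0; refresh < 3; refresh++) {
                    submit_frame(++frame); /* GPU finishes two frames per refresh */
                    submit_frame(++frame);
                    on_vblank();           /* only the newest one is shown */
                }
                return 0;
            }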

            A compositor can’t use the second method, because it encapsulates other renderers: its job is to display every frame it gets, and to display it on time. In a completely integrated compositing stack it could do this with very little extra latency, but in practice it costs a frame of extra latency.

            I don’t like the idea of another extra buffer in the compositor. It’s already bad enough that there’s extra latency. Try something else first.

            Edit: I’m not contradicting your opinion, which I agree with, just providing extra info. I disagree with caligula’s assertion that the extra latency doesn’t matter in desktop scenarios. I see it when moving windows around, and other interactions just feel non-snappy. If your games or whatever don’t or can’t bypass the compositor, they’ll get worse as well.
            Last edited by bearoso; 28 July 2020, 03:02 PM.

            • #26
              Originally posted by caligula View Post
              Triple buffering should provide better / more consistent performance / latencies overall without increasing the computational load. It just requires a bit more memory for the extra buffer.
              Actually, it adds latency.
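
              For a rough sense of scale (my own back-of-the-envelope figure, assuming a classic FIFO queue as discussed above): each extra queued buffer delays presentation by one refresh interval, so at 60 Hz

              \[ t_{\text{extra}} = \frac{1}{f_{\text{refresh}}} = \frac{1}{60\ \text{Hz}} \approx 16.7\ \text{ms} \]

              on top of whatever latency the compositor already adds.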

              • #27
                Originally posted by tildearrow

                Oh, of course, now you blame it on the GPU drivers.

                "We're perfect; any fault is somebody else's." Can you think out of the box for a moment?
                Can you?

                "GNOME is always guilty. No exception."

                • #28
                  Originally posted by tildearrow
                  Oh, of course, now you blame it on the GPU drivers.
                  "We're perfect; any fault is somebody else's." Can you think out of the box for a moment?
                  The KDE trolls strike back. In my opinion you're really as bad as 144Hz; intelfx put that right.



                  • #29
                    Originally posted by Hibbelharry View Post

                    The KDE trolls strike back. In my opinion you're really as bad as 144Hz; intelfx put that right.


                    I just deleted the post. :l

                    The open-source drivers are already well optimized, and while there is more room for optimization, GNOME has traditionally been slow and a resource hog (though this has been changing lately).

                    Edit: Oh wait you're part of the clan... never mind....

                    • #30
                      Originally posted by SkyWarrior View Post
                      ... KDE has never been finalized, and I don't believe I will ever see a proper stable release where you can upgrade without breaking your whole OS installation
                      FWIW (and after doing a full backup, of course), I've upgraded Kubuntu from 19.04 onwards and am now even using the "Groovy" development repos; it's been flawless (even though I've modified a handful of system files).
