VRR, Lower Latency Likely Coming For KDE's KWin Wayland Compositor

  • #1

    Phoronix: VRR, Lower Latency Likely Coming For KDE's KWin Wayland Compositor

    Following the recent major rewrite of KDE's KWin compositor code, there are more exciting improvements likely coming to KWin's Wayland compositor support...

    http://www.phoronix.com/scan.php?pag...n-VRR-Feb-2021

  • #2
    The direct scanout in 5.22 will be great going alongside this!



    • #3
      It'd be really cool if somebody used an NVIDIA Reflex Latency Analyzer to compare these competing compositors' latency, and also to compare them to X.org. A future Phoronix benchmark, maybe?



      • #4
        Will latency also be low with adaptive sync enabled?



        • #5
          Originally posted by shmerl View Post
          Will latency also be low with adaptive sync enabled?
          I believe adaptive sync lowers latency by reducing the number of missed frames, especially in games, since it adjusts the refresh rate dynamically. Not sure if this is a good analogy, but it's like being late for the bus: instead of having to wait another hour for the next bus, adaptive sync has the bus wait a couple of minutes (depending on the refresh rate range) so you don't miss it entirely.
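          The bus analogy maps directly onto frame timing. As a rough, illustrative sketch (toy numbers only; it assumes the display can start a refresh as soon as a full frame arrives, which is the core VRR idea, and ignores the rest of the pipeline):

```python
# Toy model: when does a frame appear on a fixed 60 Hz display vs. a VRR
# display? Illustrative only -- real pipelines have more stages (compositor,
# scanout, panel response) than this models.

def fixed_vsync_display_time(frame_ready_ms, refresh_hz=60.0):
    """Frame must wait for the next fixed refresh tick after it is ready."""
    period = 1000.0 / refresh_hz
    ticks_passed = frame_ready_ms // period
    return (ticks_passed + 1) * period

def vrr_display_time(frame_ready_ms, max_hz=60.0):
    """Within the VRR range the display refreshes as soon as the frame is
    ready, limited only by the panel's minimum frame time."""
    min_period = 1000.0 / max_hz
    return max(frame_ready_ms, min_period)

ready = 17.7  # frame misses the ~16.7 ms deadline by about 1 ms
print(fixed_vsync_display_time(ready))  # ~33.3 ms: waits a full extra refresh
print(vrr_display_time(ready))          # 17.7 ms: shown as soon as it's done
```

          With a fixed 60 Hz refresh, a frame that is 1 ms late waits almost a full extra refresh for the next tick; within the VRR range it is shown the moment it is ready.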



          • #6
            Originally posted by SethDusek View Post
            I believe adaptive sync lowers latency by reducing the number of missed frames, especially in games, since it adjusts the refresh rate dynamically. Not sure if this is a good analogy, but it's like being late for the bus: instead of having to wait another hour for the next bus, adaptive sync has the bus wait a couple of minutes (depending on the refresh rate range) so you don't miss it entirely.
            Not really. Adaptive sync/VRR does not make your LCD panel any faster than its top speed, so VRR itself does not directly fix latency. It addresses a different problem entirely.

            The problem is power consumption. Sending a 30 Hz signal consumes less GPU power than sending 120 Hz or faster, so if the game/program cannot reach the panel's maximum refresh rate, why drive the panel at that maximum? A GPU whose display output circuitry (the CRTC) runs at a lower refresh rate generates less heat, which gives the compute sections of the GPU more thermal headroom to clock higher before being thermally throttled.

            Another reason you might want VRR is to save power on a mobile phone or laptop.

            Adaptive sync and VRR are more about power usage and heat than about performance. Yes, the lower heat output of the CRTC can yield a few extra frames in some games, because the rest of the GPU can then clock slightly higher.

            Remember: if the game/program can keep up with the screen's maximum refresh rate and VRR is on, it will in fact sit at the screen's maximum frame rate, so the game behaves no differently than with VRR off and a fixed refresh rate.

            A lot of people believe VRR improves latency without understanding how, and because of that "how" the improvement is almost nothing. The only way VRR improves latency is that the CRTC produces less heat at a lower output rate, letting the GPU's processing parts clock slightly higher. That improvement is so small it is hard to separate from run-to-run variation and is almost certainly not human-noticeable.

            VRR should be very important to laptop users for battery life.



            • #7
              Originally posted by oiaohm View Post

              Remember: if the game/program can keep up with the screen's maximum refresh rate and VRR is on, it will in fact sit at the screen's maximum frame rate, so the game behaves no differently than with VRR off and a fixed refresh rate.

              A lot of people believe VRR improves latency without understanding how, and because of that "how" the improvement is almost nothing. The only way VRR improves latency is that the CRTC produces less heat at a lower output rate, letting the GPU's processing parts clock slightly higher. That improvement is so small it is hard to separate from run-to-run variation and is almost certainly not human-noticeable.

              VRR should be very important to laptop users for battery life.


              This is true, but no game runs at a perfectly steady framerate. Adaptive sync *does* help when you miss a frame: instead of the frame being delayed a full refresh, the display lowers its refresh rate so the frame is shown as soon as it is ready. This reduces frame judder. If you don't believe me, look at this XDC talk by an AMD developer:

              https://xdc2019.x.org/event/5/contri...c-20191003.pdf



              • #8
                Originally posted by oiaohm View Post
                The only way VRR improves latency is that the CRTC produces less heat at a lower output rate, letting the GPU's processing parts clock slightly higher. That improvement is so small it is hard to separate from run-to-run variation and is almost certainly not human-noticeable.
                This is very wrong in practice: with an fps limiter you can stay inside the VRR range without waiting at least one full frame in a back buffer, and doing so with a game's built-in fps limiter often also reduces CPU pre-rendering. As a result, tearing-free play on a 60 Hz display always has atrocious lag or stutter without VRR, while with VRR it doesn't.



                • #9
                  Originally posted by oiaohm View Post

                  Not really. Adaptive sync/VRR does not make your LCD panel any faster than its top speed, so VRR itself does not directly fix latency. It addresses a different problem entirely.

                  The problem is power consumption. Sending a 30 Hz signal consumes less GPU power than sending 120 Hz or faster, so if the game/program cannot reach the panel's maximum refresh rate, why drive the panel at that maximum? A GPU whose display output circuitry (the CRTC) runs at a lower refresh rate generates less heat, which gives the compute sections of the GPU more thermal headroom to clock higher before being thermally throttled.

                  Another reason you might want VRR is to save power on a mobile phone or laptop.

                  Adaptive sync and VRR are more about power usage and heat than about performance. Yes, the lower heat output of the CRTC can yield a few extra frames in some games, because the rest of the GPU can then clock slightly higher.

                  Remember: if the game/program can keep up with the screen's maximum refresh rate and VRR is on, it will in fact sit at the screen's maximum frame rate, so the game behaves no differently than with VRR off and a fixed refresh rate.

                  A lot of people believe VRR improves latency without understanding how, and because of that "how" the improvement is almost nothing. The only way VRR improves latency is that the CRTC produces less heat at a lower output rate, letting the GPU's processing parts clock slightly higher. That improvement is so small it is hard to separate from run-to-run variation and is almost certainly not human-noticeable.

                  VRR should be very important to laptop users for battery life.
                  You've got that wrong. VRR generally doesn't save power.
                  The classic refresh-rate problem is:
                  - if you don't want tearing, you have to wait for the entire frame to be ready, but that adds latency;
                  - if you want the lowest possible latency, you might send half of a new frame and half of an old one, because the GPU and the monitor are not synchronized.

                  VRR simply means that, within a certain refresh-rate range, you can send an entire frame straight away and the monitor adjusts its refresh rate in real time, so the frame appears immediately and, since it is a full frame, there is no tearing. Higher or lower output Hz doesn't matter; all that matters is how many FPS the card produces, and VRR + V-sync produces the same number of frames as V-sync alone.

                  The general idea, at least with G-Sync, is that the best experience is G-Sync + V-sync + a frame limiter set around 3 FPS below the maximum refresh rate. This gives only marginally higher latency than leaving everything unrestricted, but literally zero tearing, and the frame limiter makes sure you don't produce too many frames (the frame limiter is actually what gives you the power savings there). The same should apply to FreeSync.
                  https://blurbusters.com/gsync/gsync1...nd-settings/14
                  Last edited by piotrj3; 22 February 2021, 07:30 AM.
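                  For reference, here is roughly what that "3 FPS below max" recommendation works out to; a minimal calculation of my own, not taken verbatim from the linked guide:

```python
# Frame limiter target per the common G-Sync advice quoted above: cap a few
# FPS below the display's maximum refresh so frame times never drop under
# the VRR window's minimum and V-sync buffering never engages. The margin
# of 3 is the figure from the recommendation; treat it as a rule of thumb.

def limiter_cap(max_refresh_hz, margin_fps=3):
    cap_fps = max_refresh_hz - margin_fps
    frame_time_ms = 1000.0 / cap_fps
    return cap_fps, frame_time_ms

for hz in (60, 144, 240):
    fps, ft = limiter_cap(hz)
    print(f"{hz} Hz -> cap at {fps} FPS (~{ft:.2f} ms per frame)")
```

                  So, for example, a 144 Hz panel would be capped at 141 FPS, about 7.09 ms per frame.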



                  • #10
                    Is VRR already available for video/desktop use, or still only for gaming? I remember talk that use cases other than gaming weren't covered yet.

