VRR, Lower Latency Likely Coming For KDE's KWin Wayland Compositor


  • #11
    Originally posted by thxcv View Post
    Is VRR available for video/desktop already, or still only for gaming? I remember there were talks that use cases other than gaming aren't yet covered.
    A kernel API for that use case is still in the making, though in theory every video player could already make use of VRR, at least in fullscreen (and on Wayland probably also windowed; as far as I remember, Sway supports that).



    • #12
      Originally posted by thxcv View Post
      Is VRR available for video/desktop already, or still only for gaming? I remember there were talks that use cases other than gaming aren't yet covered.
      I believe Sway on Wayland does support it outside of fullscreen games to some extent. If I open my monitor's built-in framerate display, I can see the refresh rate changing, which indicates that FreeSync is active. The same does not happen on X.org outside of fullscreen games.



      • #13
        Originally posted by SethDusek View Post
        This is true, but no game runs at a perfect framerate. Adaptive sync *does* help when you miss a frame, since instead of missing it completely it lowers the refresh rate. This lowers frame judder. If you don't believe me, look at this XDC talk by an AMD developer.
        There is a problem: how do you solve slide 19? The way a monitor maker completely solves the slide 19 brightness problems is, in fact, to run the LCD panel at its maximum rate even in VRR mode and have the monitor's controller retime incoming frames to that maximum refresh rate. So on a good monitor without lighting issues, a missed frame still ends up with the same latency to the end user.

        Lowering the frame judder in the signal sent from the GPU does not mean you have fixed the judder displayed to the user, because the monitor's controller can put the judder you removed on the GPU side straight back in. It takes two to tango here.
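
        To make the disputed point concrete, here is a rough sketch (my own toy numbers, nothing from the talk or from KWin) of when frames reach the screen under fixed-rate vsync versus an idealized VRR panel that scans out as soon as a frame is ready. Whether a real monitor's controller behaves like the idealized case is exactly what is being argued here.

        import math

        def presentation_times(frame_ms, fixed_hz=60.0, vrr_max_hz=144.0):
            """For per-frame render times (ms), return when each frame reaches
            the screen under fixed vsync vs. an idealized VRR panel."""
            fixed_period = 1000.0 / fixed_hz
            min_period = 1000.0 / vrr_max_hz   # panel cannot refresh faster than this
            done = 0.0                         # time the GPU finishes each frame
            last_fixed, last_vrr = 0.0, -min_period
            fixed, vrr = [], []
            for ms in frame_ms:
                done += ms
                # Fixed vsync: the finished frame waits for the next vblank.
                last_fixed = max(math.ceil(done / fixed_period) * fixed_period,
                                 last_fixed + fixed_period)
                fixed.append(last_fixed)
                # Idealized VRR: scan-out starts as soon as the frame is ready,
                # limited only by the panel's maximum refresh rate.
                last_vrr = max(done, last_vrr + min_period)
                vrr.append(last_vrr)
            return fixed, vrr

        # One frame misses the ~16.7 ms budget:
        fixed, vrr = presentation_times([16.0, 20.0, 16.0, 16.0])
        print([round(t, 1) for t in fixed])   # [16.7, 50.0, 66.7, 83.3]
        print([round(t, 1) for t in vrr])     # [16.0, 36.0, 52.0, 68.0]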

        The maximum refresh rate of the screen is the maximum refresh rate of the screen; that is your best possible latency. Adaptive sync helps you reduce missed frames by saving on power so the GPU can clock higher.

        Sorry to say, that AMD XDC talk is mostly right, but when it comes to the judder problem it is very deceptive. Most of the apparent judder improvement comes from the power saving VRR provides, which allows the GPU to clock slightly faster.

        VRR is not improving judder in the way people like you are thinking, SethDusek. The AMD developers were also missing that the slightly higher GPU clocks caused by VRR running a lower output refresh rate result in better average frame render times. Better cooling on the GPU could show the same gains.

        Nvidia's tools for measuring latency have been useful for catching some of these cases where people think they have improved something by improving the signal, only for it to be ruined again by the controller inside the monitor.

        The catch is that if your GPU is driving the monitor at its maximum refresh rate, the GPU can do on its side what many monitors do to level out the brightness problem. The issue is the heat generated in the GPU while outputting a 120/240 Hz signal to the monitor: is that heat worth it when the game can only render 60 frames per second and the heat is reducing GPU clock speed?

        Now the trick you walk into: a person has a 120 Hz monitor and decides to run it at a fixed 60 Hz, and compared against VRR on the same monitor there can be a judder improvement. Why would the user underclock a 120 Hz monitor down to a fixed 60 Hz? To save on heat in the scan-out (CRT) part of the GPU and get higher GPU clock speeds.

        VRR is about being able to save power in the scan-out part of the GPU so you can have higher clock speeds for the rest of the processing with the same amount of cooling when you cannot reach the monitor's maximum refresh rate.
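
        As a rough illustration of this power argument (back-of-the-envelope numbers of my own; blanking intervals and actual power figures are ignored), the amount of data the scan-out block has to push per second scales linearly with the refresh rate:

        def scanout_gbps(width, height, hz, bits_per_pixel=30):
            """Approximate active-pixel data rate in Gbit/s (blanking ignored)."""
            return width * height * hz * bits_per_pixel / 1e9

        for hz in (30, 60, 120, 240):
            print(f"{hz:>3} Hz: {scanout_gbps(2560, 1440, hz):5.1f} Gbit/s")

        #  30 Hz:   3.3 Gbit/s
        #  60 Hz:   6.6 Gbit/s
        # 120 Hz:  13.3 Gbit/s
        # 240 Hz:  26.5 Gbit/s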

        Basically, it is not helping you exactly the way you think it is. Power usage and GPU clock speeds show the big picture here.

        Yes, it is easy to forget that when you have a static image on screen the GPU just scans the same buffer out over and over again, but that is in fact cutting into your TDP budget. The amount of improvement VRR gives you is directly linked to how good your card's cooling is.



        • #14
          Is this about windowed VRR, or does VRR not work at all on Wayland currently?



          • #15
            Originally posted by piotrj3 View Post
            VRR simply means that within a certain refresh rate range you can send the entire frame straight away and the monitor adjusts its refresh rate in real time, making the frame appear straight away, and since it is a full frame it does not suffer from tearing. Higher Hz/lower Hz output doesn't matter; all that matters is how many FPS the card produces, and VRR+Vsync produce the same number of frames as V-sync alone.
            This bit turns out not to be true for the majority of monitors. A lot of monitors don't adjust their internal refresh rate; instead, when you send a frame it gets assigned to the next vsync of the monitor's internal rate. So if VRR on a 240 Hz monitor is being fed 60 frames per second, each frame is displayed 4 times by the monitor. With early VRR hardware, yes, the monitor attempted to adjust its refresh rate all the time, but then you end up with unpredictable brightness problems, as in static and dynamic flicker. The method used to fix those means the monitor is not really dynamically changing its refresh rate; it just pretends it is.

            Originally posted by piotrj3 View Post
            ...all that matters is how many FPS the card produces, and VRR+Vsync produce the same number of frames as V-sync alone.
            That does not turn out to be true on AMD cards. Nvidia does its video output differently to AMD, so it does not affect the core GPU silicon temperature as much.



            • #16
              Originally posted by thxcv View Post
              Is VRR available for video/desktop already, or still only for gaming? I remember there were talks that use cases other than gaming aren't yet covered.
              There is a use case for full-screen video playback, especially for videos with frame rates that normally require interpolation or 3:2 pull-down to fit onto a computer monitor. Using VRR can enable smoother video playback while at the same time lowering power consumption. The (otherwise useless) built-in video player in Windows 10 does this, but that's the only one I know of. Hopefully we'll see the same feature in some open source video players one day.
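
              As a concrete example of the pull-down point (my own arithmetic, not tied to any particular player): 23.976 fps film on a fixed 60 Hz screen has to alternate 3-vblank and 2-vblank holds, so consecutive frames stay on screen for unequal times, while a VRR display can give every frame the same duration.

              FILM_FPS = 24000 / 1001          # ~23.976 fps
              VBLANK_60 = 1000 / 60            # ~16.67 ms per refresh at 60 Hz

              pulldown = [3, 2, 3, 2]          # classic 3:2 cadence on a 60 Hz panel
              print([round(n * VBLANK_60, 1) for n in pulldown])   # [50.0, 33.3, 50.0, 33.3] ms
              print(round(1000 / FILM_FPS, 1))                     # 41.7 ms, uniform with VRR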



              • #17
                Originally posted by Brisse View Post
                The (otherwise useless) built-in video player in Windows 10 does this, but that's the only one I know of. Hopefully we'll see the same feature in some open source video players one day.
                The new Edge and the Amazon video app support it too.

                Power saving is also by far the least interesting aspect; you can set a fixed lower refresh rate yourself (or let a script do it automatically). Perfect frame presentation on screen without drawbacks is far more difficult to achieve and is therefore the much more interesting part of this feature...

                Intel supported automatically throttled output transmission to the display before VRR, and even that caused flickering issues. Outside of games/video, variable refresh rate often (always?) causes more issues than the perhaps 0-2 W of power savings are worth.
                Last edited by aufkrawall; 22 February 2021, 09:12 AM.



                • #18
                  Originally posted by oiaohm View Post

                  This bit turns out not to be true for the majority of monitors. A lot of monitors don't adjust their internal refresh rate; instead, when you send a frame it gets assigned to the next vsync of the monitor's internal rate. So if VRR on a 240 Hz monitor is being fed 60 frames per second, each frame is displayed 4 times by the monitor. With early VRR hardware, yes, the monitor attempted to adjust its refresh rate all the time, but then you end up with unpredictable brightness problems, as in static and dynamic flicker. The method used to fix those means the monitor is not really dynamically changing its refresh rate; it just pretends it is.

                  That does not turn out to be true on AMD cards. Nvidia does its video output differently to AMD, so it does not affect the core GPU silicon temperature as much.
                  You are confusing terms a bit. At least for G-Sync Compatible monitors (which is simply certified FreeSync over DisplayPort) or better, they all adjust their refresh rate. What is important is that they only adjust the refresh rate within a certain range. When the frame rate goes outside that range, the thing you described happens: the internal refresh rate doesn't drop much further, and if the FPS is lower than the minimum of the range, frames get duplicated; or, if the FPS is too high, frames get skipped until the next good frame appears. Vsync and a frame rate limiter ensure that you won't get additional inconsistent latency from too high an FPS.
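
                  A minimal sketch of that duplication idea, often called low framerate compensation (the range values and function name here are just assumptions, not any specific monitor or driver):

                  def effective_refresh(content_fps, vrr_min=48.0, vrr_max=144.0):
                      if content_fps > vrr_max:
                          return vrr_max, 1        # capped: vsync/frame-limiter territory
                      multiple = 1
                      while content_fps * multiple < vrr_min:
                          multiple += 1            # repeat each frame until in range
                      return content_fps * multiple, multiple

                  for fps in (30, 45, 60, 120, 200):
                      hz, n = effective_refresh(fps)
                      print(f"{fps:>3} fps -> panel at {hz:6.1f} Hz, each frame shown {n}x")

                  #  30 fps -> panel at   60.0 Hz, each frame shown 2x
                  #  45 fps -> panel at   90.0 Hz, each frame shown 2x
                  #  60 fps -> panel at   60.0 Hz, each frame shown 1x
                  # 120 fps -> panel at  120.0 Hz, each frame shown 1x
                  # 200 fps -> panel at  144.0 Hz, each frame shown 1x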



                  • #19
                    Originally posted by oiaohm View Post

                    Really, no. Adaptive sync/VRR does not make your LCD panel any faster than its top speed, so VRR itself does not directly fix latency. The problem is something different entirely.

                    The problem is watts of power consumed. Sending a 30 Hz signal consumes less power in the GPU than sending 120 Hz or faster. So if the game/program cannot reach the maximum speed of the LCD panel, is it worth driving the panel at its maximum speed? A GPU running its scan-out (the output generation circuitry) at a lower refresh rate generates less heat, which gives the compute sections of the GPU more thermal headroom to run at higher frequencies before being thermally throttled.

                    Another reason why you might want VRR is on a mobile phone or laptop, to save power.

                    Adaptive sync and VRR are more about power usage and heat generation than performance. Yes, the lower heat generation of the scan-out parts can result in a few extra frames in some games, because the rest of the GPU is able to clock faster thanks to the slightly lower heat output.

                    Remember, if the game/program is able to keep up with the screen's maximum speed and you have VRR on, it will in fact be sitting at the screen's maximum frame rate, so it would make no difference compared to running with VRR off at a fixed refresh rate.

                    There are a lot of people under the mistaken impression that VRR improves latency without understanding how, and because of that "how" the improvement is almost nothing. The only way VRR results in improved latency is that the scan-out part of the GPU produces less heat by running at a lower output rate, letting the GPU's processing parts run at a slightly higher clock speed. The latency improvement of VRR turns out to be close to nothing: very hard to pick out from run-to-run variation and almost certainly too small to be human-noticeable in most cases.

                    VRR should be very important to laptop users for battery life.
                    You are talking about VRR and PSR (panel self-refresh) in combination.



                    • #20
                      It would be nice if they got KDE actually usably working on Wayland before focusing all their resources on gaming. You know minor things like actually being able to click buttons.

