VRR, Lower Latency Likely Coming For KDE's KWin Wayland Compositor


  • oiaohm
    replied
    Originally posted by SethDusek View Post
    This is true, but no game runs at a perfect framerate. Adaptive sync *does* help when you miss a frame, since instead of missing it completely it lowers the refresh rate. This lowers frame judder. If you don't believe me, look at this XDC talk by an AMD developer.
    There is a problem: how do you solve slide 19? The way a monitor maker completely solves the slide-19 brightness problem is to run the LCD panel internally at its maximum refresh rate even in VRR mode, repeating frames up to that maximum rate inside the monitor. So on a good monitor without brightness issues, a missed frame still ends up with the same latency to the end user.

    Lowering the frame judder sent from the GPU does not mean you have fixed the judder shown to the user, because the monitor is a factor too: the controller inside the monitor can put the judder you removed at the GPU straight back in. It takes two to tango here.

    The maximum refresh rate of the screen is the maximum refresh rate of the screen; that is your best possible latency. Adaptive sync helps you reduce missed frames by saving power so the GPU can clock higher.

    Sorry to say, that AMD XDC talk is mostly right, but when it comes to the judder problem it is very deceptive. Most of the apparent judder improvement comes from the power saving VRR provides, which lets the GPU clock slightly higher.

    VRR is not improving judder the way people like you are thinking, SethDusek. The AMD developers were missing that the slightly higher GPU clocks caused by VRR's lower refresh rate are what produce the better average frame render times. Better cooling on the GPU could show the same gains.

    The latency-measuring tools Nvidia has made have been useful for catching some of these differences, where people think they have improved something by improving the signal, but it is ruined again by the controller inside the monitor.

    The catch is that if your GPU is driving the monitor at its maximum refresh rate, the GPU itself can do what many monitors do to level out the brightness problem. The issue is the heat generated in the GPU when producing a 120/240 Hz signal to the monitor: is that heat worth it when the game can only render 60 frames per second, and the heat is reducing GPU clock speed?

    Now here is the trick you walk into: a person has a 120 Hz monitor and decides to run it at a fixed 60 Hz; compared with VRR on the same monitor, there can be a judder improvement. Why would the user underclock a 120 Hz monitor down to a fixed 60 Hz? To save heat in the CRTC part of the GPU and get higher GPU clock speeds.

    VRR is about saving power in the CRTC part of the GPU, so that when you cannot reach the monitor's maximum refresh rate you can run higher GPU clock speeds for the other processing with the same amount of cooling.

    Basically, it is not helping you exactly the way you think it is. Power usage and GPU clock speeds show the big picture here.

    Yes, it is really easy to forget that when you have a static image on screen the GPU just puts the same buffer out over and over again, but that is in fact cutting into your TDP budget. The amount of improvement VRR gives you is directly linked to how good your card's cooling is.
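    One crude way to check the power/clock side of this claim is to watch the GPU's own power sensor while running the same scene with VRR on and off. A minimal sketch in C follows; the sysfs path is typical for amdgpu, but the card and hwmon numbers vary per system, so treat it as an assumption-laden illustration rather than a ready-made tool.

    /* Sketch: average the GPU's reported power draw from sysfs so two runs
       (VRR on vs. VRR off) can be compared. The path below is typical for
       amdgpu; the cardN/hwmonN numbers differ per machine. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* hwmon power inputs are reported in microwatts */
        const char *path =
            "/sys/class/drm/card0/device/hwmon/hwmon0/power1_average";
        long long sum = 0;
        int samples = 0;

        for (int i = 0; i < 30; i++) {        /* roughly 30 s of 1 Hz samples */
            FILE *f = fopen(path, "r");
            long long uw;
            if (f && fscanf(f, "%lld", &uw) == 1) {
                sum += uw;
                samples++;
            }
            if (f)
                fclose(f);
            sleep(1);
        }
        if (samples)
            printf("average draw: %.1f W over %d samples\n",
                   sum / (double)samples / 1e6, samples);
        return 0;
    }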



  • SethDusek
    replied
    Originally posted by thxcv View Post
    Is VRR available for video/desktop already, or still only for gaming? I remember there was talk that use cases other than gaming aren't yet covered.
    I believe Sway on Wayland does support it outside of fullscreen games, somewhat. If I open my monitor's built-in framerate display, I can see the refresh rate changing, indicating that FreeSync is active. The same does not happen on X.org outside of fullscreen games.
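    For what it's worth, Sway exposes this as a per-output switch in its config. A minimal example is below; the output name and the wildcard form are just illustrations, check sway-output(5) for your setup.

        # enable adaptive sync on one output
        output DP-1 adaptive_sync on
        # or on every output
        output * adaptive_sync on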



  • aufkrawall
    replied
    Originally posted by thxcv View Post
    Is VRR available for video/desktop already, or still only for gaming? I remember there was talk that use cases other than gaming aren't yet covered.
    A kernel API for that use case is still in the making, though theoretically every video player could already utilize VRR, at least in fullscreen (and on Wayland probably also windowed; AFAIR Sway supports that).
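    As a rough sketch of the kernel-side interface involved: KMS exposes an immutable "vrr_capable" property on connectors and a "VRR_ENABLED" property on CRTCs, and a compositor or player toggles the latter through an atomic commit. The probe below only reads the connector property with libdrm; the /dev/dri/card0 path is an assumption and error handling is minimal.

    /* Probe connectors for the DRM "vrr_capable" property (read-only sketch).
       Build: gcc vrr_probe.c -o vrr_probe $(pkg-config --cflags --libs libdrm) */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <xf86drm.h>
    #include <xf86drmMode.h>

    int main(void)
    {
        int fd = open("/dev/dri/card0", O_RDWR);   /* adjust the card node */
        if (fd < 0) { perror("open"); return 1; }

        drmModeRes *res = drmModeGetResources(fd);
        if (!res) { fprintf(stderr, "no KMS resources\n"); return 1; }

        for (int i = 0; i < res->count_connectors; i++) {
            drmModeObjectProperties *props = drmModeObjectGetProperties(
                fd, res->connectors[i], DRM_MODE_OBJECT_CONNECTOR);
            if (!props)
                continue;
            for (uint32_t j = 0; j < props->count_props; j++) {
                drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[j]);
                if (prop && strcmp(prop->name, "vrr_capable") == 0)
                    printf("connector %u: vrr_capable = %llu\n",
                           res->connectors[i],
                           (unsigned long long)props->prop_values[j]);
                drmModeFreeProperty(prop);
            }
            drmModeFreeObjectProperties(props);
        }
        drmModeFreeResources(res);
        close(fd);
        return 0;
    }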



  • thxcv
    replied
    Is VRR available for video/desktop already, or still only for gaming? I remember there was talk that use cases other than gaming aren't yet covered.



  • piotrj3
    replied
    Originally posted by oiaohm View Post

    Really, no. Adaptive sync/VRR does not make your LCD panel any faster than its top speed, so VRR itself does not directly fix latency. The real effect comes from a different problem entirely.

    The problem is power consumption. Sending a 30 Hz signal consumes less power in the GPU than sending 120 Hz or faster, so if the game/program cannot reach the panel's maximum refresh rate, is it worth driving the panel at that maximum rate? A GPU running its CRTC (the display output/scan-out circuitry) at a lower refresh rate generates less heat, which gives the compute sections of the GPU more thermal headroom to run at a higher frequency before being thermally throttled.

    Another reason you might want VRR is on a mobile phone or laptop, to save power.

    Adaptive sync and VRR are more about power usage and heat generation than performance. Yes, the lower heat output of the CRTC can yield a few extra frames in some games, because the rest of the GPU can clock slightly higher thanks to the slightly lower heat generation.

    Remember, if the game/program can keep up with the screen's maximum refresh rate and you have VRR on, it will in fact sit at the screen's maximum frame rate, so it makes no difference compared to VRR off with a fixed refresh rate.

    A lot of people believe VRR improves latency without understanding how, and because of that 'how' the improvement is almost nothing. The only way VRR improves latency is that the CRTC part of the GPU produces less heat by running at a lower output rate, which lets the GPU's processing parts run at a slightly higher clock speed. That latency improvement turns out to be close to nothing: very hard to separate from run-to-run variation, and almost certainly too small to be humanly noticeable in most cases.

    VRR should be very important to laptop users for battery life.
    You've got that wrong. VRR generally doesn't save power.
    The classic refresh-rate problem is:
    - if you don't want tearing, you need to wait for the entire frame to be ready, but that adds latency;
    - if you want the lowest possible latency, you may end up sending half of a new frame and half of an old one, because the GPU and the monitor are not synchronized.

    VRR simply means that, within a certain refresh-rate range, you can send an entire frame straight away and the monitor adjusts its refresh rate in real time, so the frame appears immediately; and since it is a full frame, there is no tearing. Higher or lower Hz output doesn't matter; all that matters is how many FPS the card produces, and VRR+V-sync produces the same number of frames as V-sync alone.

    The general idea is that the best experience, at least with G-Sync, is G-Sync + V-Sync + a frame limiter set around 3 frames below the maximum refresh rate. This gives only marginally higher latency than running everything unrestricted, but literally zero tearing, and the frame limiter makes sure you don't produce too many frames (the frame-rate limiter is actually where the power savings come from). The same should still apply to FreeSync; see the sketch after the link below.
    https://blurbusters.com/gsync/gsync1...nd-settings/14
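    As a back-of-the-envelope illustration of why the cap works (the 144 Hz panel and 141 fps limit are example numbers only, not figures from the linked article):

    /* Sketch: capping fps slightly below the max refresh rate keeps each
       frame time a little longer than one scanout, so frames never pile up
       behind V-sync and the display keeps tracking the render rate. */
    #include <stdio.h>

    int main(void)
    {
        double max_hz  = 144.0;            /* panel's top refresh rate     */
        double cap_fps = max_hz - 3.0;     /* limiter ~3 fps below the max */

        double scanout_ms = 1000.0 / max_hz;   /* ~6.94 ms per refresh         */
        double frame_ms   = 1000.0 / cap_fps;  /* ~7.09 ms per rendered frame  */

        printf("scanout time: %.2f ms\n", scanout_ms);
        printf("frame time  : %.2f ms\n", frame_ms);
        printf("margin      : %.2f ms per frame\n", frame_ms - scanout_ms);
        return 0;
    }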
    Last edited by piotrj3; 22 February 2021, 07:30 AM.



  • aufkrawall
    replied
    Originally posted by oiaohm View Post
    The only way VRR improves latency is that the CRTC part of the GPU produces less heat by running at a lower output rate, which lets the GPU's processing parts run at a slightly higher clock speed. That latency improvement turns out to be close to nothing: very hard to separate from run-to-run variation, and almost certainly too small to be humanly noticeable in most cases.
    This is very wrong in practice: with an fps limiter you can stay inside the VRR range without waiting at least one full frame in a backbuffer, and using a game's integrated fps limiter for that often also reduces the CPU's pre-rendered frame queue. As a result, tearing-free play on a 60 Hz display always has atrocious lag or stutter without VRR, while with VRR it doesn't.
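    To put rough numbers on that (illustrative figures, not measurements): on a fixed 60 Hz display with V-sync, a frame that takes just longer than one refresh has to wait for the next tick, while inside a VRR window it can be scanned out as soon as it is ready.

    /* Sketch: extra wait picked up by a slightly-late frame on fixed 60 Hz
       V-sync versus a VRR display. Illustrative numbers only.
       Build: gcc vrr_wait.c -o vrr_wait -lm */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double refresh_ms = 1000.0 / 60.0;  /* fixed 60 Hz tick: ~16.67 ms */
        double render_ms  = 17.5;           /* a frame that just misses it */

        /* Fixed refresh + V-sync: hold the frame until the next tick. */
        double fixed_present = ceil(render_ms / refresh_ms) * refresh_ms;

        /* VRR: present when rendering finishes (assuming 17.5 ms is
           inside the panel's VRR range). */
        double vrr_present = render_ms;

        printf("fixed 60 Hz present: %.1f ms (%.1f ms spent waiting)\n",
               fixed_present, fixed_present - render_ms);
        printf("VRR present        : %.1f ms\n", vrr_present);
        return 0;
    }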



  • SethDusek
    replied
    Originally posted by oiaohm View Post

    Remember, if the game/program can keep up with the screen's maximum refresh rate and you have VRR on, it will in fact sit at the screen's maximum frame rate, so it makes no difference compared to VRR off with a fixed refresh rate.

    A lot of people believe VRR improves latency without understanding how, and because of that 'how' the improvement is almost nothing. The only way VRR improves latency is that the CRTC part of the GPU produces less heat by running at a lower output rate, which lets the GPU's processing parts run at a slightly higher clock speed. That latency improvement turns out to be close to nothing: very hard to separate from run-to-run variation, and almost certainly too small to be humanly noticeable in most cases.

    VRR should be very important to laptop users for battery life.


    This is true, but no game runs at a perfect framerate. Adaptive sync *does* help when you miss a frame, since instead of missing it completely it lowers the refresh rate. This lowers frame judder. If you don't believe me, look at this XDC talk by an AMD developer:

    https://xdc2019.x.org/event/5/contri...c-20191003.pdf



  • oiaohm
    replied
    Originally posted by SethDusek View Post
    I believe adaptive sync lowers latency by reducing the number of missed frames, especially in games, since it adjusts the refresh rate dynamically. Not sure if this is a good analogy, but it's like being late for the bus: instead of having to wait another hour for the next bus, adaptive sync has the bus wait a couple of minutes (depending on the refresh rate range) for you if you're late, so you don't miss it entirely.
    Really, no. Adaptive sync/VRR does not make your LCD panel any faster than its top speed, so VRR itself does not directly fix latency. The real effect comes from a different problem entirely.

    The problem is power consumption. Sending a 30 Hz signal consumes less power in the GPU than sending 120 Hz or faster, so if the game/program cannot reach the panel's maximum refresh rate, is it worth driving the panel at that maximum rate? A GPU running its CRTC (the display output/scan-out circuitry) at a lower refresh rate generates less heat, which gives the compute sections of the GPU more thermal headroom to run at a higher frequency before being thermally throttled.

    Another reason you might want VRR is on a mobile phone or laptop, to save power.

    Adaptive sync and VRR are more about power usage and heat generation than performance. Yes, the lower heat output of the CRTC can yield a few extra frames in some games, because the rest of the GPU can clock slightly higher thanks to the slightly lower heat generation.

    Remember, if the game/program can keep up with the screen's maximum refresh rate and you have VRR on, it will in fact sit at the screen's maximum frame rate, so it makes no difference compared to VRR off with a fixed refresh rate.

    A lot of people believe VRR improves latency without understanding how, and because of that 'how' the improvement is almost nothing. The only way VRR improves latency is that the CRTC part of the GPU produces less heat by running at a lower output rate, which lets the GPU's processing parts run at a slightly higher clock speed. That latency improvement turns out to be close to nothing: very hard to separate from run-to-run variation, and almost certainly too small to be humanly noticeable in most cases.

    VRR should be very important to laptop users for battery life.



  • SethDusek
    replied
    Originally posted by shmerl View Post
    Will latency also be low with adaptive sync enabled?
    I believe adaptive sync lowers latency by reducing the number of missed frames, especially in games, since it adjusts the refresh rate dynamically. Not sure if this is a good analogy, but it's like being late for the bus: instead of having to wait another hour for the next bus, adaptive sync has the bus wait a couple of minutes (depending on the refresh rate range) for you if you're late, so you don't miss it entirely.



  • shmerl
    replied
    Will latency also be low with adaptive sync enabled?

