VRR, Lower Latency Likely Coming For KDE's KWin Wayland Compositor


  • oiaohm
    replied
    Originally posted by Zan Lynx View Post

    VRR is not only for gaming. It is also useful for power saving and video playback.

    Intel laptop GPUs already do VRR internally. Their PSR (Panel Self Refresh) mode runs the laptop or tablet panel at whatever its lowest refresh rate is, until the framebuffer changes. Doing it with explicit software control should allow things like Wayland compositors to run at minimum (30 or 40 Hz usually) and still update at 120 Hz when moving the mouse or other interactive uses.

    Much like Android uses VRR. Many phone displays now run at 0 FPS (as far as Surface Flinger is concerned, anyway; the hardware does even more power saving than Intel PSR) and burst to 90 or 120 Hz while scrolling.
    PSR stands for Panel Self Refresh. Surface Flinger reporting 0 FPS while the panel keeps refreshing shows that these two figures don't have to align. Being able to power down the GPU and have the LCD controller keep redrawing the screen does seriously save power.

    Intel's PSR adjustment is not pure VRR. In practice PSR operates at a set of fixed rates like 30, 60 and 120 Hz. Picking 40 Hz as your Wayland compositor's slow mode could be a really bad idea: in some cases it effectively locks the panel at 80 or 120 Hz, and not all LCD screens do 80.

    Yes, on Intel it is possible to have a VRR rate of zero and a PSR of 240 Hz at the same time, because 240 Hz is the panel's only mode. Fixed PSR rates still exist even on a VRR-capable monitor. A fixed PSR makes brightness calibration on LCD screens simple compared to a truly dynamically changing PSR. The complexity of configuring brightness for a dynamically changing PSR means the image ends up either too dark or too bright at different times, which generates flicker.

    So when PSR steps up to 120 Hz while you are rendering at 30 frames per second over VRR, Intel keeps the PSR at 120 Hz for a while before deciding to switch back, just in case another burst of 120 Hz VRR is coming soon. That reduces brightness adjustments and reduces the flicker risk, where the worst case is a person lying on the ground having an epileptic seizure.

    These days PSR and VRR routinely run out of alignment with each other. The flicker of early VRR monitors is in fact dangerous, and correcting it results in PSR not tracking the VRR rate. Cheaper monitors still using the older method will have to disappear in time; they are not safe monitors.
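    The compositor slow-mode point above can be sketched with a small helper. This is purely illustrative: the fixed PSR step list (30/60/120 Hz) is an assumed example, not real driver data, and the function name is made up.

```python
# Hypothetical helper: a compositor "slow mode" only maps cleanly onto a
# panel with fixed PSR steps if some step is an integer multiple of it.

def psr_step_for(render_hz, psr_steps):
    """Return the lowest fixed PSR step that can show every rendered frame
    on an even cadence (an integer multiple of render_hz), or None."""
    for step in sorted(psr_steps):
        if step % render_hz == 0:
            return step
    return None

steps = [30, 60, 120]  # assumed panel PSR steps
print(psr_step_for(30, steps))  # 30 -- the panel can drop all the way down
print(psr_step_for(40, steps))  # 120 -- a 40 Hz slow mode forces 120 Hz scan-out
```

    This is why a 30 Hz slow mode is the safer pick: 40 Hz has no matching low step, so the panel ends up scanning out far faster than the compositor renders.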



  • oiaohm
    replied
    Originally posted by piotrj3 View Post
    You are confusing terms a bit. At least for Gsync Compatible (which is simply certified Freesync over DisplayPort) or better monitors, they all adjust refresh rate. What matters is that they adjust it only within a certain range. When the frame rate goes outside that range, what you describe happens: the internal refresh rate doesn't go much lower, and if the FPS drops below the minimum of the range, frames get duplicated; or if the FPS goes too high, frames get skipped until the next good frame arrives. Vsync plus a frame-rate limiter ensures you won't get additional inconsistent latency from too-high FPS.
    Gsync Compatible, and even full Gsync, does not require the monitor to adjust its PSR rate to be certified. A monitor with a fixed PSR can still claim VRR support. If you work out why running VRR a few frames slower than the PSR rate works, it is simply because the PSR is not changing: you need slack slots to display the frames that arrive off-cycle.

    Being 3 frames per second slower means 3 frames per second get displayed twice, so in places you do have an extra frame of latency.

    Originally posted by tildearrow View Post
    You are talking about VRR and PSR (panel self-refresh) in combination.
    Exactly. With the original fixed refresh rates, the PSR and the refresh rate matched. Early VRR monitors tried to keep them matched, but various flicker problems kept appearing, and it turns out that constantly changing the PSR is bad for part lifespan because it causes greater thermal swings. Even new monitors with an Nvidia Gsync chipset in them normally no longer have the feature of changing the PSR.

    Changing the PSR on the fly turned out to be a case of the best-laid plans of mice and men. On paper it seemed like a good idea; in the real world, where you have to deal with thermals and other constraints, it is not. Once the PSR stopped being changed, the effect of VRR is just to move the misaligned-frame problem out of the GPU and into the monitor, so to the end user's eyes the problem is not fixed, only generated somewhere else.
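    The duplication arithmetic above can be checked with a short sketch. It is illustrative only: it assumes each rendered frame lands on the next fixed scan-out slot, and the rates are example numbers, not measurements.

```python
# With a fixed PSR (scan-out) rate, rendering slightly slower than the panel
# means some scan-out slots must re-show the previous frame.

def duplicated_slots(psr_hz, render_fps, seconds=1):
    """Count scan-out slots that repeat an old frame over `seconds`."""
    slots = psr_hz * seconds
    frames = render_fps * seconds
    # Every slot shows the newest completed frame; with fewer frames than
    # slots, (slots - frames) slots have nothing new to show.
    return slots - frames

print(duplicated_slots(60, 57))   # 3 repeated slots per second
print(duplicated_slots(240, 60))  # 180 -- each frame shown 4 times on average
```

    Each repeated slot is where the "extra frame of latency" shows up: the new frame was ready but had to wait for the next fixed scan-out.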



  • Zan Lynx
    replied
    Originally posted by MadeUpName View Post
    It would be nice if they got KDE actually usably working on Wayland before focusing all their resources on gaming. You know minor things like actually being able to click buttons.
    VRR is not only for gaming. It is also useful for power saving and video playback.

    Intel laptop GPUs already do VRR internally. Their PSR (Panel Self Refresh) mode runs the laptop or tablet panel at whatever its lowest refresh rate is, until the framebuffer changes. Doing it with explicit software control should allow things like Wayland compositors to run at minimum (30 or 40 Hz usually) and still update at 120 Hz when moving the mouse or other interactive uses.

    Much like Android uses VRR. Many phone displays now run at 0 FPS (as far as Surface Flinger is concerned, anyway; the hardware does even more power saving than Intel PSR) and burst to 90 or 120 Hz while scrolling.
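    The compositor behaviour described above amounts to a simple policy: sit at the panel minimum while idle, burst to the maximum on input. A minimal sketch, with made-up class and rate values chosen only to illustrate the idea:

```python
# Hypothetical refresh-rate policy: not KWin code, just the shape of the idea.

class RefreshPolicy:
    def __init__(self, min_hz=30, max_hz=120, idle_after=0.5):
        self.min_hz, self.max_hz, self.idle_after = min_hz, max_hz, idle_after
        self.last_input = float("-inf")

    def on_input(self, now):
        # Record the timestamp of the latest user interaction.
        self.last_input = now

    def target_hz(self, now):
        # Burst to max while the user is interacting, decay to min when idle.
        return self.max_hz if now - self.last_input < self.idle_after else self.min_hz

p = RefreshPolicy()
p.on_input(now=10.0)
print(p.target_hz(10.1))  # 120 -- mouse just moved
print(p.target_hz(11.0))  # 30  -- idle again
```

    A real compositor would additionally clamp these targets to the modes or VRR range the panel actually advertises, per the PSR caveats discussed elsewhere in the thread.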



  • MadeUpName
    replied
    It would be nice if they got KDE actually usably working on Wayland before focusing all their resources on gaming. You know minor things like actually being able to click buttons.



  • tildearrow
    replied
    Originally posted by oiaohm View Post

    Really, no. Adaptive sync/VRR does not make your LCD panel any faster than its top speed, so VRR itself does not directly fix latency. It addresses a completely different problem.

    The problem is power consumption. Sending a 30 Hz signal consumes less power in the GPU than sending 120 Hz or faster. So if the game/program cannot reach the panel's maximum rate, is it worth driving the panel at its maximum rate? A GPU running its CRTC (the output generation circuitry) at a lower refresh rate generates less heat, which gives the compute sections of the GPU more thermal headroom to run at a higher frequency before being thermally throttled.

    Another reason why you might want VRR is in a mobile phone/laptop to save on power usage.

    Adaptive sync and VRR are more about power usage and heat generation than performance. Yes, the lower heat output of the CRTC can yield a few extra frames in some games, because the rest of the GPU can clock slightly faster thanks to the slightly lower heat generation.

    Remember, if the game/program can keep up with the screen's maximum rate and you have VRR on, it will in fact sit at the screen's maximum frame rate, so it makes no difference to the game compared to running with VRR off at a fixed refresh rate.

    A lot of people mistakenly believe VRR improves latency without understanding how, and because of how it works the improvement is almost nothing. The only way VRR results in improved latency is that the CRTC has produced less heat by running at a lower output rate, letting the GPU's processing parts run at a slightly higher clock speed. The latency improvement of VRR turns out to be close to nothing: very hard to pick out from run-to-run variation, and almost certainly too small to be human-noticeable in most cases.

    VRR should be very important to laptop users for battery life.
    You are talking about VRR and PSR (panel self-refresh) in combination.
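    The power argument quoted above comes down to how much data the display engine pushes per second. A rough illustration; absolute wattage is panel- and GPU-specific, the ~10% blanking overhead is an assumption, and only the ratio here is meaningful:

```python
# Scan-out work scales with pixels per second leaving the display engine.

def pixel_rate(width, height, hz, blanking=1.1):
    """Pixels per second including an assumed ~10% blanking overhead."""
    return int(width * height * hz * blanking)

r30 = pixel_rate(3840, 2160, 30)
r120 = pixel_rate(3840, 2160, 120)
print(r120 // r30)  # 4 -- the display engine moves 4x the data at 120 Hz
```

    That 4x difference in scan-out traffic, not any rendering change, is where the CRTC heat and power savings in the post come from.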



  • piotrj3
    replied
    Originally posted by oiaohm View Post

    This turns out not to be true for the majority of monitors. A lot of monitors don't adjust their internal refresh rate; instead, when you send a frame it gets assigned to the monitor's next internal vsync. So a 240 Hz VRR monitor being fed 60 FPS displays each frame 4 times. Early VRR monitors did try to adjust their refresh rate continuously, but then you end up with unpredictable brightness problems, i.e. static and dynamic flicker. The methods that fix those mean the monitor is not really dynamically changing its refresh rate; it just pretends to.


    That does not turn out to be true on AMD cards. Nvidia does its video output differently from AMD, so it does not affect the core GPU silicon temperature as much.
    You are confusing terms a bit. At least for Gsync Compatible (which is simply certified Freesync over DisplayPort) or better monitors, they all adjust refresh rate. What matters is that they adjust it only within a certain range. When the frame rate goes outside that range, what you describe happens: the internal refresh rate doesn't go much lower, and if the FPS drops below the minimum of the range, frames get duplicated; or if the FPS goes too high, frames get skipped until the next good frame arrives. Vsync plus a frame-rate limiter ensures you won't get additional inconsistent latency from too-high FPS.
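    The in-range/out-of-range behaviour described above can be sketched as a small function. The VRR window bounds (48-144 Hz) are an assumed example, and the frame-duplication step is the generic "low framerate compensation" idea, not any specific vendor's implementation:

```python
# Below the VRR window, frames are duplicated until the effective scan-out
# rate lands back inside the window; above it, extra frames get skipped.

def effective_rate(fps, vrr_min=48, vrr_max=144):
    """Return (scan_out_hz, repeats_per_frame) for a given content FPS."""
    if fps > vrr_max:
        return vrr_max, 1          # surplus frames are skipped
    repeats = 1
    hz = fps
    while hz < vrr_min:            # duplicate frames to re-enter the window
        repeats += 1
        hz = fps * repeats
    return hz, repeats

print(effective_rate(24))   # (48, 2) -- each 24 fps frame shown twice
print(effective_rate(60))   # (60, 1) -- inside the window, 1:1
```

    Duplication keeps the panel inside its safe range, which is exactly where the "extra frame of latency in places" from the reply above comes from.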



  • aufkrawall
    replied
    Originally posted by Brisse View Post
    The (otherwise useless) built in video player in Windows 10 does this, but that's the only one I know of. Hopefully we'll see the same feature in some open source video players one day.
    New Edge and Amazon video app support it too.

    Power saving is also by far the least interesting aspect; you can set a fixed lower refresh rate yourself (or let a script do it automatically). Perfect frame presentation on screen without drawbacks is far more difficult to achieve and thus the much more interesting part of this feature...

    Intel supported automatically throttled output transmission to the display before VRR, and even that caused flickering issues. Variable refresh rate outside of games/video often (always?) causes more issues than the perhaps 0-2 W of power savings are worth.
    Last edited by aufkrawall; 22 February 2021, 09:12 AM.



  • Brisse
    replied
    Originally posted by thxcv View Post
    Is VRR available for video/desktop already, or still only gaming? I remember there were talks that use cases other than gaming aren't yet covered.
    There is a use case for full screen video playback, especially for videos with frame-rates that normally require interpolation or 3:2 pull-down to fit onto a computer monitor. Using VRR can both enable smoother video playback while at the same time lowering power consumption. The (otherwise useless) built in video player in Windows 10 does this, but that's the only one I know of. Hopefully we'll see the same feature in some open source video players one day.
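    The 3:2 pull-down mentioned above can be made concrete with a worked example: 24 fps film on a fixed 60 Hz monitor must alternate 3-refresh and 2-refresh holds (uneven cadence, i.e. judder), while a rate that is an integer multiple of 24 holds every frame equally, which is what VRR achieves. The numbers are standard video arithmetic; the helper itself is made up for illustration.

```python
import math

def pulldown_pattern(film_fps=24, panel_hz=60, frames=4):
    """Refreshes spent showing each film frame under fixed-rate scan-out."""
    per_frame = panel_hz / film_fps          # 2.5 refreshes per film frame
    shown, pattern = 0, []
    for i in range(1, frames + 1):
        total = math.ceil(i * per_frame)     # cumulative refreshes used so far
        pattern.append(total - shown)
        shown = total
    return pattern

print(pulldown_pattern())        # [3, 2, 3, 2] -- uneven hold times = judder
print(pulldown_pattern(24, 48))  # [2, 2, 2, 2] -- even cadence, like VRR at 48 Hz
```

    With VRR the panel can simply scan out at 24 or 48 Hz, so every film frame gets the same hold time and the pull-down pattern disappears.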



  • oiaohm
    replied
    Originally posted by piotrj3 View Post
    VRR simply means that, within a certain refresh rate range, you can send an entire frame straight away and the monitor adjusts its refresh rate in real time, making the frame appear immediately; since it's a full frame, it doesn't suffer from tearing. Higher or lower Hz output doesn't matter; all that matters is how many FPS the card produces, and VRR+Vsync produces the same number of frames as Vsync alone.
    This turns out not to be true for the majority of monitors. A lot of monitors don't adjust their internal refresh rate; instead, when you send a frame it gets assigned to the monitor's next internal vsync. So a 240 Hz VRR monitor being fed 60 FPS displays each frame 4 times. Early VRR monitors did try to adjust their refresh rate continuously, but then you end up with unpredictable brightness problems, i.e. static and dynamic flicker. The methods that fix those mean the monitor is not really dynamically changing its refresh rate; it just pretends to.

    Originally posted by piotrj3 View Post
    FPS card produces, and VRR+Vsync produce same number of frames as V-sync alone.
    That does not turn out to be true on AMD cards. Nvidia does its video output differently from AMD, so it does not affect the core GPU silicon temperature as much.



  • LinAGKar
    replied
    Is this about windowed VRR, or does VRR not work at all on Wayland currently?

