VRR, Lower Latency Likely Coming For KDE's KWin Wayland Compositor
-
Originally posted by thxcv: Is VRR available for video/desktop already, or still only for gaming? I remember there was talk that use cases other than gaming aren't yet covered.
-
Originally posted by SethDusek: This is true, but no game runs at a perfect framerate. Adaptive sync *does* help when you miss a frame: instead of missing it completely, the display lowers its refresh rate, which reduces frame judder. If you don't believe me, look at this XDC talk by an AMD developer.
Lowering the frame judder sent from the GPU does not mean you have fixed the judder displayed to the user, because the monitor's own controller can put the judder you removed at the GPU straight back in. It takes two to tango here.
The max refresh rate of the screen sets your best possible latency. Adaptive sync helps you reduce missed frames by saving on power, so the GPU can clock higher.
Sorry to say, that AMD XDC talk is mostly right, but on the judder problem it is very deceptive. Most of the apparent judder improvement comes from the power saving of VRR allowing the GPU to clock slightly higher.
VRR is not improving judder the way people like you think, SethDusek. The AMD developers missed that the slightly higher GPU clocks were themselves caused by VRR: the lower refresh rate results in better average frame render times. Better cooling on the GPU could show the same gains.
The latency-measuring tools Nvidia made have been useful for catching some of these cases where people think they have improved something by improving the signal, only for the controller inside the monitor to ruin it again.
The catch is that when your GPU is driving the monitor at its max refresh rate, the GPU can do what many monitors do to level out brightness. The issue is the heat generated in the GPU by producing a 120/240 Hz signal: is that heat worth it when the game can only render 60 frames per second and the heat is reducing GPU clock speed?
Now here is the trick you walk into: a person has a 120 Hz monitor and decides to run it at a fixed 60 Hz; compared to VRR on the same monitor, there can be a judder improvement. Why would the user underclock a 120 Hz monitor down to a fixed 60 Hz? To save on heat in the CRTC (display output) part of the GPU and get higher GPU clock speeds.
VRR is about saving power in the CRTC part of the GPU so you can have higher GPU clock speeds for the other processing, with the same amount of cooling, when you cannot reach the monitor's max refresh rate.
Basically, it is not helping you the way you think it is. Power usage and GPU clock speeds show the big picture here.
Yes, it is easy to forget that when a GPU has a static image on screen, it just puts the same buffer out over and over again, but that is in fact cutting into your TDP budget. The amount of improvement VRR gives you is directly linked to how good your card's cooling is.
- Likes 1
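[Editor's note] The judder argument above can be illustrated with a toy simulation (my own sketch, not from the thread or the XDC talk; all frame times are invented): frames that occasionally miss a fixed 60 Hz vsync get quantized onto the vsync grid, while an idealized VRR path shows each frame the moment it is ready. The spread of on-screen intervals is a crude judder measure.

```python
import statistics

def display_times_fixed(render_done, refresh_hz):
    """Each finished frame is shown on the next fixed vsync tick after it is ready."""
    period = 1.0 / refresh_hz
    return [(int(t / period) + 1) * period for t in render_done]

def display_times_vrr(render_done):
    """Idealized VRR: the frame is scanned out as soon as it is ready."""
    return list(render_done)

# Hypothetical frame times: mostly ~16 ms, with occasional 20 ms spikes
# that miss a 60 Hz (~16.7 ms) vsync window.
frame_ms = [16, 16, 20, 16, 16, 20, 16, 16]
render_done, t = [], 0.0
for ms in frame_ms:
    t += ms / 1000.0
    render_done.append(t)

for name, times in (("fixed 60 Hz", display_times_fixed(render_done, 60)),
                    ("idealized VRR", display_times_vrr(render_done))):
    intervals = [b - a for a, b in zip(times, times[1:])]
    print(f"{name}: interval stddev {statistics.stdev(intervals) * 1000:.2f} ms")
```

With these made-up numbers the fixed-vsync path shows a noticeably larger interval spread than the VRR path, which is the GPU-side half of the story; as the post argues, the monitor's controller can re-quantize on its side.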
-
Originally posted by piotrj3: VRR simply means that within a certain refresh rate range you can send an entire frame straight away, and the monitor adjusts its refresh rate in real time, making the frame appear immediately; since it is a full frame, it does not suffer from tearing. Higher/lower Hz output doesn't matter; all that matters is how many FPS the card produces, and VRR+Vsync produces the same number of frames as Vsync alone.
-
Originally posted by thxcv: Is VRR available for video/desktop already, or still only for gaming? I remember there was talk that use cases other than gaming aren't yet covered.
- Likes 1
-
Originally posted by Brisse: The (otherwise useless) built-in video player in Windows 10 does this, but that's the only one I know of. Hopefully we'll see the same feature in some open-source video players one day.
Power saving is also by far the least interesting aspect; you can set a fixed lower refresh rate yourself (or let a script do it automatically). Perfect frame presentation on screen without drawbacks is far more difficult to achieve, and is thus the much more interesting part of this feature.
Intel supported automatically throttled output transmission to the display before VRR, and even that caused flickering issues. Variable refresh rate outside of games/video often (always?) causes more issues than the perhaps 0-2 W of power savings are worth.
Last edited by aufkrawall; 22 February 2021, 09:12 AM.
- Likes 2
-
Originally posted by oiaohm:
This turns out not to be true for the majority of monitors. A lot of monitors don't adjust their internal refresh rate; instead, when you send a frame, it gets assigned to the next vsync of the monitor's internal clock. So a 240 Hz VRR monitor being fed 60 frames per second displays each frame 4 times. Early VRR hardware did attempt to adjust its refresh rate continuously, but then you end up with unpredictable brightness problems, as in static flicker and dynamic flicker. Yes, the method that fixes those means the monitor is not really dynamically changing its refresh rate; it just pretends to.
That does not turn out to be true on AMD cards. Nvidia does its video output differently from AMD, so it does not affect the core GPU silicon temperature as much.
- Likes 1
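[Editor's note] The quantization described above can be sketched like this (a toy model of the firmware behaviour the post assumes, not a spec): a panel that secretly runs at a fixed internal rate shows each incoming frame on its next internal vsync and repeats it until the next frame arrives.

```python
def scanouts_per_frame(internal_hz, source_fps):
    """How many internal panel refreshes each source frame occupies,
    assuming an ideal cadence and an even divisor."""
    return internal_hz // source_fps

def panel_display_time(frame_ready_s, internal_hz):
    """The frame is shown on the next tick of the panel's fixed internal clock,
    so presentation is still quantized -- just on a much finer grid."""
    period = 1.0 / internal_hz
    return (int(frame_ready_s / period) + 1) * period

print(scanouts_per_frame(240, 60))      # 4: each 60 fps frame is shown 4 times
print(panel_display_time(0.0101, 240))  # a 10.1 ms frame waits for the 12.5 ms tick
```

On a 240 Hz internal clock the residual quantization error is under ~4.2 ms per frame, which is why the repeat-frame trick still looks much smoother than a fixed 60 Hz grid even though the panel never truly changes speed.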
-
Originally posted by oiaohm:
Really, no. Adaptive sync/VRR does not make your LCD panel any faster than its top speed, so VRR itself does not directly fix latency. The real issue is a completely different problem.
The problem is watts of power consumed. Sending a 30 Hz signal uses less GPU power than sending 120 Hz or faster. So if the game/program cannot reach the panel's max speed, is it worth driving the panel at max speed anyway? A GPU running a lower refresh rate in the CRTC (the output-generation circuitry) generates less heat, and that gives the compute sections of the GPU more thermal headroom to run at a higher frequency before being thermally throttled.
Another reason you might want VRR is on a mobile phone or laptop, to save power.
Adaptive sync and VRR are more about power usage and heat generation than performance. Yes, the lower heat generation in the CRTC parts can result in a few extra frames in some games, because the rest of the GPU can clock faster thanks to the slightly lower heat.
Remember, if the game/program can keep up with the screen's max speed and you have VRR on, it will in fact sit at the screen's max frame rate, so the game would behave no differently than with VRR off and a fixed refresh rate.
A lot of people hold the mistaken belief that VRR improves latency without understanding how, and because of the how, the improvement is almost nothing. The only way VRR improves latency is that the CRTC part of the GPU produces less heat by running at a lower output speed, letting the GPU's processing parts run at a slightly higher clock speed. The latency improvement from VRR turns out to be close to nothing: very hard to pick out from run-to-run variation, and absolutely too small to be humanly noticeable in most cases.
VRR should be very important to laptop users for battery life.
- Likes 4
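[Editor's note] The power-budget argument above can be put in numbers with a toy model. Every constant here is invented for illustration (real scanout power draw and clock/power curves differ per card): watts freed by a slower CRTC/scanout clock become watts a power-limited GPU can spend on core frequency.

```python
# Hypothetical constants -- not measured values.
SCANOUT_W_PER_HZ = 0.02   # watts the display engine burns per Hz of output signal
CLOCK_MHZ_PER_W = 8.0     # extra sustained core MHz per freed watt (toy linear model)

def scanout_watts(refresh_hz):
    """Power spent by the display output at a given signal rate (toy model)."""
    return SCANOUT_W_PER_HZ * refresh_hz

def extra_core_mhz(fixed_hz, vrr_hz):
    """Core clock headroom gained by scanning out at vrr_hz instead of fixed_hz."""
    freed_w = scanout_watts(fixed_hz) - scanout_watts(vrr_hz)
    return CLOCK_MHZ_PER_W * freed_w

print(round(extra_core_mhz(240, 60), 1), "MHz")  # 28.8 MHz with these toy constants
```

A few tens of MHz is in line with the post's claim: a real effect, but small enough to disappear into run-to-run variation.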