Proposed GNOME Patches Would Switch To Triple Buffering When The GPU Is Running Behind
Originally posted by caligula
Triple buffering should provide better and more consistent performance and latencies overall without increasing the computational load. It just requires a bit more memory for the extra buffer.
https://www.youtube.com/watch?v=seyAzw9zEoY Tech Focus - V-Sync: What Is It - And Should You Use It?
Originally posted by gens
Triple buffering is basically rendering all the time and showing only the (theoretically) latest drawn frame. It increases the load on the system as much as turning vsync off does. While it will provide more fps when the GPU is slow, the result will be jittery, and the latency has a good chance of being higher. Think about the timings; it should become clearer after a while.
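The scheme gens describes can be turned into a small timing simulation (an illustrative model only, not GNOME's actual code; `REFRESH`, `RENDER` and `frame_ages` are made-up names): the renderer never waits for a free buffer, and each vsync shows the newest completed frame. The "age" of the frame at each vsync is its latency, and it bounces around:

```python
REFRESH = 16.67   # ms per refresh interval at 60 Hz
RENDER  = 22.22   # ms per frame: a GPU running at 45 fps, behind the display

def frame_ages(n_vsyncs):
    """Age (ms) of the frame shown at each vsync under triple buffering:
    the renderer draws continuously and the newest finished frame wins."""
    finished, t = [], 0.0
    while t < n_vsyncs * REFRESH:
        t += RENDER
        finished.append(t)     # a free back buffer is always available
    ages = []
    for i in range(1, n_vsyncs + 1):
        vsync = i * REFRESH
        done = [f for f in finished if f <= vsync]
        if done:
            ages.append(round(vsync - done[-1], 2))
    return ages

print(frame_ages(8))   # ages swing between ~0 and a full refresh, i.e. jitter
```

The spread between the smallest and largest age is nearly a full refresh interval, which is the latency jitter being argued about above.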
https://www.youtube.com/watch?v=seyAzw9zEoY Tech Focus - V-Sync: What Is It - And Should You Use It?
https://twitter.com/ID_AA_Carmack/st...11153509249025
Originally posted by discordian
I really don't get how you can tax a modern GPU, even an iGPU, with compositing 2D images (basically no overdraw). 4K at 60 Hz is roughly 500 MPixel/s; that's an early-2000s level of performance.
I'd understand if some windows aren't drawn that quickly, but that's not what this addresses, is it?

The problem they are talking about is that the GPU clocks drop, which is a... funny problem. Their solution is not good, as expected.
I don't know how they draw things (shadows, blurs, and such), but I suspect they don't use all the tricks of the rendering trade and could reduce their rendering time.
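The throughput figure in the quote is easy to sanity-check (assuming "4k" means 3840×2160 and a single full-screen pass with no overdraw):

```python
# Pixels a compositor must write per second for a 3840x2160 desktop
# at 60 Hz with no overdraw, the figure discordian is estimating.
width, height, hz = 3840, 2160, 60
mpix_per_s = width * height * hz / 1e6
print(mpix_per_s)   # 497.664, i.e. roughly 500 MPixel/s
```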
Originally posted by CochainComplex
Tongue in cheek... why don't we use a game engine to render the desktop?
Originally posted by caligula
Your links are talking about game performance. On the desktop, vsync is typically turned on, and the jitter comes from the fact that the rendering pipeline can't deliver the next frame on time. You don't have the same issues as in games, since you know beforehand how to render the next few frames; on the desktop, most applications produce deterministic output. In a game, the jitter comes from the delays between the game state and what's being rendered on screen. For example, turning vsync off might be useful there.
In some games the jitter comes from the render -> simulate -> render loop, but not in all, and not in most modern games.
https://gafferongames.com/post/fix_your_timestep/ Here are the basics.
https://www.youtube.com/watch?v=_zpS1p0_L_o More advanced.
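The "fix your timestep" idea in the first link can be sketched as an accumulator loop. This is a toy illustration, not code from any real engine; `DT`, `run` and the 0.022 s render stand-in are all made up:

```python
import time

DT = 1.0 / 120.0   # fixed simulation step (120 Hz logic), an illustrative choice

def run(frames, render_time=0.022):
    """Toy fixed-timestep loop: game state advances in fixed DT steps
    regardless of how long each rendered frame takes."""
    accumulator = 0.0
    sim_steps = 0
    prev = time.monotonic()
    for _ in range(frames):
        now = time.monotonic()
        accumulator += now - prev
        prev = now
        while accumulator >= DT:   # catch the simulation up to wall-clock time
            sim_steps += 1         # a real engine would call step_game_state(DT)
            accumulator -= DT
        time.sleep(render_time)    # stand-in for rendering one frame
    return sim_steps

print(run(10))
```

The point of the pattern is that slow or uneven rendering changes how many simulation steps run per frame, not how long each step is, which keeps the game state deterministic.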
There are ways to know the hardware timings and adapt for lower latency.
A game might do it, and a WM might do it. A game can look at its past frames and predict how long rendering will take; a WM can't, because something else might be burdening the GPU (a game is usually fullscreen, and the only thing rendering).
My suggestion to everybody is to do your best and assume everything conservatively.
Think about the timing.
A 60 Hz display refreshes every 16.67 ms.
The worst-case rendering that could "benefit" from triple buffering is 45 fps, that is, 22.22 ms per frame.
Take a pencil and notepad and do some simulating, like: "first frame is dropped, second frame is x ms behind, third frame is x ms behind, fourth frame is dropped", and so on. Do it for, say, 20 ms, 22.22 ms and 25 ms render times.
And then you might realize why it's jittery.
Then think about how the commands from the CPU get to the GPU, and then... it's a lot of topic, and there are a lot of resources for you to read. It's also a bit of fun, so I do recommend it.
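The pencil-and-paper exercise above can also be done in a few lines of code. This is a deliberately simplified model (names like `durations` are made up, and real swapchains are more complicated): with double buffering the renderer blocks until the flip, with triple buffering it runs freely and each finished frame appears at the next vsync:

```python
import math

REFRESH = 16.67   # ms per refresh interval at 60 Hz
RENDER  = 22.22   # ms per frame, the "45 fps" worst case above

def durations(triple_buffered, frames=12):
    """How many refresh intervals each frame stays on screen (vsync on)."""
    shown = []       # vsync times at which a new frame appears
    t = 0.0
    for _ in range(frames):
        t += RENDER                               # frame finishes here
        vsync = math.ceil(t / REFRESH) * REFRESH  # next flip opportunity
        shown.append(vsync)
        if not triple_buffered:
            t = vsync   # double buffering: renderer blocks until the flip
    return [round((b - a) / REFRESH) for a, b in zip(shown, shown[1:])]

print("double:", durations(False))   # steady 2-refresh cadence, locked 30 fps
print("triple:", durations(True))    # 1, 1, 2, 1, 1, 2, ... uneven pacing
```

In this model, double buffering quantizes a 22.22 ms renderer down to a consistent 30 fps, while triple buffering averages 45 fps but shows frames for alternately one and two refreshes, which is exactly the jitter described above.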
Originally posted by gens
How do I explain this to you...
In some games..