Ubuntu's Mir May Be Ready For FreeSync / Adaptive-Sync
Originally posted by juno View Post: Although I appreciate the FOSS efforts, it's sad how long we have to wait for features that have worked on Windows for a long time.
Remember, Windows ships with a finalized driver model in each release, and the graphics stack hardly ever changes (especially if a change could break the old Win32 API layers that 70-90% of Windows software still uses), whereas Linux never really had a "graphics stack" at all, just a dumb shadow framebuffer and userspace libraries passing GLX + OpenGL 1.2 commands, until about 8 years ago.
So what does this mean? On Windows you just add the code to the blob's kernel/userspace side and it should work across a set of Windows releases; for the releases where it doesn't, the vendor can simply drop support or get creative with workarounds, depending on how much money those releases bring in. So it is relatively "easy".
On Linux, the kernel and userspace drivers are written at the same time as the graphics stack itself and have to adapt to it almost in real time. This also means desktops, middleware libraries, and applications have to absorb those features in real time too, which translates into an incredibly difficult set of tasks requiring surgical coordination with 20 other teams just to get a feature into a workable state, before optimization can even start. It is really, really difficult.
The good news is that the Linux graphics stack is finally approaching a globally "usable state". Once everything reaches feature parity, the optimization process will begin, and from then on new hardware features will enter the stack a lot faster, since the work will reduce to writing that one feature, as on Windows.
Why has this taken so much time on Linux? It is actually comparable to Windows itself; the only difference is that you don't experience it there, since it's closed source: you just get it magically whenever Microsoft thinks it's ready enough (remember that Vista introduced WDDM 1.0, but it took until Windows 7's WDDM 1.1 to actually work properly). The problem is that Linux didn't have a stable previous-generation graphics stack, so you had no choice but to jump onto the new one as it was developed, whereas Windows kept the NT model working while WDDM was developed for years. That's why the transition never felt as hard there as it does on Linux.
Originally posted by andre30correia View Post
normal, this site now has censorship
Originally posted by Zan Lynx View Post: Testing, testing... I posted a response several minutes ago and it has apparently just disappeared.
Michael Larabel
https://www.michaellarabel.com/
Originally posted by Zan Lynx View Post: Pretty sure that isn't how it works at all. Not with G-Sync anyway. Perhaps I just assumed Adaptive Sync was just as good. Maybe it isn't.
Vsync is "adapt the video stream to the screen's refresh rate" (i.e. send duplicate frames or clone content as needed), which is the other way around, and of course puts more stress on the GPU.
If the display hardware has other problems with it, the G-Sync hardware in the monitor smooths it away. The GPU / computer side never sees a problem.
G-Sync, AMD's FreeSync, and VESA's Adaptive-Sync all require a screen that can change refresh rate dynamically, because this "smooths it away" magic happens simply by adjusting the screen's refresh rate on the fly to match the content.
i.e. the GPU outputs 23.89 frames per second? No problem: a message is sent, the screen's refresh rate switches to 23.89 Hz, and everything is as smooth as possible, with no dropped frames, no half-rendered frames, no doubled frames.
The differences between the various standards mostly come down to how they actually make this refresh-rate change happen, how precise they are, and so on. Different implementations of the same general concept, not different technologies.
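The contrast between fixed vsync and the adaptive behavior described above can be sketched in a toy model. This is purely illustrative: the function names and the panel/refresh numbers are made up for the example, not any real driver API.

```python
import math

def present_fixed_vsync(finish_time_s, refresh_hz=60.0):
    """Fixed refresh: a finished frame must wait for the next scanout tick."""
    period = 1.0 / refresh_hz
    return math.ceil(finish_time_s / period) * period

def present_adaptive(finish_time_s, last_scanout_s, panel_max_hz=144.0):
    """Adaptive sync: the panel scans out as soon as the frame is ready,
    limited only by the panel's maximum refresh rate."""
    return max(finish_time_s, last_scanout_s + 1.0 / panel_max_hz)

# A frame finishes rendering 41.9 ms into the second.
print(present_fixed_vsync(0.0419))      # waits for the 50 ms tick of a 60 Hz scanout
print(present_adaptive(0.0419, 0.020))  # shown immediately, at 41.9 ms
```

The adaptive case is why there are no doubled or dropped frames: the display simply follows the GPU instead of the GPU chasing a fixed clock.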
Originally posted by starshipeleven View Post: G-Sync, AMD's FreeSync, and VESA's Adaptive-Sync all require a screen that can change refresh rate dynamically, because this "smooths it away" magic happens simply by adjusting the screen's refresh rate on the fly to match the content.
In fact, the whole concept of frames is a little old-fashioned when it comes to modern video display technology (witness Vulkan swapchains, or NVIDIA's EGLStreams driver). On a modern personal computer with multiple monitors and VR goggles, a single framerate on the rendering and display side makes little sense, and these new technologies in the drivers are trying to address that, giving you a better experience.
The things holding the better experience back right now are (1) the assumption by games and other applications that there is a single CRT-style display drawing one frame at a time, and (2) legacy media formats that assume a fixed frame rate (e.g. 24 frames per second for filmed movies, 29.97 frames per second for NTSC television and DVDs). Progress is slow and steady and not in great demand, because most consumers are unaware of what they're missing, having never had it.
Don't worry: by the time we start having consumer-level content that demands the new display technology, the full graphics stack will be in place to support it.
Originally posted by starshipeleven View Post: i.e. the GPU outputs 23.89 frames per second? No problem: a message is sent, the screen's refresh rate switches to 23.89 Hz, and everything is as smooth as possible, with no dropped frames, no half-rendered frames, no doubled frames.
With G-Sync, the module inside the monitor does the doubling when the refresh rate gets too low. FreeSync on Windows uses what AMD calls LFC, Low Framerate Compensation: the driver detects the condition and has the display engine output the same frame multiple times if necessary. I assume this could be a bit tricky to implement when frame times are fluctuating a lot.
Last edited by juno; 30 August 2016, 08:24 AM.
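The frame-repeating fallback described here can be sketched roughly as picking the smallest repeat count that keeps the panel inside its variable-refresh range. The panel limits below are made-up example numbers, not any real monitor's specs, and this is only the idea behind LFC, not AMD's actual algorithm.

```python
def lfc_multiplier(content_fps, panel_min_hz=40.0, panel_max_hz=144.0):
    """Smallest repeat count that lands the refresh rate inside the
    panel's supported variable range (a rough sketch of LFC's idea)."""
    m = 1
    while content_fps * m < panel_min_hz:
        m += 1
    if content_fps * m > panel_max_hz:
        raise ValueError("content rate cannot be mapped into the panel's range")
    return m, content_fps * m

# 23.98 FPS content on a 40-144 Hz panel: each frame is shown twice,
# so the panel runs at ~47.96 Hz.
print(lfc_multiplier(23.98))
# 60 FPS is already inside the range: no repetition needed.
print(lfc_multiplier(60.0))
```

The tricky part juno alludes to is that real frame times fluctuate, so a driver has to predict the next frame's arrival rather than work from a steady content_fps like this sketch does.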
Originally posted by juno View Post: Basically, you are right, of course. But most panels don't support refresh rates as low as this: they start flickering or display other errors, which is why there are still doubled/tripled/... frames. But it is no longer a problem, of course, as there is no stuttering caused by it. Just output every frame twice, and you see 23.98 FPS while the screen refreshes at 47.95 Hz.
All adaptive sync systems are supposed to support anything from 2 Hz up to somewhere in the triple digits; if the hardware cannot do that, they resort to tricks like the above (still using the flexible refresh rate to their advantage). But that's a reaction to stated hardware limits: if (sometime in the future) you connect a screen that can do 10 Hz fine, they will just send 10 Hz.
Originally posted by juno View Post: I assume this could be a bit tricky to implement, when frame times are fluctuating much.
There's no "dynamic refresh rate" change. The LCD monitor waits for a new frame on DisplayPort; when it gets one, it displays it. There isn't anything strange about it.
What is actually artificial and unnatural for an LCD is a fixed refresh rate, an artifact left over from CRT displays. An LCD doesn't actually need repeated frames: if it doesn't get new information, it just holds what it has. The need to repeat frames below roughly 30 FPS is a strange artifact related to losing data sync on the cable, with the monitor thinking it got disconnected if the GPU stops sending.
So there's no refresh rate change. There is simply the maximum data rate of the DisplayPort cable, which cannot be exceeded without corruption; the data rate the electronics in the LCD monitor can accept, which might be lower than the cable's; and the rate at which the monitor can twist the LCD elements to a new image. Within those limits, the GPU can send a frame whenever it feels like it.
Laptop, tablet, and phone displays don't use fixed refresh rates either. When a tablet goes idle, about 50 milliseconds after the last image update, the display hardware just stops updating: it doesn't send new frames, it just keeps the LCD / OLED right where it was, and the GPU goes idle. Intel integrated GPUs do this with laptop screens for power savings too.
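Under that description, the only real constraint is a minimum interval between frames, set by whichever limit is tightest, with no maximum at all. A hypothetical sketch (the parameter names and numbers are invented for illustration, not taken from any spec):

```python
def min_frame_interval_s(cable_max_hz, panel_max_hz, pixel_response_s):
    """Earliest a new frame can follow the previous one: bounded by link
    bandwidth, the panel electronics, and the LCD response time."""
    return max(1.0 / cable_max_hz, 1.0 / panel_max_hz, pixel_response_s)

def can_present(interval_since_last_s, cable_max_hz=240.0,
                panel_max_hz=144.0, pixel_response_s=0.004):
    """True if enough time has passed for the monitor to accept a frame.
    Note there is no maximum interval: the panel just holds the image."""
    return interval_since_last_s >= min_frame_interval_s(
        cable_max_hz, panel_max_hz, pixel_response_s)

print(can_present(0.005))  # 5 ms after the last frame: too soon for a 144 Hz panel
print(can_present(0.500))  # half a second later: fine, the LCD simply held the image
```

The asymmetry in can_present is the whole point of the post above: going slower is always allowed, only going faster than the hardware is not.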