It Looks Like AMD Will Support FreeSync With Their New Linux Display Stack
-
FreeSync sounds really great. I didn't know how interesting or important it was before this code drop and the exchanges in the forums. My biggest takeaway, however, is that the AMD guys seem to understand what they are doing and are ready to do everything needed to get this merged. I approve of that attitude. Hopefully together they will get this merged as soon as humanly possible without damaging the integrity of the code base.
-
Originally posted by haagch View Post
Intel doesn't care about PRIME. Try playing a game with Intel DRI3 using PRIME. It's completely unplayable. Intel knows that cross-device synchronization is missing in their driver; they just don't do anything about it.
-
Originally posted by chuckula View Post
No, because FreeSync and G-Sync address *physical* tearing while Wayland addresses *logical* tearing. Physical tearing occurs when V-Sync is turned off and the monitor refreshes part of the way through an update from the GPU, which is below Wayland's level. Wayland is focused on making sure that the final product of the graphics generation and composition process placed into the GPU's frame buffer for output is "perfect", in the sense that the rendered graphics themselves don't include tears [assuming the GPU/display doesn't introduce tears due to timing issues].
Of course, you might have noticed that I mentioned old-fashioned V-Sync above. Traditional V-Sync has its own problem: it eliminates tearing but replaces it with frame stutter whenever the monitor's physical refresh rate doesn't line up with the rate at which the GPU is actually producing new frames. G-Sync and FreeSync fix both the tearing problem and the stutter problem, which goes beyond the capabilities of traditional V-Sync.
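A rough way to see the stutter problem described above: with traditional fixed-rate V-Sync, a finished frame is only shown at the next refresh boundary, so frame pacing gets quantized to multiples of the refresh interval. A minimal sketch (the 20 ms render time and 60 Hz panel are illustrative numbers, not from any specific hardware):

```python
import math

# Sketch: with fixed-rate V-Sync a finished frame is only shown at the
# next refresh boundary, so frame pacing is quantized to multiples of
# the refresh interval.

def vsync_display_times(render_times_ms, refresh_hz=60):
    """Timestamps (ms) at which each frame actually appears on screen."""
    interval = 1000.0 / refresh_hz      # 16.7 ms at 60 Hz
    ready = 0.0                         # when the current frame finished rendering
    shown = []
    for t in render_times_ms:
        ready += t
        # wait for the next refresh boundary at or after `ready`
        shown.append(math.ceil(ready / interval) * interval)
    return shown

# A GPU producing a frame every 20 ms (50 fps) on a 60 Hz panel:
times = vsync_display_times([20.0] * 6)
gaps = [round(b - a, 1) for a, b in zip(times, times[1:])]
print(gaps)   # -> [16.7, 16.7, 16.7, 16.7, 33.3]
```

The uneven gaps (mostly one refresh interval, occasionally two) are exactly the stutter: the content advances at 50 fps but is displayed at an irregular 60/30 Hz mix. An adaptive-sync panel would instead just refresh every 20 ms.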
I thought G-Sync handled it differently through its module: instead of toggling V-Sync on/off, it draws extra frames and increases the refresh rate when the frame rate drops below 40 fps, and matches frames like FreeSync above 40 fps, which is claimed to eliminate tearing/stutter completely, unlike FreeSync.
-
Originally posted by DDF420 View Post
With FreeSync, I thought the refresh rate stayed static at 40 Hz for all frame rates below 40 fps; anything above that is matched Hz-to-frame up to the monitor's maximum. If a monitor's max is 144 Hz and the frame rate goes higher than that, V-Sync kicks in. If you have V-Sync on and frames drop below 40, V-Sync kicks in too. So you still have the potential for tearing or stutter.
I thought G-Sync handled it differently through its module: instead of toggling V-Sync on/off, it draws extra frames and increases the refresh rate when the frame rate drops below 40 fps, and matches frames like FreeSync above 40 fps, which is claimed to eliminate tearing/stutter completely, unlike FreeSync.
Since November AMD has implemented frame doubling, so if your monitor's upper FreeSync bound is 2.5 times the lower one, the lower bound does not matter. Same with NVIDIA, though I do not know whether they double their frames in software or hardware. Once you are above the monitor's maximum refresh rate, you either have V-Sync off, which results in tearing (and somewhat lower latency/input lag), or you have it on, which results in a tear-free display (with an input latency of at most 16 ms at 60 Hz). So if you stay above the monitor's refresh rate, both adaptive sync technologies don't differ from non-adaptive-sync approaches.
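The frame-doubling idea (often called low-framerate compensation) comes down to simple arithmetic: if the frame rate falls below the panel's minimum variable-refresh rate, each frame can be repeated n times so the effective panel refresh lands back inside the supported window. A hypothetical sketch (the 40-144 Hz range is an example window, not a specific monitor, and real drivers are more sophisticated about timing):

```python
# Sketch of low-framerate compensation (frame doubling/multiplication):
# repeat each frame enough times that the panel refresh rate stays
# inside its variable-refresh window.

def lfc_refresh(fps, vrr_min=40, vrr_max=144):
    """Return (repeat multiplier, effective panel refresh in Hz),
    or None if the rate can't be mapped into the window."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)   # already in range: match it (cap at max)
    n = 2
    while fps * n < vrr_min:          # multiply until we clear the lower bound
        n += 1
    if fps * n > vrr_max:
        return None                   # window too narrow for this frame rate
    return n, fps * n

print(lfc_refresh(25))   # -> (2, 50): 25 fps shown twice, panel runs at 50 Hz
print(lfc_refresh(15))   # -> (3, 45): 15 fps shown three times, 45 Hz
```

This also shows why the ratio between the bounds matters: the window has to be wide enough that some multiple of every low frame rate fits inside it, which is why a sufficiently large max/min ratio makes the lower bound effectively irrelevant.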
-
Originally posted by Namenlos View Post
Since November AMD has implemented frame doubling, so if your monitor's upper FreeSync bound is 2.5 times the lower one, the lower bound does not matter. Same with NVIDIA, though I do not know whether they double their frames in software or hardware. Once you are above the monitor's maximum refresh rate, you either have V-Sync off, which results in tearing (and somewhat lower latency/input lag), or you have it on, which results in a tear-free display (with an input latency of at most 16 ms at 60 Hz). So if you stay above the monitor's refresh rate, both adaptive sync technologies don't differ from non-adaptive-sync approaches.
-
The problem is that without adaptive sync you have a fixed panel refresh interval, every 16.7 ms at 60 Hz. If you drop below 60 fps you have to wait a whole extra frame, so for a short time you are effectively at 30 Hz/30 fps. You notice that because, without motion blur like in movies, 30 fps is not quite enough. Adaptive sync technologies fix that in a completely sane way, while 120 Hz only masks some of the problems. I don't think input lag is a real problem at 60 fps and above, at least for me. But of course more is better.
edit: You can run FreeSync (and probably G-Sync too) with or without V-Sync.
Last edited by Namenlos; 14 February 2016, 01:01 PM.
-
Originally posted by Zan Lynx View Post
If the price tag was $100 more than a similar monitor, then you probably have it. :-P
Review at PCPer
-
Originally posted by chithanh View Post
I know this was a joke, but actually some bargain monitors like the Wasabi Mango UHD420 (one of the cheapest 42" UHD monitors on the market) are FreeSync-capable with recent firmware.
Review at PCPer