Quote: "Unlike NVIDIA's G-Sync, no specialized hardware is required to support FreeSync. This week at Computex, AMD was showing off the first FreeSync capable monitor."

Really dude? That's a contradiction right there. You need no specialized hardware except *cough* a FreeSync-capable monitor with DisplayPort 1.2a or newer, plus a card with the same specs (and I assume an AMD card, unless Nvidia decides they want to support it too).
Standalone monitors, as they are today, typically do not support this technology, meaning you DO need specialized hardware. You just don't need that expensive G-Sync chip. I hope Nvidia responds to this by supporting FreeSync as well.
But judging by the differences, G-Sync is a much more complicated implementation of this, so I'm sure it has benefits that FreeSync does not (not that I care, I prefer FreeSync because it's hardly any extra cost for me to get it).
Last edited by rabcor; 06-06-2014 at 02:16 PM.
I wonder if this will help with Oculus, too.
I think G-sync was meant to go with 120-144Hz monitors, which help with mouse lag.
Correct me if I'm wrong.......but the ONLY advantage this adaptive-synchronization thing offers (compared to normal vertical sync) is that it can reduce your monitor's power usage a little by dynamically lowering the refresh rate when you are viewing low frame-rate content.. Right?..
The last time this was posted on Phoronix, a lot of people were cheering about finally having tear-free games......but that's silly, because v-sync has already done that for eons..
I guess some people just don't understand what they are talking about sometimes.. Oh well.. At least I do..
Adaptive sync lets the video card say: "This time, we're doing 59Hz rather than 60Hz so I can still immediately display this frame I just finished."
It is meant to fix stuttering.
With fixed refresh, if a new frame isn't delivered to the monitor in time for the next refresh, the monitor keeps the old image on screen for two full refresh cycles before eventually jumping forward.
Adaptive sync instead tells the monitor to wait slightly more than 1/60 of a second, because the new frame isn't rendered yet.
This wouldn't be much of a problem if you had a really high refresh rate, or a really fast graphics card.
This is just common sense, and everyone is surprised it wasn't already like this.
The power savings are probably extremely minor.
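To make the stutter argument concrete, here's a rough sketch in Python (my own toy model, not any real driver or monitor API; the 40-144 Hz adaptive range is just an assumed example). It compares when a frame actually appears on screen under fixed 60 Hz v-sync versus adaptive sync, for a frame that misses its deadline by about 1 ms:

```python
# Toy model: time in milliseconds.
FIXED_INTERVAL = 1000.0 / 60    # fixed 60 Hz tick, ~16.67 ms
MIN_INTERVAL = 1000.0 / 144     # assumed fastest panel refresh
                                # (hypothetical 40-144 Hz adaptive range)

def fixed_vsync(render_done_times):
    """Each finished frame waits for the NEXT fixed 60 Hz refresh tick."""
    shown = []
    for t in render_done_times:
        next_tick = (int(t // FIXED_INTERVAL) + 1) * FIXED_INTERVAL
        shown.append(next_tick)
    return shown

def adaptive_sync(render_done_times):
    """The monitor refreshes as soon as a frame is ready,
    limited only by how fast the panel can physically refresh."""
    shown = []
    last_refresh = 0.0
    for t in render_done_times:
        earliest = last_refresh + MIN_INTERVAL
        shown.append(max(t, earliest))
        last_refresh = shown[-1]
    return shown

# A frame that just misses the 16.67 ms deadline, finishing at 17.7 ms:
frames = [17.7, 34.0]
print(fixed_vsync(frames))    # the late frame slips to the 33.3 ms tick
print(adaptive_sync(frames))  # shown at 17.7 ms, the old image is not repeated
```

So with fixed v-sync, missing the deadline by 1 ms costs you a whole extra refresh cycle (the "keep the old image for two frames" case above), while adaptive sync just stretches that one refresh interval slightly.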
Last edited by Micket; 06-07-2014 at 07:01 AM.