Jittery animation is caused by the PC being unable to maintain a suitable framerate. Current fixed-refresh monitors make that worse by imposing an additional restriction on the definition of "suitable" if you want to prevent tearing: if a frame is late by even a nanosecond, it has to be either discarded or delayed until the next refresh.
Adaptive sync makes animation better because you get the performance of non-VSync rendering with the tear-free visuals of VSync rendering.
Keep in mind that Adaptive-Sync has been around in the eDP standard since 2009, and it certainly didn't make laptop displays cost $50-100 more.
Adaptive-Sync sounds like a great standard (the end of video tearing). To get it working, both the monitor and graphics card have to support DP 1.2a, and the graphics driver has to support a variable VBI. In AMD's case, they will start supporting it with GCN 1.1 GPUs (Hawaii and Bonaire; no word on GCN 1.0 cards yet). In Nvidia's case, I don't see any reason why it couldn't be supported starting with Kepler cards, if it's easy enough to implement at the driver level in addition to G-Sync.
It always annoys me when people say "performance penalty of V-Sync". Seriously, it's like 47 vs. 47 or 32 vs. 30. It isn't that big of a performance penalty.
That being said, it's nice to see that this will be available in many monitors in the future, and for non-Nvidia systems.
I'm curious: what changes will have to be made software-wise to make it work?
Will it just be a change in the driver code?
Or will the games have to be modified to take advantage of it?
Nothing needs to be modified in client-side code. SwapBuffers just suddenly stops blocking.
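To illustrate the point, here's a toy Python sketch of what "SwapBuffers stops blocking" means from the application's side. The `render_frame`/`swap_buffers` names and the sleep-until-vblank model are my own assumptions for illustration, not real driver code; the game loop is identical either way, only the driver-side behavior changes.

```python
import time

def render_frame():
    # Stand-in for the game's per-frame work (hypothetical workload).
    time.sleep(0.003)

def swap_buffers(vsync_interval=None):
    """Toy model of a driver's SwapBuffers call.

    With a fixed-refresh display (vsync_interval set), the call blocks
    until the next vblank. With adaptive sync, it returns immediately,
    because the monitor refreshes whenever the frame arrives.
    """
    if vsync_interval is not None:
        now = time.monotonic()
        next_vblank = (now // vsync_interval + 1) * vsync_interval
        time.sleep(next_vblank - now)  # wait out the rest of the interval

# The "client-side" loop never changes, whichever mode the driver is in:
for _ in range(3):
    render_frame()
    swap_buffers(vsync_interval=None)  # adaptive sync: no blocking
```

The game keeps calling the same API in the same place; it just stops paying the wait-for-vblank tax.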
Hope HDMI copies this feature, or Intel NUCs start shipping with a DisplayPort.
It's snapping frame times to multiples of the refresh interval, so it magnifies even momentary drop-offs by skipping the associated frames. (E.g., instead of 30 30 29 30, it's 30 30 15 30, because the frame that was a hair late has to be delayed until the next refresh.)
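That snapping effect can be reproduced with a small simulation. This is a sketch under assumed conditions (a 30 Hz fixed-refresh display and a third frame that renders a hair slow); the function names are hypothetical, not from any real API.

```python
import math

REFRESH_HZ = 30.0            # assumed fixed-refresh display
INTERVAL = 1.0 / REFRESH_HZ  # one refresh, ~33.3 ms

def present_times(render_times, adaptive):
    """Return the wall-clock time at which each frame reaches the screen.

    Fixed refresh: a finished frame waits for the next vblank, so a frame
    that misses it by a hair slips a whole extra refresh interval.
    Adaptive sync: the monitor refreshes whenever the frame is ready.
    """
    presents, t = [], 0.0
    for rt in render_times:
        t += rt  # frame finishes rendering
        if not adaptive:
            # Snap up to the next vblank (epsilon guards float rounding).
            t = math.ceil(t / INTERVAL - 1e-9) * INTERVAL
        presents.append(t)
    return presents

def instantaneous_fps(presents):
    deltas = [b - a for a, b in zip([0.0] + presents, presents)]
    return [round(1.0 / d) for d in deltas]

frames = [1/30, 1/30, 0.0345, 1/30]  # third frame is slightly late
print(instantaneous_fps(present_times(frames, adaptive=False)))  # [30, 30, 15, 30]
print(instantaneous_fps(present_times(frames, adaptive=True)))   # [30, 30, 29, 30]
```

One frame that's about 1 ms late costs a whole refresh interval on a fixed-refresh display (30 → 15), while adaptive sync only shows the true, barely noticeable slowdown (30 → 29).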