Do they have to do anything? It is an open standard. They don't need to sell tacked-on proprietary chips for monitors when the monitors just naturally handle adaptive vsync anyway. It is kind of like Mantle: bring out a closed, arbitrary piece of shit, and motivate the open standards bodies to get off their asses and adopt the necessary technological improvements.
Originally Posted by Vidar
The hardware will require a bigger buffer and probably more processing. I would expect a $50-100 premium to start, with prices stabilizing as more vendors enter the market.
Originally Posted by zanny
You're missing the point.
Originally Posted by amehaye
Jittery animation is caused by the PC being unable to maintain a suitable framerate. Current fixed-refresh monitors make that worse by imposing an additional restriction on the definition of "suitable" if you want to prevent tearing. (The restriction that, if a frame is too late by even a nanosecond, it has to either be discarded or delayed until the next refresh.)
Adaptive sync makes animation better because you get the performance of non-VSync rendering with the tear-free visuals of VSync rendering.
A $50-100 premium seems very high to me. The requirements for Adaptive-Sync are only a new scaler ASIC and passing the compliance test (if you want to carry the logo). The cost of the scaler ASIC is only a very small part of the monitor's bill of materials.
Originally Posted by grndzro
Keep in mind that Adaptive-Sync has been around in the eDP standard since 2009, and it certainly didn't make laptop displays cost $50-100 more.
If by "buffer" you mean the G-Sync module, it does not need that 768MB of RAM as a frame buffer. The RAM is there for memory bandwidth: the Altera FPGA used needs three memory chips to reach full bandwidth. Using a very expensive programmable FPGA (that Altera chip costs about $800 each) is what creates the price premium. G-Sync will get cheaper when an ASIC finally comes out.
Originally Posted by grndzro
Adaptive-Sync sounds like a great standard (the end of video tearing). To get it working, the monitor and graphics card have to support DisplayPort 1.2a, and the graphics driver has to support variable VBI. In AMD's case, they will start to support it with GCN 1.1 GPUs (Hawaii and Bonaire; no word on GCN 1.0 cards yet). In NVIDIA's case, I don't see any reason it couldn't be supported starting with Kepler cards, if it's easy enough to implement at the driver level alongside G-Sync.
It always annoys me when people say "performance penalty of v-sync". Seriously, it's like 47 vs 47 or 32 vs 30. It isn't that big of a performance penalty.
That being said, it's nice to see this will be available in many monitors in the future and for non-NVIDIA systems.
I'm curious, what changes software-wise will have to be done to make it work?
Will it just be a change in the driver code?
Or will the games have to be modified to take advantage of it?
Nothing needs to be modified in client-side code. SwapBuffers just suddenly stops blocking.
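To illustrate the "SwapBuffers stops blocking" point, here's a minimal sketch (assuming a 60 Hz panel; the `present` function is hypothetical, just modeling when a finished frame actually reaches the screen under each mode):

```python
import math

REFRESH_MS = 1000.0 / 60  # assumed 60 Hz fixed-refresh panel

def present(render_done_ms, adaptive_sync):
    """Time at which a frame finished at render_done_ms actually hits the screen.

    With a fixed refresh, the swap blocks until the next vblank boundary;
    with adaptive sync, the panel refreshes as soon as the frame arrives.
    """
    if adaptive_sync:
        return render_done_ms
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

# A frame that finishes just after a vblank waits almost a full extra refresh:
print(round(present(17.0, adaptive_sync=False), 1))  # → 33.3
print(round(present(17.0, adaptive_sync=True), 1))   # → 17.0
```

The render loop itself is identical in both cases, which is why no game-side changes are needed; the driver and scaler decide when the refresh happens.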
Hope HDMI copies this feature, or Intel NUCs start shipping with a DisplayPort.
Most of them do have DisplayPort, or at least Mini DisplayPort.
Originally Posted by tessio
It's not 32 vs. 30.
Originally Posted by ua=42
It snaps frame times to multiples of the refresh interval, so it magnifies even momentary drop-offs by making late frames wait out whole refreshes. (E.g. on a 60 Hz panel, instead of 30 30 29 30, it's 30 30 20 30, because the frame that was a hair late has to be delayed a full extra refresh, to the 50 ms boundary.)
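That snapping can be sketched in a few lines (a simulation under the assumption of a 60 Hz panel; `displayed_fps` is a hypothetical helper, not a real API):

```python
import math

REFRESH_MS = 1000.0 / 60  # assumed 60 Hz panel

def displayed_fps(render_ms):
    """Instantaneous rate after vsync delays the frame to the next refresh boundary."""
    slots = math.ceil(render_ms / REFRESH_MS)
    return 1000.0 / (slots * REFRESH_MS)

# Frames rendered at a steady ~30 fps, with one frame a hair late:
render_times = [33.3, 33.3, 33.4, 33.3]
print([round(displayed_fps(t)) for t in render_times])  # → [30, 30, 20, 30]
```

Without vsync (or with adaptive sync), that third frame would simply display at 1000/33.4 ≈ 29.9 fps, giving the smooth 30 30 29 30 series instead.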