
Thread: VESA Adds Adaptive-Sync To DisplayPort 1.2a Specification

  1. #11
    Join Date
    Dec 2012
    Posts
    573


    Quote Originally Posted by Vidar View Post
    So it begins. A new age of tearless gaming. What more will NVIDIA be able to offer with their G-SYNC compared to this?
    Do they have to do anything? It's an open standard. They don't need to sell tacked-on proprietary chips for monitors when they just naturally handle adaptive sync anyway. It's kind of like Mantle: bring out a closed, arbitrary piece of shit, and motivate the open standards bodies to get off their asses and adopt the necessary technological improvements.

  2. #12
    Join Date
    Apr 2014
    Posts
    178


    Quote Originally Posted by zanny View Post
    Do they have to do anything? It's an open standard. They don't need to sell tacked-on proprietary chips for monitors when they just naturally handle adaptive sync anyway. It's kind of like Mantle: bring out a closed, arbitrary piece of shit, and motivate the open standards bodies to get off their asses and adopt the necessary technological improvements.
    The hardware will require a bigger buffer and probably more processing. I would expect a $50–100 premium to start, with prices stabilizing as more vendors enter the market.

  3. #13


    Quote Originally Posted by amehaye View Post
    I actually don't like this feature. Smoothness of animation is achieved by a constant refresh rate. An adaptive refresh rate is going to cause jittery animation.
    You're missing the point.

    Jittery animation is caused by the PC being unable to maintain a suitable framerate. Current fixed-refresh-rate monitors make that worse by imposing an additional restriction on what counts as "suitable" if you want to prevent tearing: if a frame is late by even a nanosecond, it has to either be discarded or delayed until the next refresh.

    Adaptive sync makes animation better because you get the performance of non-VSync rendering with the tear-free visuals of VSync rendering.
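
    To make the arithmetic concrete, here's a minimal sketch (my own illustration, not anything from the spec) of when a finished frame actually reaches the screen on a fixed 60 Hz display versus an adaptive-sync display:

    Code:
    /* build: cc frame_latency.c -lm */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double refresh_ms = 1000.0 / 60.0;   /* 60 Hz -> ~16.67 ms */
        const double render_ms[] = { 15.0, 16.6, 16.8, 25.0 };

        for (int i = 0; i < 4; i++) {
            /* Fixed refresh rounds the presentation time up to the next
             * refresh boundary; adaptive sync can present as soon as the
             * frame is ready (panel range limits ignored here). */
            double fixed_ms = ceil(render_ms[i] / refresh_ms) * refresh_ms;
            printf("render %5.1f ms -> fixed vsync shows it at %5.1f ms, adaptive sync at %5.1f ms\n",
                   render_ms[i], fixed_ms, render_ms[i]);
        }
        return 0;
    }

    The 16.8 ms frame is the "late by a nanosecond" case: it misses the 16.7 ms boundary and gets held for a full extra refresh.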

  4. #14
    Join Date
    Jul 2008
    Location
    Berlin, Germany
    Posts
    848


    Quote Originally Posted by grndzro View Post
    The hardware will require a bigger buffer and probably more processing. I would expect a $50–100 premium to start, with prices stabilizing as more vendors enter the market.
    A $50–100 premium seems very high to me. The requirements for Adaptive-Sync are only a new scaler ASIC and passing the compliance test (if you want to carry the logo). The cost of the scaler ASIC is only a very small part of the monitor's bill of materials.

    Keep in mind that Adaptive-Sync has been around in the eDP standard since 2009, and it certainly didn't make laptop displays cost $50–100 more.

  5. #15
    Join Date
    Apr 2007
    Location
    Arctic circle, Finland
    Posts
    303


    Quote Originally Posted by grndzro View Post
    The hardware will require a bigger buffer and probably more processing. I would expect a $50–100 premium to start, with prices stabilizing as more vendors enter the market.
    If by "buffer" you mean the G-Sync module, it doesn't need that 768 MB of RAM as a buffer. The RAM is there for memory bandwidth: the Altera FPGA used needs three memory chips to reach full bandwidth, and that very expensive programmable FPGA (the Altera part costs about $800 each) is what creates the price premium. G-Sync should get cheaper when an ASIC finally comes out.

    Adaptive-Sync sounds like a great standard (the end of video tearing). To get it working, both the monitor and the graphics card have to support DP 1.2a, and the graphics driver has to support variable VBI. In AMD's case, support will start with GCN 1.1 GPUs (Hawaii and Bonaire; no word on GCN 1.0 cards yet). In NVIDIA's case, I don't see a reason why it couldn't be supported starting with Kepler cards, if it's easy enough to implement at the driver level alongside G-Sync.

  6. #16
    Join Date
    Jan 2010
    Location
    Somewhere in Kansas.
    Posts
    291


    It always annoys me when people say "performance penalty of v-sync". Seriously, it's like 47 vs 47 or 32 vs 30. It isn't that big of a performance penalty.
    That being said, it's nice to see that this will be available in many monitors in the future, and for non-NVIDIA systems.

    I'm curious, what changes will have to be made on the software side to make it work?
    Will it just be a change in the driver code?
    Or will games have to be modified to take advantage of it?

  7. #17
    Join Date
    Feb 2008
    Location
    Linuxland
    Posts
    5,282


    Nothing needs to be modified in client-side code. SwapBuffers just suddenly stops blocking.
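
    In other words (a minimal sketch using SDL2, which nothing in the thread requires; the same applies to raw GLX or EGL), the render loop an application already has keeps working unchanged; whether the swap waits for a fixed refresh boundary or returns as soon as the frame is scheduled is decided below the API:

    Code:
    /* build (assumed): cc swap_demo.c -lSDL2 -lGL */
    #include <SDL2/SDL.h>
    #include <SDL2/SDL_opengl.h>

    int main(void)
    {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("swap demo",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            640, 480, SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);
        SDL_GL_SetSwapInterval(1);   /* request synced swaps, same as before */

        for (int running = 1; running; ) {
            SDL_Event ev;
            while (SDL_PollEvent(&ev))
                if (ev.type == SDL_QUIT) running = 0;

            glClearColor(0.1f, 0.2f, 0.3f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT);

            /* Unchanged client code: on a fixed-refresh display this call
             * blocks until the next refresh; on an adaptive-sync display
             * the driver can schedule the frame as soon as it is ready. */
            SDL_GL_SwapWindow(win);
        }

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }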

  8. #18
    Join Date
    Sep 2009
    Posts
    53


    I hope HDMI copies this feature, or that Intel NUCs start shipping with DisplayPort.

  9. #19
    Join Date
    Dec 2012
    Posts
    573


    Quote Originally Posted by tessio View Post
    I hope HDMI copies this feature, or that Intel NUCs start shipping with DisplayPort.
    Most of them do have DisplayPort, or at least Mini DisplayPort.

  10. #20


    Quote Originally Posted by ua=42 View Post
    It always annoys me when people say "performance penalty of v-sync". Seriously, it's like 47 vs 47 or 32 vs 30. It isn't that big of a performance penalty.
    It's not 32 vs. 30.

    It snaps frame times to multiples of the refresh interval, so it magnifies even momentary drop-offs by skipping the associated refreshes. (E.g. instead of 30 30 29 30, it's 30 30 15 30, because the frame that was a hair late has to be delayed until the next refresh.)
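
    A quick way to see the magnification (my own illustration; it assumes a 30 Hz refresh so the numbers match the example above):

    Code:
    /* build: cc snap_demo.c -lm */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double refresh_ms = 1000.0 / 30.0;   /* 30 Hz refresh */
        const double render_ms[] = { 1000.0 / 30.0, 1000.0 / 30.0,
                                     1000.0 / 29.0, 1000.0 / 30.0 };

        for (int i = 0; i < 4; i++) {
            /* VSync holds each frame until the next refresh boundary, so
             * a 29 FPS frame (34.5 ms) slips past 33.3 ms and is shown a
             * whole refresh later, at 15 FPS pace. */
            double shown_ms = ceil(render_ms[i] / refresh_ms - 1e-9) * refresh_ms;
            printf("rendered at %4.1f fps -> displayed at %4.1f fps\n",
                   1000.0 / render_ms[i], 1000.0 / shown_ms);
        }
        return 0;   /* prints 30 30 15 30 */
    }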
