
VESA Adds Adaptive-Sync To DisplayPort 1.2a Specification


  • #11
    Originally posted by Vidar View Post
    So it begins. A new age of tearless gaming. What will NVIDIA be able to offer more with their G-SYNC compared to this?
    Do they have to do anything? It's an open standard. They don't need to sell tacked-on proprietary chips for monitors when the monitors handle adaptive sync natively anyway. It's kind of like Mantle: release a closed, proprietary piece of shit and motivate the open standards bodies to get off their asses and adopt the necessary technological improvements.



    • #12
      Originally posted by zanny View Post
      Do they have to do anything? It's an open standard. They don't need to sell tacked-on proprietary chips for monitors when the monitors handle adaptive sync natively anyway. It's kind of like Mantle: release a closed, proprietary piece of shit and motivate the open standards bodies to get off their asses and adopt the necessary technological improvements.
      The hardware will require a bigger buffer and probably more processing. I would expect a $50-100 premium to start, with prices stabilizing as more vendors enter the market.



      • #13
        Originally posted by amehaye View Post
        I actually don't like this feature. Smoothness of animation is achieved by a constant refresh rate. Adaptive refresh rate is going to cause jittery animation.
        You're missing the point.

        Jittery animation is caused by the PC being unable to maintain a suitable framerate. Current fixed-refresh-rate monitors make that worse by imposing an additional restriction on the definition of "suitable" if you want to prevent tearing: if a frame is late by even a nanosecond, it has to be either discarded or delayed until the next refresh.

        Adaptive sync makes animation better because you get the performance of non-VSync rendering with the tear-free visuals of VSync rendering.
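        As a toy illustration of the timing argument above (not from the original post; the 100 Hz display and frame times are made up so the arithmetic stays exact), here is a sketch of when frames reach the screen under fixed-refresh VSync versus adaptive sync:

```python
REFRESH_MS = 10  # hypothetical 100 Hz display, so the arithmetic stays exact

def present_vsync(render_ms, refresh=REFRESH_MS):
    """Fixed refresh: a finished frame waits for the next refresh boundary."""
    shown, clock = [], 0
    for t in render_ms:
        finish = clock + t
        clock = -(-finish // refresh) * refresh  # ceil to the next boundary
        shown.append(clock)
    return shown

def present_adaptive(render_ms):
    """Adaptive sync: the display refreshes the moment a frame is ready."""
    shown, clock = [], 0
    for t in render_ms:
        clock += t
        shown.append(clock)
    return shown

def intervals(shown):
    """Frame-to-frame gaps, i.e. what the viewer perceives as smoothness."""
    return [b - a for a, b in zip([0] + shown, shown)]

frames = [9, 9, 11, 9]  # third frame misses its 10 ms deadline by 1 ms
print(intervals(present_vsync(frames)))     # [10, 10, 20, 10] - a visible hitch
print(intervals(present_adaptive(frames)))  # [9, 9, 11, 9]    - barely perceptible
```

        One frame that is 1 ms late costs a full extra refresh interval under VSync, while the adaptive display simply shows it 1 ms later.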



        • #14
          Originally posted by grndzro View Post
          The hardware will require a bigger buffer and probably more processing. I would expect a $50-100 premium to start, with prices stabilizing as more vendors enter the market.
          A $50-100 premium seems very high to me. The only requirements for Adaptive-Sync are a new scaler ASIC and passing the compliance test (if you want to carry the logo), and the scaler ASIC is only a very small part of the monitor's bill of materials.

          Keep in mind that Adaptive-Sync has been around in the eDP standard since 2009, and it certainly didn't make laptop displays cost $50-100 more.



          • #15
            Originally posted by grndzro View Post
            The hardware will require a bigger buffer and probably more processing. I would expect a $50-100 premium to start, with prices stabilizing as more vendors enter the market.
            If by buffer you mean the G-Sync module, it doesn't need its 768 MB of RAM as a frame buffer. The RAM is there for memory bandwidth: the Altera FPGA used needs three memory chips to reach full bandwidth, and that very expensive programmable FPGA (the Altera part costs about $800 each) is what creates the price premium. G-Sync will get cheaper when an ASIC finally comes out.

            Adaptive-Sync sounds like a great standard (the end of video tearing). To get it working, both the monitor and the graphics card have to support DP 1.2a, and the graphics driver has to support variable VBI. In AMD's case, support will start with GCN 1.1 GPUs (Hawaii and Bonaire; no word on GCN 1.0 cards yet). In NVIDIA's case, I don't see why it couldn't be supported starting with Kepler cards, if it's easy enough to implement at the driver level alongside G-Sync.



            • #16
              It always annoys me when people talk about the "performance penalty of V-Sync". Seriously, it's like 47 vs. 47 or 32 vs. 30. That isn't much of a performance penalty.
              That being said, it's nice to see this will be available in many monitors in the future, and for non-NVIDIA systems.

              I'm curious: what changes will have to be made on the software side to make it work?
              Will it just be a change in the driver code?
              Or will games have to be modified to take advantage of it?



              • #17
                Nothing needs to be modified in client-side code. SwapBuffers just suddenly stops blocking.



                • #18
                  I hope HDMI copies this feature, or that Intel NUCs start shipping with DisplayPort.



                  • #19
                    Originally posted by tessio View Post
                    I hope HDMI copies this feature, or that Intel NUCs start shipping with DisplayPort.
                    Most of them do have DisplayPort, or at least Mini DisplayPort.



                    • #20
                      Originally posted by ua=42 View Post
                      It always annoys me when people talk about the "performance penalty of V-Sync". Seriously, it's like 47 vs. 47 or 32 vs. 30. That isn't much of a performance penalty.
                      It's not 32 vs. 30.

                      It's snapping to multiples of the refresh interval, so it magnifies even momentary drop-offs by skipping frames. (E.g. instead of 30 30 29 30, you get 30 30 15 30, because the frame that was a hair late has to be delayed until the next refresh.)
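                      To make the snapping concrete (a toy calculation, not from the original post; the 30 Hz refresh rate is chosen only to match the 30 and 15 figures above):

```python
import math

def vsync_fps(frame_time_ms, refresh_hz=30):
    """Instantaneous frame rate after a frame is snapped to the next
    refresh boundary of a hypothetical fixed 30 Hz display."""
    period_ms = 1000 / refresh_hz
    displayed_ms = math.ceil(frame_time_ms / period_ms) * period_ms
    return 1000 / displayed_ms

print(round(vsync_fps(33.3), 1))  # 30.0 - frame made the deadline
print(round(vsync_fps(33.4), 1))  # 15.0 - a hair late costs a whole refresh
```

                      Missing the deadline by a fraction of a millisecond halves the instantaneous frame rate, which is exactly the stutter adaptive sync avoids.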

