Mesa Support Comes For Adaptive Vsync


  • #11
    V-sync on will mean the renderer will wait for a refresh before uploading the image. It will lower the fps, but not by much. I personally find screen tearing more annoying than having my fps drop from 45 to 40.

    Yeah, playing at 30 fps is painful; NS2 would do that to me until I upgraded my CPU. In an FPS, getting less than 45 fps can cause me to miss things.
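
    For what it's worth, here is a minimal sketch of what "v-sync on" amounts to in code, using SDL2 as an illustrative API (my choice, not something from this thread); with a swap interval of 1, the buffer swap blocks until the next vertical refresh, which is exactly where the fps drop comes from:

    #include <SDL.h>

    int main(void) {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("vsync demo",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            640, 480, SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);

        /* 1 = wait for the vertical refresh before swapping (vsync on);
           SDL_GL_SwapWindow() will then block until the next vblank. */
        if (SDL_GL_SetSwapInterval(1) != 0)
            SDL_Log("vsync not supported: %s", SDL_GetError());

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }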

    • #12
      Originally posted by ua=42 View Post
      V-sync on will mean the renderer will wait for a refresh before uploading the image. It will lower the fps, but not by much. I personally find screen tearing more annoying than having my fps drop from 45 to 40.
      Use triple buffering and the frame drop tends to be 0. That's why I prefer it overall to normal vsync and adaptive vsync.

      • #13
        Originally posted by ua=42 View Post
        V-sync on will mean the renderer will wait for a refresh before uploading the image. It will lower the fps, but not by much. I personally find screen tearing more annoying than having my fps drop from 45 to 40.
        If the game has a stable time per frame that happens to be just above one 60th of a second, say 1/55 s, then shouldn't you get 30 fps instead of 55 fps, a huge drop? I suppose there is some kind of asynchronous rendering going on in your example (once a frame is ready, the game engine starts creating the next frame right away instead of waiting for the VBL).
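
        A quick back-of-the-envelope check of that arithmetic, as a sketch (assuming a 60 Hz display and strict double-buffered vsync; the numbers are from the example above):

        #include <math.h>
        #include <stdio.h>

        int main(void) {
            const double refresh = 1.0 / 60.0;    /* 60 Hz refresh interval    */
            const double frame_time = 1.0 / 55.0; /* game needs ~18.2 ms/frame */

            /* With double-buffered vsync, the swap waits for the next vblank,
               so every frame occupies a whole number of refresh intervals. */
            double intervals = ceil(frame_time / refresh);   /* here: 2 */
            printf("effective fps: %.1f\n", 1.0 / (intervals * refresh));
            return 0;                                        /* prints 30.0 */
        }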

        • #14
          Originally posted by Calinou View Post
          It's called G-Sync and has quite a lot of issues:
          - NVIDIA-only
          - no Linux support, and even less support in Free drivers
          - needs a specific monitor which is also very expensive for now (at least $400 with an expansion kit)
          "quite a lot of issues"? I don't see any.

           1. NVIDIA has the best gaming cards anyway. Others are free to create their own solution if they are smart enough, right?
           2. Linux has shitty game and driver support in general, but don't worry, in 10-15 years you might have some decent open source drivers.
           3. Get a job; that's not expensive. If you are a serious gamer, you already spent a lot more on your graphics card.

          G-Sync is a revolution, but of course open source hippies ain't happy.

          • #15
            Originally posted by arokh View Post
            (?)
            So much stupidity in a single post, I didn't think it was possible... Congratulations.

            • #16
              What will Adaptive Vsync do for games on Linux? It will only give lazy game devs, who don't want to implement triple buffering, a way out. As has been mentioned here, 'normal' vsync with triple buffering makes Adaptive Vsync obsolete, i.e. it gives better results to the gamer.
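
              For the curious, a sketch of how a program could probe for it (assuming the Mesa work exposes adaptive vsync the way desktop GL usually does, via GLX_EXT_swap_control_tear; with a current context, glXSwapIntervalEXT(dpy, drawable, -1) would then enable it, syncing to the vblank when a frame is on time and tearing instead of stalling when it is late):

              #include <GL/glx.h>
              #include <stdio.h>
              #include <string.h>

              int main(void) {
                  Display *dpy = XOpenDisplay(NULL);
                  if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

                  /* Adaptive vsync is advertised as a GLX extension string. */
                  const char *ext = glXQueryExtensionsString(dpy, DefaultScreen(dpy));
                  printf("adaptive vsync %s\n",
                         strstr(ext, "GLX_EXT_swap_control_tear")
                             ? "available" : "not advertised");
                  XCloseDisplay(dpy);
                  return 0;
              }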

              • #17
                Originally posted by Bucic View Post
                What will Adaptive Vsync do for games on Linux? It will only give lazy game devs, who don't want to implement triple buffering, a way out. As has been mentioned here, 'normal' vsync with triple buffering makes Adaptive Vsync obsolete, i.e. it gives better results to the gamer.
                Triple buffering is useless because it increases input lag.

                • #18
                  Originally posted by Calinou View Post
                  - no Linux support, and even less support in Free drivers
                  I wouldn't be so sure about that. Since the "magic" takes place at the hardware level, it would not be hard for NVIDIA to include support in their binary drivers.

                  • #19
                    Originally posted by JS987 View Post
                    Triple buffering is useless because it increases input lag.
                    Really?
                    Consider some point in time between two vsyncs. Suppose the first display buffer is being used to display the current image, and suppose the game is really fast and has already computed and rendered the next image ...
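
                    To make the mechanism concrete, a toy simulation (the buffer bookkeeping and numbers are mine) of the drop-outdated flavour of triple buffering described here: the renderer never waits, and at each vblank the display flips to the newest finished frame, dropping any older one:

                    #include <stdio.h>

                    int main(void) {
                        int front = 0;    /* buffer being scanned out            */
                        int drawing = 1;  /* buffer the renderer is filling      */
                        int spare = 2;    /* third buffer                        */
                        int ready = -1;   /* newest finished, undisplayed frame  */
                        long frame = 0;

                        for (int vblank = 1; vblank <= 3; ++vblank) {
                            /* Pretend the renderer runs at twice the refresh
                               rate: it finishes two frames per interval, and
                               the older of the two is simply dropped. */
                            for (int n = 0; n < 2; ++n) {
                                printf("frame %ld drawn into buffer %d\n",
                                       ++frame, drawing);
                                if (ready < 0) { ready = drawing; drawing = spare; }
                                else { int t = ready; ready = drawing; drawing = t; }
                            }
                            /* At the vblank, flip straight to the newest frame. */
                            spare = front;
                            front = ready;
                            ready = -1;
                            printf("vblank %d: showing buffer %d (frame %ld)\n",
                                   vblank, front, frame);
                        }
                        return 0;
                    }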

                    • #20
                      Originally posted by Bucic View Post
                      Really?
                      Consider some point in time between two vsyncs. Suppose the first display buffer is being used to display the current image, and suppose the game is really fast and has already computed and rendered the next image ...

                      http://www.anandtech.com/show/2794/2
                      Broken triple buffering increases lag.
                      The major difference in the technique we've described here is the ability to drop frames when they are outdated. Render ahead forces older frames to be displayed. Queues can help smoothness and stuttering as a few really quick frames followed by a slow frame end up being evened out and spread over more frames. But the price you pay is in lag (the more frames in the queue, the longer it takes to empty the queue and the older the frames are that are displayed).


                      It seems to be broken, at least for the Intel Linux driver, according to "man intel":
                      The disadvantage of triple buffering is that there is an extra frame of latency, due to the pre-rendered frame sitting in the swap queue, between input and any display update.
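
                      Rough numbers for the difference, as a sketch (assuming a 60 Hz display; the figures are mine, not from the article):

                      #include <stdio.h>

                      int main(void) {
                          const double refresh_ms = 1000.0 / 60.0;   /* ~16.7 ms */

                          /* Render-ahead: finished frames wait in a FIFO swap
                             queue, so a frame drawn now is displayed depth
                             refreshes later; lag grows with queue depth. */
                          for (int depth = 1; depth <= 3; ++depth)
                              printf("queue depth %d: ~%.0f ms of added lag\n",
                                     depth, depth * refresh_ms);

                          /* Drop-outdated triple buffering keeps only the newest
                             finished frame, so what is shown is at most roughly
                             one refresh old, regardless of buffer count. */
                          printf("drop-outdated: at most ~%.0f ms\n", refresh_ms);
                          return 0;
                      }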
