Mesa Support Comes For Adaptive Vsync


  • #21
    A well-done triple buffering implementation adds between zero and one frame of input latency. A badly done one adds as many frames as there are buffers (such as the recent Catalyst & Valve affair).

    So with triple buffering you get varying input latency compared to normal un-vsynced rendering: at 60 Hz with a good implementation, between 0 and 16.6 ms per frame (for a really heavy frame, it could be longer than 16 ms). Whether that's acceptable to you, YMMV.

    G-sync, as you and Carmack pointed out, is the ideal solution, but currently unviable.
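
    To put rough numbers on that, here is a minimal C sketch of the latency bounds being described; the 60 Hz refresh rate and the buffer count are assumptions, not measurements:

    #include <stdio.h>

    int main(void)
    {
        const double refresh_hz   = 60.0;                /* assumed display refresh rate */
        const double frame_ms     = 1000.0 / refresh_hz; /* ~16.7 ms per refresh */
        const int    buffer_count = 3;                   /* triple buffering */

        /* A well-done implementation adds at most one extra frame of input latency. */
        printf("good case: 0 .. %.1f ms of added latency\n", frame_ms);

        /* A badly done one can queue up roughly as many extra frames as there
         * are buffers (per the description above). */
        printf("bad case:  up to ~%.1f ms of added latency\n",
               frame_ms * buffer_count);
        return 0;
    }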



    • #22
      Yup, so my point remains valid. Adaptive vsync will only aid bad game devs. And ignorance regarding triple buffering is the very reason those bad game devs know there's no penalty for shipping no triple buffering at all, or a broken implementation.



      • #23
        Originally posted by Bucic View Post
        bad game devs
        Have you ever made a game worth playing? Maybe in your fantasy world it's unacceptable to not have triple buffering, but there's a thing called reality where making games is hard and requires LOTS of things to be done, many being more important than triple buffering.

        Besides, if this is true then it's out of the hands of game developers.



        • #24
          Originally posted by stqn View Post
          So much stupidity in a single post, I didn't think it was possible... Congratulations.
          Wow, you contributed lots to the discussion, great job! Not. U mad because nvidia revolutionized gaming too, maybe?



          • #25
            Originally posted by Bucic View Post
            Yup, so my point remains valid. Adaptive vsync will only aid bad game devs. And ignorance regarding triple buffering is the very reason those bad game devs know there's no penalty for shipping no triple buffering at all, or a broken implementation.
            ... you do realize triple buffering is implemented in the driver and enabling it is completely inaccessible to the client application, right?
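
            For what it's worth, the user (rather than the application) can usually flip that driver-side switch; for example, the proprietary NVIDIA X driver documents an xorg.conf option for it. A sketch with a placeholder Identifier, so check the driver's README for the exact semantics:

            Section "Device"
                Identifier "Videocard0"            # placeholder device name
                Driver     "nvidia"
                Option     "TripleBuffer" "True"   # takes effect together with sync-to-vblank
            EndSection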



            • #26
              Originally posted by Ancurio View Post
              ... you do realize triple buffering is implemented in the driver and enabling it is completely inaccessible to the client application, right?
              It can just as well be implemented in the application.



              • #27
                Originally posted by curaga View Post
                It can just as well be implemented in the application.
                That sounds interesting. How do I allocate multiple window buffers and tell eg. GLX to swap to a specific one?



                • #28
                  Originally posted by stqn View Post
                  Have you ever made a game worth playing? Maybe in your fantasy world it's unacceptable to not have triple buffering, but there's a thing called reality where making games is hard and requires LOTS of things to be done, many being more important than triple buffering.

                  Besides, if this is true then it's out of the hands of game developers.
                  No, never. Not even a tic-tac-toe. It's not mandatory for my point to be valid. You speak of reality. Well, let me put it this way: all those who think tear-free rendering is a luxury in 2013+ are the ones who need a reality check.



                  • #29
                    Originally posted by Ancurio View Post
                    That sounds interesting. How do I allocate multiple window buffers and tell eg. GLX to swap to a specific one?
                    You render to a pair of RTTs (render target textures/FBOs), keep track of when the vsync is happening, and just in time you blit the ready RTT to the window. It's not 100% foolproof like a driver's would be, but if an app dev wants to go to the trouble, they can.
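
                    A rough sketch of that approach in C with OpenGL, assuming a GLX window created with swap interval 0 (so the final swap itself does not block), a loader such as GLEW for the FBO entry points, and placeholder functions render_scene() and wait_until_just_before_vblank() standing in for the app's own drawing and vblank-tracking logic:

                    #include <GL/glew.h>   /* provides the FBO entry points; any GL loader works */
                    #include <GL/glx.h>

                    extern Display *display;   /* created during window setup (not shown) */
                    extern Window   window;

                    void render_scene(void);                  /* placeholder: the app's draw calls   */
                    void wait_until_just_before_vblank(void); /* placeholder: the app's vsync timing */

                    static GLuint fbo[2], color_tex[2];
                    static int write_idx = 0, ready_idx = 0;

                    void init_render_targets(int w, int h)
                    {
                        glGenFramebuffers(2, fbo);
                        glGenTextures(2, color_tex);
                        for (int i = 0; i < 2; ++i) {
                            glBindTexture(GL_TEXTURE_2D, color_tex[i]);
                            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);
                            glBindFramebuffer(GL_FRAMEBUFFER, fbo[i]);
                            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                                   GL_TEXTURE_2D, color_tex[i], 0);
                        }
                        glBindFramebuffer(GL_FRAMEBUFFER, 0);
                    }

                    void frame(int w, int h)
                    {
                        /* Render the next frame into the spare render target. */
                        glBindFramebuffer(GL_FRAMEBUFFER, fbo[write_idx]);
                        render_scene();
                        ready_idx = write_idx;
                        write_idx = 1 - write_idx;

                        /* Just before the next vblank, blit whichever target finished
                         * last to the window, then swap. */
                        wait_until_just_before_vblank();
                        glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo[ready_idx]);
                        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
                        glBlitFramebuffer(0, 0, w, h, 0, 0, w, h,
                                          GL_COLOR_BUFFER_BIT, GL_NEAREST);
                        glXSwapBuffers(display, window);
                    }

                    The vblank tracking itself could come from an extension such as GLX_SGI_video_sync or GLX_OML_sync_control, or from plain timer arithmetic; getting that part right is exactly what makes this less foolproof than a driver-side implementation.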



                    • #30
                      What about AMD FreeSync (DisplayPort Adaptive-Sync)?

