AMD Enabling FreeSync Video Mode By Default With Linux 5.18, Merging AMDKFD CRIU

  • #21
    Originally posted by shmerl View Post

    Lag by definition means not displaying frames as soon as they are ready, adding some delay for "reasons". One of those reasons in the classic (no adaptive sync / VRR) setup is vsync. If you display the frame right away, that delay is avoided, so there's less input lag.
    Yes, but the frames will still be displayed right away even without FreeSync; that is exactly what screen tearing is.

    Originally posted by aufkrawall View Post
    No need to accept the ugly transition and tearing when fps would go beyond the VRR range; just prevent vsync lag by using an fps limiter on top.
    Yup, the lowest "input latency" (or rather user-perceived latency) actually requires a frame limiter, since system latency increases once the GPU reaches its usage ceiling. Not to mention that in most modern games there's no real benefit to rendering faster than your display.
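
    Roughly what such a limiter does, as a toy Python sketch (target_fps and render_frame are placeholders, not any real API):

    ```python
    import time

    def run_limited(render_frame, target_fps=141):
        """Toy sleep-based fps limiter: render, then sleep off the rest of
        the frame budget so the GPU never runs flat out."""
        budget = 1.0 / target_fps               # ~7.09 ms per frame at 141 fps
        deadline = time.perf_counter()
        while True:
            render_frame()                      # stand-in for sim + GPU submit
            deadline += budget
            remaining = deadline - time.perf_counter()
            if remaining > 0:
                time.sleep(remaining)           # idle instead of queueing frames
            else:
                deadline = time.perf_counter()  # fell behind; reset the pacing
    ```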



    • #22
      Originally posted by Quackdoc View Post
      Yes, but the frames will still be displayed right away even without FreeSync; that is exactly what screen tearing is.
      Only if you don't have vsync; otherwise you get lag. The whole point of adaptive sync is to avoid both tearing and lag (when the framerate is within a certain range).

      My whole point was that adaptive sync absolutely does help reduce lag in comparison with classic vsync.
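
      To put rough numbers on it (a 60 Hz panel and a frame finishing 2 ms into the refresh interval are just illustrative values):

      ```python
      # Classic vsync vs. VRR wait for one frame (illustrative numbers only).
      refresh_hz = 60
      interval_ms = 1000 / refresh_hz       # ~16.7 ms between fixed refresh ticks
      frame_done_ms = 2.0                   # frame is ready 2 ms into the interval

      vsync_wait_ms = interval_ms - frame_done_ms  # sits buffered ~14.7 ms
      vrr_wait_ms = 0.0                            # panel refreshes when the frame lands

      print(f"vsync adds ~{vsync_wait_ms:.1f} ms, VRR adds ~{vrr_wait_ms:.1f} ms")
      ```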



      • #23
        Originally posted by aufkrawall View Post
        No need to accept the ugly transition and tearing when fps would go beyond the VRR range; just prevent vsync lag by using an fps limiter on top.
        You are missing the point that high fps with tearing means lower lag than limited fps without tearing. So those who really need that low latency are OK with tearing. Besides, tearing at refresh rates above, let's say, 144 Hz (or even 180 Hz) is barely noticeable.

        I.e. let's say your game can push 240 fps and your monitor maxes out at 180 Hz. It's OK to let it tear above 180 and benefit from lower lag there (if you care). Not everyone cares about that latency though, so you may prefer a smoother display (no tearing, capping at 179).
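
        As a rough upper bound on what tearing can save over waiting for the next refresh (illustrative; the real saving depends on where the tear lands):

        ```python
        # Without tearing, a finished frame waits at most one refresh interval,
        # so that interval bounds what tearing can save.
        for refresh_hz in (60, 144, 180, 240):
            print(f"{refresh_hz:>3} Hz: at most ~{1000 / refresh_hz:.1f} ms saved")
        ```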
        Last edited by shmerl; 13 February 2022, 08:51 PM.



        • #24
          Originally posted by shmerl View Post

          Only if you don't have vsync; otherwise you get lag. The whole point of adaptive sync is to avoid both tearing and lag (when the framerate is within a certain range).

          My whole point was that adaptive sync absolutely does help reduce lag in comparison with classic vsync.
          I think we have a misunderstanding in terminology. Adaptive sync is not an alternative to vsync; it is a separate, additive technology. What you are talking about is vsync vs. an fps limit + FreeSync. Enabling FreeSync will not limit your fps in game.

          You would use either vsync or an fps cap (in game or via RTSS), then add FreeSync on top of it. Adaptive sync doesn't add or remove input delay; that is done by capping your fps. It just makes capped fps more tolerable. So in the sense that it makes capping bearable, FreeSync would lower input lag, but technically speaking FreeSync, or adaptive sync in general, does not.
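
          A crude way to see how the pieces stack, as a toy truth table (not how any driver actually decides anything):

          ```python
          # VRR only removes tearing inside its range; the wait (lag) comes from
          # vsync or an fps cap, not from VRR itself.
          def behavior(vsync, vrr_in_range):
              if vrr_in_range:
                  return "no tearing, no sync wait (panel follows the frames)"
              if vsync:
                  return "no tearing, frames wait for the next tick (vsync lag)"
              return "no wait, but tearing"

          for vsync in (True, False):
              for vrr_in_range in (True, False):
                  print(f"vsync={vsync}, vrr_in_range={vrr_in_range} -> "
                        f"{behavior(vsync, vrr_in_range)}")
          ```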



          • #25
            Originally posted by Quackdoc View Post

            I think we have a misunderstanding in terminology. Adaptive sync is not an alternative to vsync; it is a separate, additive technology.
            I never said anything about alternatives (though it is an alternative in a sense). Let me repeat my point: adaptive sync reduces input lag (you claimed that it has nothing to do with it, which is simply incorrect). Details obviously matter, but when adaptive sync works (in range) it absolutely helps reduce latency.

            Vsync is not adaptive sync + something. Vsync is indeed a separate thing which prevents tearing by introducing a delay (below the max refresh rate).

            And adaptive sync does solve the problem of tearing, same as vsync, so in this sense it is a clear alternative. Except it does so without introducing lag, so it's better in all scenarios.

            Outside of the adaptive sync range it's the same as without it. I.e. you can have tearing, or you can cap at the max threshold, which equals vsync-like behavior there.

            So again, your claim that adaptive sync doesn't reduce input delay is wrong. In comparison with vsync it does, as I explained above.

            There is a lot of confusion on this topic for some weird reason. It's all basic physics and how actual displays work.
            Last edited by shmerl; 13 February 2022, 09:21 PM.



            • #26
              Originally posted by shmerl View Post
              You are missing the point that high fps with tearing means lower lag than limited fps without tearing.
              I am not, because that is wrong when a game has CPU prerender with unlocked fps vs. none/less with an fps limiter. Look up latency measurements by BlurBusters or Battle(non)sense, especially the more recent results with Reflex.



              • #27
                Originally posted by aufkrawall View Post
                I am not, because that is wrong when a game has CPU prerender with unlocked fps vs. none/less with an fps limiter. Look up latency measurements by BlurBusters or Battle(non)sense, especially the more recent results with Reflex.
                I'm not sure what they measure, but tearing by its own definition means no waiting between frame repaints. So it means lower latency in when you see (parts of) the new frame starting to appear. Given that this value becomes minuscule the higher the monitor's max refresh rate, it obviously gives you diminishing returns.

                I.e. I'd question how valuable tearing is for monitors with, let's say, a 180 Hz or even 240 Hz max refresh rate.



                • #28
                  Originally posted by shmerl View Post

                  I never said anything about alternatives (though it is an alternative in a sense). Let me repeat my point: adaptive sync reduces input lag (you claimed that it has nothing to do with it, which is simply incorrect). Details obviously matter, but when adaptive sync works (in range) it absolutely helps reduce latency.
                  This is wrong. Without adaptive sync, assuming the frame rate is in the sync range, frames will be pushed to the display at the same time. Adaptive sync simply changes the timing of the display to avoid tearing; input latency is the same in this case. What changes is whether there is a tear line on the screen.

                  Vsync is not adaptive sync + something. Vsync is indeed a separate thing which prevents tearing by introducing a delay (below the max refresh rate).

                  And adaptive sync does solve the problem of tearing, same as vsync, so in this sense it is a clear alternative. Except it does so without introducing lag, so it's better in all scenarios.
                  I didn't say it was. What I said was "adaptive sync is not an alternative to vsync; it is a separate, additive technology". Adaptive sync is something you can put on top of vsync or another form of frame limiting. Adaptive sync only solves the problem within its range (and sometimes via LFC, but that is something different altogether; a rough sketch of it is below). Vsync is technology that holds frames back until the display is ready for them; FreeSync only syncs the display refresh rate to the incoming frame rate within a range. FreeSync/G-Sync was often paired ON TOP of vsync. You still need to limit the frame rate for FreeSync to work.
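
                  Roughly what LFC does, as a toy sketch (the 48 Hz floor is a made-up example of a panel's VRR minimum):

                  ```python
                  # Toy LFC: below the VRR floor, the panel shows each frame N times so
                  # the effective refresh rate lands back inside the range.
                  vrr_min_hz = 48  # made-up panel minimum

                  def lfc_refresh_hz(game_fps):
                      if game_fps >= vrr_min_hz:
                          return game_fps               # in range: one refresh per frame
                      n = 2
                      while game_fps * n < vrr_min_hz:  # repeat frames until back in range
                          n += 1
                      return game_fps * n

                  print(lfc_refresh_hz(30))  # 60: each frame shown twice
                  print(lfc_refresh_hz(20))  # 60: each frame shown three times
                  ```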

                  Outside of the adaptive sync range it's the same as without it. I.e. you can have tearing, or you can cap at the max threshold, which equals vsync-like behavior there.

                  So again, your claim that adaptive sync doesn't reduce input delay is wrong. In comparison with vsync it does, as I explained above.

                  There is a lot of confusion on this topic for some weird reason. It's all basic physics and how actual displays work.
                  Outside of the sync range, FreeSync turns off. That is not "reducing delay"; that is simply not working. You will still get screen tearing, as you said.
                  FreeSync and turning off vsync in this case give the exact same result; the difference is that one doesn't start working automatically. Again, this is not "reducing input lag".

                  So unless you consider screen tearing to be contributing to input lag, FreeSync does not reduce input lag, where input lag is the time until the effect a user creates is reflected on the screen. Enhanced Sync (AMD's name for it) has been out for a while and does close to the same thing, except that instead of traditional vsync where the GPU waits, the buffer constantly gets refreshed with a new frame (see the sketch below).
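
                  Roughly that "keep only the newest frame" idea, as a toy sketch (not AMD's actual implementation):

                  ```python
                  import threading

                  # The renderer never waits; it just overwrites the pending buffer. Each
                  # refresh scans out the newest complete frame: no tearing, no render stall.
                  pending = None
                  lock = threading.Lock()

                  def render_loop(render):
                      global pending
                      while True:
                          frame = render()     # runs flat out, never blocks on the display
                          with lock:
                              pending = frame  # an older unshown frame is simply dropped

                  def on_refresh():            # called once per display refresh
                      with lock:
                          return pending       # newest complete frame goes to scanout
                  ```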

                  But again, turning off is not reducing input lag, and syncing frames is not reducing input lag, unless you consider frame tearing to contribute to it.



                  • #29
                    Originally posted by Quackdoc View Post
                    This is wrong. Without adaptive sync, assuming the frame rate is in the sync range, frames will be pushed to the display at the same time. Adaptive sync simply changes the timing of the display to avoid tearing; input latency is the same in this case. What changes is whether there is a tear line on the screen.
                    Not sure what you are talking about, so pay attention please.

                    I'm comparing classic vsync (no adaptive sync) and adaptive sync. The first one introduces latency to prevent tearing. That is clear, I hope? I don't want to repeat the same thing needlessly.

                    Originally posted by Quackdoc View Post
                    So unless you consider screen tearing to be contributing to input lag
                    That I don't understand completely. Tearing in general produces the minimal lag possible, i.e. the lowest latency, because you start seeing parts of the new frame before the old one has finished scanning out.

                    There was a good video somewhere which explains it all in detail.

                    Found it: https://www.youtube.com/watch?v=uzp8z1i5-Hc
                    Last edited by shmerl; 13 February 2022, 10:00 PM.



                    • #30
                      Originally posted by shmerl View Post
                      I'm not sure what they measure, but tearing by its own definition means no waiting between frame repaints. So it means lower latency in when you see (parts of) the new frame starting to appear. Given that this value becomes minuscule the higher the monitor's max refresh rate, it obviously gives you diminishing returns.
                      Lag reduction from tearing vs. VRR is just minuscule, as is that from a shorter refresh interval when well above 120 fps. If a game has an idiotic CPU prerender of 3 frames (which many have), that is still ~12 ms of additional lag even at 240 fps. ~138 fps with Reflex has lower total lag than uncapped 240 fps with sh*t prerender. That's what those tests show.
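
                      The queue math, spelled out (frame count and rates from the example above):

                      ```python
                      # Three queued CPU frames at 240 fps still sit in the pipe for ~12.5 ms,
                      # dwarfing the ~4.2 ms refresh interval you "save" by running uncapped.
                      prerender_frames = 3
                      fps = 240
                      queue_lag_ms = prerender_frames * 1000 / fps   # ~12.5 ms of added lag
                      refresh_interval_ms = 1000 / fps               # ~4.2 ms at 240 Hz
                      print(f"queue lag ~{queue_lag_ms:.1f} ms vs "
                            f"refresh interval ~{refresh_interval_ms:.1f} ms")
                      ```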

