Intel Smooth Sync Support Being Worked On For Linux Graphics Driver


  • #11
    How does this work in practice?

    Does it know _in advance_, before rendering a frame, where the tear is going to be? That sounds like a hard problem, so most likely not?

    If not, then the blur of the current/previous frame is done as post-processing. Wouldn't that add one frame time of lag?







    • #12
      Originally posted by JEBjames View Post
      How does this work in practice?

      Does it know _in advance_, before rendering a frame, where the tear is going to be? That sounds like a hard problem, so most likely not?

      If not, then the blur of the current/previous frame is done as post-processing. Wouldn't that add one frame time of lag?
      Perhaps you can simply blur the top and bottom of rendered frames. That doesn't add more lag than the render time it requires, which is ~zero. The same applies to other driver post-processing effects, such as sharpening.
      It will still look choppy compared to VRR or perfect vsync, but it might be useful on devices without VRR if you hate vsync lag/stutter (which I wholeheartedly do).
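
      A minimal CPU-side sketch of that idea, assuming an RGBA8 framebuffer in system memory; in a real driver this would be a small GPU shader pass, and the band height and 3-tap kernel here are made-up illustration values:

      ```c
      #include <stdint.h>

      /* Illustrative sketch: vertically blur the top and bottom `band`
       * scanlines of an RGBA8 frame, as the post suggests. Runs in
       * place, so the blur compounds slightly; fine for a sketch. */
      static void blur_edge_bands(uint8_t *frame, int width, int height,
                                  int stride, int band)
      {
          for (int y = 0; y < height; y++) {
              if (y >= band && y < height - band)
                  continue; /* leave the middle of the frame untouched */

              uint8_t *row  = frame + y * stride;
              uint8_t *up   = frame + (y > 0 ? y - 1 : y) * stride;
              uint8_t *down = frame + (y < height - 1 ? y + 1 : y) * stride;

              for (int x = 0; x < width * 4; x++) /* 4 bytes per pixel */
                  row[x] = (uint8_t)((up[x] + 2 * row[x] + down[x]) / 4);
          }
      }
      ```

      Since this touches only ~2*band scanlines after the frame has finished rendering, it costs a tiny fraction of a frame time and adds no extra frame of latency.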



      • #13
        Originally posted by rabcor View Post
        For all the hype around adaptive sync, I've never actually gotten to use it. Every time I went out to buy a laptop or monitor, products without it won out by being better products on a lower budget more often than not. I wonder if that's still the same, or if the next monitor or laptop I buy will actually have it.
        Add to that the fact that the requirements for adaptive sync are very strict and you'll realize that it's totally not worth the headache and the money. The monitor has to support it, the computer has to support the same sync technology (G-Sync vs. adaptive sync), and the connection has to be made strictly over DisplayPort, or over HDMI 2.1, which gaming monitors generally don't support. And then on the software side, who knows what else is needed to get it to work. There are just too many places across the chain where things can go wrong. It's just not worth it until it becomes a standardized mainstream technology that works across all vendors.



        • #14
          Yes, this is a post-processing filter. It just adds some blur and blends the scanlines around the tear with the previous frame to make it less obvious. No, it doesn't add a frame of latency, or at least it shouldn't; if it did, you might as well use vsync, because this trick would be useless. The way it probably works is that when a game submits a frame, the driver does its thing and then puts the result on screen. Even if that takes 1-2 ms it doesn't really matter, because without vsync there is no vblank period to miss (missing one is what adds lag, since you have to wait for the next vblank to avoid tearing), so it can just swap the buffers as soon as it's done. In practice it's probably faster, and maybe even free if done in hardware, but I'm not sure if it is.
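
          A rough sketch of that presumed flow, assuming the driver can estimate which scanline the flip will land on (e.g. from the current scanout position); the function and the `tear_line` estimate are invented for illustration and are not Intel's actual implementation:

          ```c
          #include <stdint.h>
          #include <stdlib.h>

          /* Illustrative sketch: cross-fade the scanlines around the
           * estimated tear position between the previous and current
           * frame, so the discontinuity becomes a gradient instead of
           * a hard edge. Runs after the game submits the frame and
           * before the buffer swap. */
          static void smooth_tear(uint8_t *cur, const uint8_t *prev,
                                  int height, int stride,
                                  int tear_line, int band)
          {
              for (int y = tear_line - band; y <= tear_line + band; y++) {
                  if (y < 0 || y >= height)
                      continue;

                  /* Weight of the previous frame: 100% at the tear line,
                   * fading out toward the edges of the band. */
                  int w = 255 - (abs(y - tear_line) * 255) / (band + 1);

                  uint8_t *c = cur + y * stride;
                  const uint8_t *p = prev + y * stride;
                  for (int x = 0; x < stride; x++)
                      c[x] = (uint8_t)((p[x] * w + c[x] * (255 - w)) / 255);
              }
          }
          ```

          The key point from the post is the ordering: the blend runs before the swap, and since nothing is waiting on a vblank deadline, the millisecond or so it costs shows up as slightly later pixels, not as a whole extra frame of lag.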



          • #15
            Originally posted by sarmad View Post

            Add to that the fact that the requirements for adaptive sync are very strict and you'll realize that it's totally not worth the headache and the money. The monitor has to support it, the computer has to support the same sync technology (G-Sync vs. adaptive sync), and the connection has to be made strictly over DisplayPort, or over HDMI 2.1, which gaming monitors generally don't support. And then on the software side, who knows what else is needed to get it to work. There are just too many places across the chain where things can go wrong. It's just not worth it until it becomes a standardized mainstream technology that works across all vendors.
            Adaptive sync requirements are not "strict".

            You can blame NVIDIA for taking so long to adopt the VESA Adaptive-Sync standard.

            The connection doesn't have to be made over DP or HDMI 2.1. FreeSync is an open standard and works over HDMI 1.4 etc., and besides that, almost every single gaming monitor supports DisplayPort (just use DisplayPort).

            On the software side, Linux, Windows, and macOS all support FreeSync or VESA Adaptive-Sync.

            Sure, there are a lot of places where things can go wrong, but it's been rock solid for the past couple of years.

            VRR over DisplayPort is standardized.

            Pretty much every laptop being produced now uses eDP, which supports FreeSync/Adaptive-Sync (the protocol, not necessarily the panel itself); the Steam Deck IIRC uses eDP, so it supports FreeSync (the protocol, not the panel itself); and the majority of Type-C display output uses DisplayPort. Standardization is not an issue; don't buy shitvidia. We are now seeing VESA Adaptive-Sync certifications for TVs, so you can bet those will be FreeSync/Adaptive-Sync compatible too.

            There are few excuses for not supporting FreeSync now, not when for many people it can be a massive game changer, and not only in games but also in the fluidity of watching media.
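
            On Linux you can verify the chain yourself: the kernel exposes a `vrr_capable` property on each DRM connector when the monitor advertises VRR support. A minimal libdrm sketch, assuming `/dev/dri/card0` is your GPU (adjust the index as needed) and building with `-ldrm`:

            ```c
            #include <fcntl.h>
            #include <stdint.h>
            #include <stdio.h>
            #include <string.h>
            #include <unistd.h>
            #include <xf86drm.h>
            #include <xf86drmMode.h>

            int main(void)
            {
                /* Needs read access to the DRM device node. */
                int fd = open("/dev/dri/card0", O_RDWR);
                if (fd < 0) { perror("open"); return 1; }

                drmModeRes *res = drmModeGetResources(fd);
                if (!res) { fprintf(stderr, "no KMS resources\n"); return 1; }

                for (int i = 0; i < res->count_connectors; i++) {
                    drmModeObjectProperties *props = drmModeObjectGetProperties(
                        fd, res->connectors[i], DRM_MODE_OBJECT_CONNECTOR);
                    if (!props)
                        continue;

                    /* Look for the connector's "vrr_capable" property. */
                    for (uint32_t j = 0; j < props->count_props; j++) {
                        drmModePropertyRes *prop =
                            drmModeGetProperty(fd, props->props[j]);
                        if (prop && !strcmp(prop->name, "vrr_capable"))
                            printf("connector %u: vrr_capable = %llu\n",
                                   res->connectors[i],
                                   (unsigned long long)props->prop_values[j]);
                        drmModeFreeProperty(prop);
                    }
                    drmModeFreeObjectProperties(props);
                }
                drmModeFreeResources(res);
                close(fd);
                return 0;
            }
            ```

            This is the same property that shows up under `xrandr --props` on X11; actually switching VRR on is then up to the compositor, via the CRTC's `VRR_ENABLED` property.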



            • #16
              Can someone explain what screen tearing is to me? I've been using low-end stuff since 2003: Windows XP DirectX 8 and 9 games, Debian 7 open-source games, some games on Windows 7, and the same XP games via Wine on FreeBSD. I never thought the graphics looked bad. Do I just not care about graphical fidelity that much, or have I never experienced screen tearing?



              • #17
                Originally posted by kylew77 View Post
                Can someone explain what screen tearing is to me? I've been using low-end stuff since 2003: Windows XP DirectX 8 and 9 games, Debian 7 open-source games, some games on Windows 7, and the same XP games via Wine on FreeBSD. I never thought the graphics looked bad. Do I just not care about graphical fidelity that much, or have I never experienced screen tearing?
                On my recent screen tearing test video I got some comments where people were not sure whether they have screen tearing or not. I decided to make a video showing ...



                • #18
                  Thank You!



                  • #19
                    Originally posted by rabcor View Post
                    For all the hype around adaptive sync, I've never actually gotten to use it. Every time I went out to buy a laptop or monitor, products without it won out by being better products on a lower budget more often than not. I wonder if that's still the same, or if the next monitor or laptop I buy will actually have it.
                    Hmmm. By having an adaptive display you are actually unlocking untapped performance through the elimination of virtually all stutter. I watched my old GTX 1070 deliver a lot more performance compared to being gimped by a fixed-refresh display. Image fidelity is also greatly increased by much more consistent motion and a much higher degree of fluidity. Adaptive display technologies without question increase overall fidelity; it's also my own experience that they eliminate the need for a compositor.

                    Also, compositors gimp performance.

                    If you game and update your display to G-Sync/FreeSync, whatever you want to call it, you are also unlocking performance; that's what most people don't understand.

                    I was a non-believer until I started using it wherever possible, which is virtually everywhere except certain games that are locked at 60 FPS.

                    That is not to say that what you have is not good, but what I just posted about is by far superior. I'm rocking a Dell S2721DGF and it is sublime.

                    If you get anything recent, even budget, just research what you are buying and whether it supports G-Sync or FreeSync; both are great.
                    Last edited by creative; 27 August 2022, 10:42 AM.



                    • #20
                      Originally posted by Quackdoc View Post

                      Adaptive sync requirements are not "strict". ...
                      Indeed.

