Wayland Protocols 1.30 Introduces New Protocol To Allow Screen Tearing

  • #91
    Originally posted by Myownfriend View Post
    Learn the difference between someone stating information about the market and someone either expressing their own opinion or making statements about a technology. Just because you don't use v-sync doesn't mean that no one uses it anymore and that no one notices tearing. Just because I like to play games at UHD doesn't mean that no one plays games at lower resolutions anymore. Just because some people play their games with low settings so they can try to max out their frame rate, that doesn't mean that nobody plays at high settings anymore. Just because we play games on our PCs does not mean that no one uses consoles anymore and that we're better than them.

    And just because a majority of people aren't playing at 4K 120 with VRR doesn't mean that it's worse than 720p 30 with a fixed fps and v-sync. If you thought that's what I was implying then it's you who is illiterate.

    Drop the "PC master race" attitude. It reeks of little dick energy and ignorance, just like the original people to call themselves "the master race".
    Yeah, except they can be literally blind. Some people think 30 fps is fine... until they game at 60 fps for a while. Heck, play on a 240 Hz monitor (with 240 fps, obviously) for a couple of days and you won't be able to go back to 60 Hz because it will feel sluggish and stuttery. And this isn't even counting vsync, I'm even talking 60 Hz with vsync turned off. If you add vsync on top, oh boy oh boy!

    People who have never experienced the things they talk about for long enough have no idea what it's like, and hence where the meme was born. It doesn't matter what they game at. It doesn't matter if they have vsync on. Their experience is terrible, but they don't know it, because they don't know any better. That's not an indication of "they're fine with it". They're ignorant.

    Comment


    • #92
      Originally posted by CochainComplex View Post

      Very good summary. Just adding one thing to point 3: the ~24-25 fps mark in movies was also an economical decision. It was enough to give your brain an almost fluent stream of pictures, but not to an optimal extent. Adding more frames also means more film stock. Earlier movies had far fewer frames, making them appear "stuttery", but it is basically still perceived as motion.

      And as you have described very well, the fps recognition threshold varies from person to person. Some may not see a difference between 50 vs 70 fps, but some easily spot a difference between 120 vs 140. I'm a high-fps guy. If my wallet allowed it, I would consider anything below 80 fps stuttery... but graphics card prices.
      It is a troll post, not a summary.

      Anyway, movies at 24 fps are fine because they add motion blur, meaning fast movements end up blurred and we don't notice issues particularly with 24 fps. It is also a rock-solid 24 fps without any variance - literally the best-case scenario for fluidity.

      If you play a game at a fixed 24 fps with fast movement and no motion blur, you will see that it is stuttery, because most games present each image sharp, without blur. It is especially noticeable on a fast display with fast pixel response times; the average TV does not have fast pixel response times.
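
      To put rough numbers on it (just back-of-the-envelope arithmetic with assumed values, nothing measured from a real game): a pan that crosses a 1920-pixel-wide screen in two seconds jumps about 40 pixels per frame at 24 fps versus roughly 7 pixels at 144 fps, and without motion blur every one of those jumps is a sharp, discrete step.

      Code:
      /* back-of-the-envelope: per-frame jump of a 2-second, screen-wide pan */
      #include <stdio.h>

      int main(void) {
          const double width_px = 1920.0;               /* assumed horizontal resolution */
          const double pan_seconds = 2.0;               /* assumed time for the pan to cross the screen */
          const double rates[] = { 24.0, 60.0, 144.0 };

          for (int i = 0; i < 3; i++) {
              double px_per_frame = width_px / (pan_seconds * rates[i]);
              printf("%3.0f fps: image jumps %4.1f px every %4.1f ms\n",
                     rates[i], px_per_frame, 1000.0 / rates[i]);
          }
          return 0;
      }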

      Something from me: I was once using Lenovo laptops that defaulted to an idiotic Intel setting of lowering the refresh rate (to 50 Hz or even below) to save battery life. I was aware something was not fast, but I assumed it was just pixel response time, etc. What I noticed consistently, though, was that after sitting in front of that laptop for 2-3 hours I would get headaches. Once I turned that option off and defaulted to at least 60 Hz, things improved a lot, and once I overclocked the display to 70 Hz, my headaches were gone entirely.
      Last edited by piotrj3; 23 November 2022, 10:29 AM.

      Comment


      • #93
        Originally posted by piotrj3 View Post
        Anyway, movies at 24 fps are fine because they add motion blur, meaning fast movements end up blurred and we don't notice issues particularly with 24 fps. It is also a rock-solid 24 fps without any variance - literally the best-case scenario for fluidity.
        Exactly, even the camera needs to be panned extra slowly to hide the low frame rate.

        I tried to replay Doom and Doom 2 with their original engine (35 FPS, extreme aliasing and boxy pixels) on an LCD. I couldn't do it for even 5 minutes; the spatial/motion blur of a CRT did a lot for us back in the day. With a source port at 144 Hz, full resolution and AA, it was a pleasure to play.

        Comment


        • #94
          Originally posted by Quackdoc View Post

          This is how Wayland compositors already work.
          Really? I was under the impression that Ubuntu had to go quite a long way to get it into "their" GNOME (not upstreamed yet), and I must admit I don't know how KWin does this under Wayland. But if this is true: does this triple buffering carry over to launched fullscreen applications (i.e. games)? Thanks in advance.

          Comment


          • #95
            Originally posted by ffs_ View Post
            Probably doesn't really matter for casual/singleplayer games, whether it's FPS or otherwise, but for competitive games v-sync (or any other *sync) is one of the first things to disable.

            As for 240 Hz displays, I personally have no idea why one would buy one for anything other than playing competitive games.
            Yes, this is because VSync always adds latency; that's what double/triple buffering does: completed frames are held back until the next vblank (one extra frame in flight with double buffering, up to two with triple buffering), and waiting for the flip stalls the pipeline. With VSync disabled, the GPU can just spit out frames as fast as it generates them. This also happens irrespective of how multithreaded/parallel your game engine is; presenting frames to the display in order is fundamentally a serial operation.

            I mean, this is why G-Sync/FreeSync were created; although they also add input lag, it's much lower than with VSync.
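
            For what it's worth, in practice "disabling VSync" usually just means setting the swap interval to 0 so the swap call no longer blocks on the vblank. A minimal sketch (assuming SDL2 with an OpenGL context; the window title and loop length are arbitrary):

            Code:
            /* sketch: swap interval controls vsync; 0 = off (may tear), 1 = on, -1 = adaptive */
            #include <SDL.h>

            int main(void) {
                SDL_Init(SDL_INIT_VIDEO);
                SDL_Window *win = SDL_CreateWindow("swap interval demo",
                    SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480,
                    SDL_WINDOW_OPENGL);
                SDL_GLContext ctx = SDL_GL_CreateContext(win);

                if (SDL_GL_SetSwapInterval(-1) != 0)  /* try adaptive vsync first */
                    SDL_GL_SetSwapInterval(0);        /* fall back to vsync off for lowest latency */

                for (int i = 0; i < 600; i++) {
                    /* ... render the frame here ... */
                    SDL_GL_SwapWindow(win);  /* with interval 1 this would block until vblank */
                }

                SDL_GL_DeleteContext(ctx);
                SDL_DestroyWindow(win);
                SDL_Quit();
                return 0;
            }
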
            Last edited by mdedetrich; 23 November 2022, 06:09 PM.

            Comment


            • #96
              Originally posted by mdedetrich View Post
              I mean, this is why G-Sync/FreeSync were created; although they also add input lag, it's much lower than with VSync.
              VRR doesn't add lag: https://youtu.be/L42nx6ubpfg?t=848

              Comment


              • #97
                Originally posted by aufkrawall View Post
                Good to know. I read in the early days, when it came out, that it added some negligible input lag, but I didn't look further into it. This makes VSync look comparatively even worse, then.

                Comment


                • #98
                  Originally posted by drake23 View Post

                  Really? I was under the impression that Ubuntu had to go quite a long way to get it into "their" GNOME (not upstreamed yet), and I must admit I don't know how KWin does this under Wayland. But if this is true: does this triple buffering carry over to launched fullscreen applications (i.e. games)? Thanks in advance.
                  Correct. What happens is that the Wayland compositor will only ever present full frame refreshes to the display; the only alternative to this (in the past) was DRM leasing.
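
                  Very roughly, the repaint loop looks like this (purely illustrative; the helper functions below are hypothetical stand-ins for the compositor's internals, not a real API):

                  Code:
                  /* illustrative only: a compositor presents one complete, composited frame per refresh */
                  #include <stdio.h>

                  static void composite_all_clients(void) { /* blend every client buffer into one framebuffer */ }
                  static void schedule_page_flip(void)     { puts("commit one full frame"); }
                  static void wait_for_vblank(void)        { /* the flip completes at the next display refresh */ }

                  int main(void) {
                      for (int refresh = 0; refresh < 3; refresh++) {
                          composite_all_clients();
                          schedule_page_flip();   /* whole-screen update: nothing partial ever hits the screen */
                          wait_for_vblank();      /* so clients cannot tear unless they bypass the compositor */
                      }
                      return 0;
                  }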

                  Comment


                  • #99
                    Interesting that so many here think Triple buffering adds latency. It actually does not.

                    Triple buffering simply means the game does not need to wait for the previous frame to finish displaying before it starts on the next one. So we have three buffers to work with, yes? One is actively being written to, one is being scanned out to the screen, and one is waiting to be displayed, yes?

                    Here comes the beauty of it all: if the game finishes a frame before the waiting frame has been sent to the screen, it marks the new frame as the one to display and immediately starts drawing into the old buffer. This means the frame that actually gets displayed can be as fresh as a millisecond or two at the time it is shown.

                    This makes triple buffering lower latency overall: as fast as double buffering in the worst case (one frame of latency, or 16.666... ms at 60 Hz), and around a millisecond in the best case.

                    Of course, to get down to about a millisecond you need the game to draw around 900-1000 FPS. Not impossible, but not easy either.
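
                    A toy simulation of that idea (all numbers are assumptions picked for illustration - a 60 Hz display and a game rendering at ~900 FPS - not measurements of any real driver):

                    Code:
                    /* toy model of "present the newest completed frame at each vblank" */
                    #include <stdio.h>

                    int main(void) {
                        const double vblank_period = 1000.0 / 60.0;  /* 60 Hz display, in ms */
                        const double render_time   = 1000.0 / 900.0; /* game renders at ~900 FPS */

                        double frame_start  = 0.0;                   /* frame currently being drawn */
                        double frame_done   = render_time;
                        double newest_start = 0.0;                   /* start time of the newest finished frame */

                        for (int v = 1; v <= 5; v++) {
                            double vblank = v * vblank_period;
                            /* keep rendering into the spare buffers until the vblank arrives */
                            while (frame_done <= vblank) {
                                newest_start = frame_start;
                                frame_start  = frame_done;
                                frame_done  += render_time;
                            }
                            /* the flip picks the newest finished frame; its content is this old: */
                            printf("vblank %d: presented frame is %.2f ms old\n",
                                   v, vblank - newest_start);
                        }
                        return 0;
                    }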

                    Comment


                    • Originally posted by wertigon View Post
                      Interesting that so many here think Triple buffering adds latency. It actually does not.

                      Triple buffering simply means the game does not need to wait for the previous frame to finish displaying before it starts on the next one. So we have three buffers to work with, yes? One is actively being written to, one is being scanned out to the screen, and one is waiting to be displayed, yes?

                      Here comes the beauty of it all: if the game finishes a frame before the waiting frame has been sent to the screen, it marks the new frame as the one to display and immediately starts drawing into the old buffer. This means the frame that actually gets displayed can be as fresh as a millisecond or two at the time it is shown.

                      This makes triple buffering lower latency overall: as fast as double buffering in the worst case (one frame of latency, or 16.666... ms at 60 Hz), and around a millisecond in the best case.

                      Of course, to get down to about a millisecond you need the game to draw around 900-1000 FPS. Not impossible, but not easy either.
                      It does add latency, but only ever one frame of latency in the worst case. The latency will always be worse than flipping the most recently drawn frame to the screen immediately, but like I said, at worst this means one frame of latency.

                      Comment
