Wayland Protocols 1.30 Introduces New Protocol To Allow Screen Tearing


  • #81
    Originally posted by drake23:
    As an fps gamer, I think this is very nice.

    One thing I am still missing, though: with X.Org I can globally enable triple buffering for OpenGL games (or rather everything) via xorg.conf (Option "TearFree" "true"). While I understand that triple buffering adds minimal latency, I think this is the very best compromise (when no VRR display is at hand and you're not a pro gamer^^). It would be absolutely fantastic if one day one could force triple buffering with Wayland for OpenGL/Vulkan games.
    This is how Wayland compositors already work.
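
    For reference, the option quoted above goes in the Device section of xorg.conf. A minimal sketch, where the identifier and driver name are just examples (TearFree is offered by, e.g., the amdgpu, ati and intel DDX drivers):

        Section "Device"
            Identifier "GPU0"
            Driver     "amdgpu"
            Option     "TearFree" "true"
        EndSection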



    • #82
      Originally posted by doomie:
      Very much this. I am not surprised at how many people don't want screen tearing... in fact I bet grandma doesn't notice either scenario, but what absolutely shocks me are the people saying, "you don't need that" and who literally don't want you to have a choice. "If I don't like chocolate, YOU SHOULDN'T EVER BE ALLOWED TO HAVE IT, now go put your perfectly good hardware in a landfill and buy all new stuff for $2k+"... they actually seem angry that anyone has a different experience. "It's the fault of the game engines", and those games will never be patched to fix any of it, but I suspect some of these people simply don't play the kinds of games we do, if any at all. It ain't just FPS, it's action/skill stuff in general, including sidescrollers.

      This control-freak/gatekeeping mentality being so common is always hard for me to fully accept. It's so irrational, and you see it in politics all the time. I guess utopians really do believe their vision is best for EVERYONE, and never listen long enough to notice that even the people who agree with them on a policy have different overall visions. Put 'em on an island and they'd eventually literally kill each other off over what color loincloths should be allowed, because they're "best for everyone". Does every minuscule aspect of one's life have to be publicly mandated? TEARING IS OPTIONAL UNDER THIS SPEC, AND NOT EVEN DEFAULT. But it's happening anyway, so you may as well fixate on something new.

      EDIT: Speaking of fixation, I think I answered my own question: it seems to be a kind of hardcore idealism (lack of real-world life experience?) that won't be realistic about unsolvable problems (why are human beings corrupt/corruptible?), so "this isn't the right/efficient way" can never be compromised on for the sake of facing practical realities. In such a person's mind, "all 'problems' are solvable, just use policy", but the definitions thereof are so unwittingly arbitrary. Stop trying to FORCE the ideal in your mind to happen in real life. There are legitimate reasons it can't, or shouldn't (or at least can't or shouldn't YET), that you haven't considered. I know you're a dreamer and believe in hope and change, but you're sacrificing a lot of valuable things to something you yourself haven't completely fleshed out or tested. This applies to the original topic as well. Thanks for reading.
      Tearing sucks. And it sucks even more if it is optional.
      The simple reason is that this is not going to fly on some DEs that want to offer a modern experience competitive with Windows/macOS.
      So forget uniform Linux DE behavior (with KDE offering the bad behavior...) or reasonably well-tested Linux behavior (only the most popular DE gets tested, if at all).
      Lack of consistency/predictability results in lack of adoption.

      ...or are you really suggesting we need like 600 active Linux distributions?



      • #83
        Originally posted by doomie:

        Very much this. I am not surprised at how many people don't want screen tearing... [doomie's full post, quoted in #82 above]
        So much BS in a single post... Look, pal, no one said you should not be allowed to have what you want. Of course, no one should be forced to offer it to you for free either; get your IDE ready and write the code yourself if you want X so badly, or pay someone to do it for you. But people are free to tell you "you don't need that" when discussing the feature and its usefulness, and most of the time they are right to, if they provide solid arguments for their case. Calling them "control freaks" says more about you than about them.

        As for a feature being optional and not even default: even optional non-default features increase the size of the code base and introduce more maintenance burden, and perhaps even a performance burden. Since open source resources are limited, most people don't want them to be spent on niche optional features that 99% of people don't really need. That doesn't make them control freaks just because you, for some reason, think you need that feature. You are the one acting irrationally by writing stuff like this; empathy is not your forte, it seems.



        • #84
          Originally posted by Weasel:
          Yea and 30 fps is more than the eye can see! Certified console peasant.
          1) The brain sees, not the eye. The image in front of you is actually constructed inside your brain; it is not raw from your eye "sensor". Your brain already does heavy image/video processing on it without you spending any effort or realizing it.

          2) Your eye can actually record extremely high "fps" in raw form. BUT your brain cannot process that much and spot the differences, because the brain doesn't spend energy wastefully; there is no reason to attempt to differentiate between changes at 10000 fps, for example. Of course, all this depends on the brain and its health/age. Younger, healthier people have better processing capability/energy, so they tend to be more sensitive to higher fps. Practice also improves it: people who constantly play high-fps games adapt to processing higher fps better than people who do not. There have actually been many scientific studies on this.

          3) The motion effect of video and video games is actually produced in your brain. Your eye just receives the still images in raw form, but if they change quickly enough, your brain blends them and creates the impression that what you are watching is live and continuous. This effect can be achieved at very low fps; even 20 can do it, and 24-30 are enough, which is why they are used in film and TV. Some movies use 60 and many people can notice the difference, myself included, but honestly it is not that much of a difference. After a certain point, the difference is very hard to notice. Going from 200 fps to 300 fps, for example, won't produce any noticeable effect for the vast majority of people. Even going from 60 fps to 100 fps will most of the time not do much.

          4) The reason you "feel" low fps more when you are playing video games is input lag, as another poster wrote previously. The higher the fps, the lower the latency between your input and the change on screen. Since you instantly "know and feel" your keyboard/mouse input, if that delay is significant you notice it and the game feels "jerky". This is not about what the eye can see per se (see the rough numbers after point 6).

          5) Having constant fps is more important than having high fps. Your brain can adapt to a certain fps level and feel comfortable after a while. But sudden dips in fps ruin the flow and are noticeable. A constant 30 fps, all the time, is far better than fps that hovers around 45, sometimes dipping to 30, sometimes hitting 60. You are going to notice the volatility and imbalance.

          6) Console players are actually the kings; PC players are the peasants. Not only do they get inferior console ports most of the time, and no exclusives, but they are slaves to the idea that they have to purchase $5k PCs in order to play the same games at the same settings at absurd fps levels, and they have an inferiority complex about just using more "lowly", efficient hardware, playing at lower resolutions, disabling settings with a bad performance/quality ratio, and being fine with a 30 fps lock.
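
          To put rough numbers on point 4 (a back-of-the-envelope sketch that ignores render, compositor and display latency): one frame takes 1000/fps milliseconds, so

              30 fps  -> ~33.3 ms per frame
              60 fps  -> ~16.7 ms
              144 fps -> ~6.9 ms
              240 fps -> ~4.2 ms

          On top of that, vsync can hold a finished frame back for up to one full refresh interval while it waits for the next vblank, which is exactly the queueing delay that allowing tearing (immediate page flips) avoids.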



          • #85
            Looks like the KDE devs are trying hard to transform Wayland into X11. First server-side decorations, now this. Absolute positioning, keyboard hooks, and drawing extensions to go.



            • #86
              Originally posted by Khrundel:
              Looks like the KDE devs are trying hard to transform Wayland into X11. First server-side decorations, now this. Absolute positioning, keyboard hooks, and drawing extensions to go.
              Wayland has been playing catch-up for a long time to become usable for many people, sadly.



              • #87
                That feature was long overdue.
                While I do agree tearing isn't very desirable, VSync is (to me) noticeably worse. To put it simply, in most cases I can't aim sh*t with VSync on.
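
                For anyone curious what this looks like at the code level: the protocol is tiny. Below is a minimal client-side sketch in C, assuming libwayland plus a header generated by wayland-scanner from the tearing-control-v1 XML (the header and function names follow the scanner's usual naming; the exact file name depends on your build setup). It is only a hint, and the compositor is free to keep vsyncing:

                #include <stdio.h>
                #include <string.h>
                #include <wayland-client.h>
                /* Generated by wayland-scanner from tearing-control-v1.xml;
                   the header name here is an assumption about your build. */
                #include "tearing-control-v1-client-protocol.h"

                static struct wp_tearing_control_manager_v1 *tearing_manager;

                static void registry_global(void *data, struct wl_registry *registry,
                                            uint32_t name, const char *interface,
                                            uint32_t version)
                {
                    /* Bind the tearing-control manager if the compositor advertises it. */
                    if (strcmp(interface, wp_tearing_control_manager_v1_interface.name) == 0)
                        tearing_manager = wl_registry_bind(
                            registry, name, &wp_tearing_control_manager_v1_interface, 1);
                }

                static void registry_global_remove(void *data, struct wl_registry *registry,
                                                   uint32_t name) {}

                static const struct wl_registry_listener registry_listener = {
                    .global = registry_global,
                    .global_remove = registry_global_remove,
                };

                /* Hint that tearing page flips are acceptable on `surface`
                   (e.g. a fullscreen game window). Vsync stays the default. */
                void allow_tearing(struct wl_surface *surface)
                {
                    struct wp_tearing_control_v1 *tc =
                        wp_tearing_control_manager_v1_get_tearing_control(tearing_manager,
                                                                          surface);
                    wp_tearing_control_v1_set_presentation_hint(
                        tc, WP_TEARING_CONTROL_V1_PRESENTATION_HINT_ASYNC);
                }

                int main(void)
                {
                    struct wl_display *display = wl_display_connect(NULL);
                    if (!display)
                        return 1;
                    struct wl_registry *registry = wl_display_get_registry(display);
                    wl_registry_add_listener(registry, &registry_listener, NULL);
                    wl_display_roundtrip(display); /* collect advertised globals */

                    printf("tearing-control-v1 %ssupported\n",
                           tearing_manager ? "" : "not ");
                    /* A real client would create its wl_surface here and call
                       allow_tearing() on it. */
                    wl_display_disconnect(display);
                    return 0;
                }

                ASYNC means "tearing is acceptable on this surface"; sending VSYNC (the protocol's default) switches back, so a game could toggle the hint from a settings menu.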



                • #88
                  Originally posted by Myownfriend:

                  No I'm not. Most people who play video games, FPS or otherwise, aren't nerds on Reddit or Phoronix. They don't know what V-sync is and just play their games to enjoy them.
                  Probably doesn't really matter for casual/singleplayer games, whether it's FPS or otherwise, but for competitive games v-sync (or any other *sync) is one of the first things to disable.

                  As for 240 Hz displays, I personally have no idea why one would buy one for anything other than playing competitive games.



                  • #89
                    Originally posted by TemplarGR:

                    1) The brain sees, not the eye... [TemplarGR's full six-point post, quoted in #84 above]
                    Very good summary. Just one thing to add to point 3: the ~24-25 fps mark in movies was also an economic decision. It was enough to give your brain an almost fluid stream of images, though not to an optimal extent, but adding more frames also meant more film stock. Earlier movies had far fewer frames, making them appear "stuttery", but the result is still basically perceived as motion.

                    And as you have described very well, the fps recognition threshold varies from person to person. Some may not see a difference between 50 and 70 fps, while some easily spot the difference between 120 and 140. I'm a high-fps guy; if my wallet allowed it, I would consider anything below 80 fps stuttery... but graphics card prices.



                    • #90
                      Originally posted by mantide:

                      You are talking about a whole different topic.
                      Different topic? The person you quoted was talking about the AA issue. Read before you quote.

                      Let me help you:

                      “This is the same mindset as Gnome devs refusing to implement subpixel font AA in GTK 4.0, assuming HiDPI screens are gaining popularity, while in reality the vast majority are still on non-HiDPI screens.”

                      That was part of the post YOU quoted. So no, I'm not talking about a different topic.

