Wayland Protocols 1.30 Introduces New Protocol To Allow Screen Tearing
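
For the curious, this is roughly what opting into tearing looks like on the client side with the new tearing-control-v1 staging protocol. The interface and enum names come from the protocol XML as generated by wayland-scanner; the header file name, registry glue and surface handling shown here are assumptions, and the compositor is always free to keep presenting with vsync.

```c
/* Sketch: a Wayland client asking for tearing via tearing-control-v1.
 * Assumes the client header generated by wayland-scanner from the protocol
 * XML; connection setup, roundtrips and surface creation are omitted. */
#include <string.h>
#include <wayland-client.h>
#include "tearing-control-v1-client-protocol.h" /* generated header (name may differ) */

static struct wp_tearing_control_manager_v1 *tearing_manager;

static void handle_global(void *data, struct wl_registry *registry,
                          uint32_t name, const char *interface, uint32_t version)
{
    (void)data; (void)version;
    /* Bind the manager only if the compositor advertises it. */
    if (strcmp(interface, wp_tearing_control_manager_v1_interface.name) == 0)
        tearing_manager = wl_registry_bind(registry, name,
                                           &wp_tearing_control_manager_v1_interface, 1);
}

static void handle_global_remove(void *data, struct wl_registry *registry, uint32_t name)
{
    (void)data; (void)registry; (void)name; /* not needed for this sketch */
}

static const struct wl_registry_listener registry_listener = {
    .global = handle_global,
    .global_remove = handle_global_remove,
};

/* Hint that a surface prefers async (tearing) presentation. The hint is
 * double-buffered state, so it only applies on the next wl_surface.commit,
 * and the compositor may still present with vsync (e.g. when not fullscreen). */
static void prefer_tearing(struct wl_surface *surface)
{
    if (!tearing_manager)
        return; /* protocol not supported by this compositor */

    struct wp_tearing_control_v1 *tc =
        wp_tearing_control_manager_v1_get_tearing_control(tearing_manager, surface);
    wp_tearing_control_v1_set_presentation_hint(
        tc, WP_TEARING_CONTROL_V1_PRESENTATION_HINT_ASYNC);
    wl_surface_commit(surface);
}
```

Whether frames actually tear is still the compositor's call; the protocol only expresses a preference per surface.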


  • mdedetrich
    replied
    Originally posted by aufkrawall View Post
    VRR doesn't add lag: https://youtu.be/L42nx6ubpfg?t=848
    Good to know. I read in the early days, when it came out, that it did add some negligible input lag, but I didn't look further into it. This makes VSync look even worse by comparison, then.



  • aufkrawall
    replied
    Originally posted by mdedetrich View Post
    I mean this is why gsync/freesync were created; although they also add input lag, it's much lower than vsync.
    VRR doesn't add lag: https://youtu.be/L42nx6ubpfg?t=848



  • mdedetrich
    replied
    Originally posted by ffs_ View Post
    Probably doesn't really matter for casual/singleplayer games, whether it's FPS or otherwise, but for competitive games v-sync (or any other *sync) is one of the first things to disable.

    As for 240 Hz displays, I personally have no idea why one would buy one for anything other than playing competitive games.
    Yes, this is because VSync always adds latency; that's what double/triple buffering does: it holds the completed frame (or the last two, with triple buffering) until the display is ready for it, and doing this stalls the pipeline. With VSync disabled the GPU can just spit out frames as fast as it generates them. This happens irrespective of how multithreaded/parallel your game engine is; double/triple buffering is fundamentally a serial operation.

    I mean this is why gsync/freesync were created; although they also add input lag, it's much lower than vsync.
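
    To put that in code terms, this is roughly the knob a typical SDL2/OpenGL game exposes for this choice. SDL_GL_SetSwapInterval is the real API call; the wrapper function and the omitted window/context setup are just a sketch:

```c
/* Rough sketch of the usual presentation modes in an SDL2/OpenGL game.
 * Window and GL context creation are omitted. */
#include <SDL2/SDL.h>

void set_presentation_mode(int vsync_enabled)
{
    if (vsync_enabled) {
        /* -1 = adaptive vsync where supported, otherwise fall back to
         * regular vsync (the swap waits for vblank and stalls the pipeline). */
        if (SDL_GL_SetSwapInterval(-1) != 0)
            SDL_GL_SetSwapInterval(1);
    } else {
        /* 0 = immediate presentation: frames go out as soon as they are
         * rendered, which can tear but minimizes input-to-photon latency. */
        SDL_GL_SetSwapInterval(0);
    }
}
```
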
    Last edited by mdedetrich; 23 November 2022, 06:09 PM.



  • drake23
    replied
    Originally posted by Quackdoc View Post

    This is how Wayland compositors already work.
    Really? I was under the impression that Ubuntu had to go quite a long way to get it into "their" GNOME (not upstreamed yet), and I must admit I don't know how KWin does this under Wayland. But if this is true: does this triple buffering carry over to launched fullscreen applications (aka games)? Thanks in advance.



  • Anux
    replied
    Originally posted by piotrj3 View Post
    Anyway, movies at 24 fps are fine because they add additional motion blur. That means we end up with blur in fast movements, so we don't particularly see an issue with 24 fps. Also, it is rock-solid 24 fps without any variance - literally the best-case scenario for fluidity.
    Exactly, even the camera needs to be panned extra slowly to hide the low frame rate.

    I tried to replay Doom and Doom 2 with the original engine (35 FPS, extreme aliasing and boxy pixels) on an LCD. I couldn't do it for even 5 minutes; the spatial/motion blur of a CRT did a lot for us back in the day. With a source port at 144 Hz and full resolution + AA it was a pleasure to play.



  • piotrj3
    replied
    Originally posted by CochainComplex View Post

    Very good summary. Just adding one thing to point 3: the ~24-25 fps mark in movies was also an economic decision. It was enough to give your brain an almost fluent picture stream, but not to an optimal extent. But adding more frames also means more film stock. Earlier movies had far fewer frames, making them appear "stuttery", but it is basically still perceived as motion.

    And as you have described very well, the fps recognition threshold varies from person to person. Some may not see a difference between 50 and 70 fps, but some easily spot a difference between 120 and 140. I'm a high-fps guy. If my pocket allowed it, I would consider anything below 80 fps as stuttery... but graphics card prices.
    It is a troll post, not a summary.

    Anyway, movies at 24 fps are fine because they add additional motion blur. That means we end up with blur in fast movements, so we don't particularly see an issue with 24 fps. Also, it is rock-solid 24 fps without any variance - literally the best-case scenario for fluidity.

    If you play a game at a fixed 24 fps with fast movement and without any motion blur, you will see it is stuttery, because most games present the image sharp, without blur. It is especially noticeable on a fast display with fast pixel response times. The average TV does not have fast pixel response times.

    Something from me: I was once using Lenovo laptops that defaulted to an idiotic Intel setting of decreasing the refresh rate (50 Hz or even below) to save battery life. I was aware something was not fast, but I assumed it was just pixel response time etc. What I noticed consistently was that after sitting for 2-3 hours in front of that laptop I was getting headaches. Once I turned off that option and defaulted to at least 60 Hz, things improved a lot, and once I overclocked the display to 70 Hz, all my headaches were gone.
    Last edited by piotrj3; 23 November 2022, 10:29 AM.



  • Weasel
    replied
    Originally posted by Myownfriend View Post
    Learn the difference between someone stating information about the market and someone either expressing their own opinion or making statements about a technology. Just because you don't use v-sync doesn't mean that no one uses it anymore and that no one notices tearing. Just because I like to play games at UHD doesn't mean that no one plays games at lower resolutions anymore. Just because some people play their games with low settings so they can try to max out their frame rate, that doesn't mean that nobody plays at high settings anymore. Just because we play games on our PCs does not mean that no one uses consoles anymore and that we're better than them.

    And just because a majority of people aren't playing at 4K 120 with VRR doesn't mean that it's worse than 720p 30 with a fixed fps and v-sync. If you thought that's what I was implying, then it's you who is illiterate.

    Drop the "PC master race"attitude. It reeks of little dick energy and ignorance just like the original people to call themselves "the master race".
    Yeah, except they can be literally blind. Some people think 30 fps is fine... until they game at 60 fps for a while. Heck, play on a 240 Hz monitor (with 240 fps, obviously) for a couple of days and you won't be able to go back to 60 Hz, because it will feel sluggish and stuttery. And this isn't even counting vsync; I'm talking about 60 Hz with vsync turned off. If you add vsync on top, oh boy oh boy!

    People who have never experienced the things they talk about for long enough have no idea what it's like, and hence that's where the meme was born. It doesn't matter what they game at. It doesn't matter if they have vsync on. Their experience is terrible, but they don't know it, because they don't know any better. That's not an indication that "they're fine with it". They're ignorant.



  • Vistaus
    replied
    Originally posted by mantide View Post

    You are talking about a whole different topic.
    Different topic? The person you quoted was talking about the AA issue. Read before you quote.

    Let me help you:

    “This is the same mindset as GNOME devs refusing to implement subpixel font AA in GTK 4.0, assuming HiDPI screens are gaining popularity, while in reality the vast majority are still on non-HiDPI screens.”

    That was part of the post YOU quoted. So no, I'm not talking about a different topic.



  • CochainComplex
    replied
    Originally posted by TemplarGR View Post

    1) The brain sees, not the eye. The image in front of you is actually constructed inside your brain. It is not raw from your eye "sensor". Your brain already does heavy image/video processing on it without you spending any effort or realizing it.

    2) Your eye can actually record extremely high "fps" in raw form. BUT your brain cannot process that much and spot the differences, because the brain doesn't spend energy wastefully; there is no reason to attempt to differentiate between changes at 10000 fps, for example. Of course all this depends on the brain and its health/age. Younger, healthier people have better processing capability/energy, so they tend to be more sensitive to higher fps changes. Practice also improves it: people who play high-fps games constantly adapt to processing higher fps better than people who do not. There have actually been many scientific studies on this.

    3) The motion effect of video and video games is actually produced in your brain. Your eye just receives the still images in raw form, but if they change quickly enough, your brain mixes them and creates the idea that what you are watching is live and continuous. This effect can be achieved at very low fps; even 20 can do it. 25-30 are enough, and that is why they are used in movies. Some movies use 60 and many people can notice the difference, myself included, but honestly it is not that much of a difference. After a certain point though, the difference is very hard to notice. Going from 200 fps to 300 fps, for example, won't produce any noticeable effect for the vast majority of people. Even going from 60 fps to 100 fps will most of the time not do much.

    4) The reason you "feel" low fps more when you are playing video games, is input lag, like another poster wrote previously. The higher the fps, the lower the latency between your input and the change on screen. Since you instantly "know and feel" your input and movement of keyboard/mouse, if that delay is significant, you can notice it and the game can feel "jerky". This is not about what the eye can see per se.

    5) Having constant fps is more important than having high fps. Your brain can adapt to a certain fps level and feel comfortable after a time. But if there are sudden dips in fps, they ruin the flow and are noticeable. Constant 30 fps, all the time, is far better than non-constant fps that hovers around 45, sometimes 30, sometimes 60. You are going to notice the volatility and imbalance.

    6) Console players are actually the kings, PC players are the peasants. Not only do they get inferior console ports most of the time, and no exclusives, but they are slaves to the idea that they have to purchase 5k PCs in order to be able to play the same games at the same settings at absurd FPS levels, and have an inferiority complex with just using more "lowly" efficient hardware to play games at lower resolutions, disable bad performance/quality ratio settings, and just be fine with a 30fps lock.
    Very good summary. Just adding one thing to point 3: the ~24-25 fps mark in movies was also an economic decision. It was enough to give your brain an almost fluent picture stream, but not to an optimal extent. But adding more frames also means more film stock. Earlier movies had far fewer frames, making them appear "stuttery", but it is basically still perceived as motion.

    And as you have described very well, the fps recognition threshold varies from person to person. Some may not see a difference between 50 and 70 fps, but some easily spot a difference between 120 and 140. I'm a high-fps guy. If my pocket allowed it, I would consider anything below 80 fps as stuttery... but graphics card prices.
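
    The arithmetic behind points 4 and 5 in the post quoted above is easy to write down (my own numbers, not from the quoted post): at 30 Hz a refresh takes ~33 ms, at 240 Hz about 4 ms, and a double-buffered vsync swap that just misses the vblank waits up to one extra refresh interval on top of that.

```c
/* Back-of-the-envelope numbers: the refresh interval at a given refresh
 * rate, which is also the worst-case extra wait a double-buffered vsync
 * swap adds when a frame just misses the vblank. */
#include <stdio.h>

int main(void)
{
    const double refresh_hz[] = { 24.0, 30.0, 60.0, 144.0, 240.0 };
    for (size_t i = 0; i < sizeof refresh_hz / sizeof refresh_hz[0]; i++) {
        double interval_ms = 1000.0 / refresh_hz[i];
        printf("%5.0f Hz: %5.1f ms per refresh, up to +%5.1f ms if vblank is missed\n",
               refresh_hz[i], interval_ms, interval_ms);
    }
    return 0;
}
```
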



  • ffs_
    replied
    Originally posted by Myownfriend View Post

    No I'm not. Most people who play video games, FPS or otherwise, aren't nerds on Reddit or Phoronix. They don't know what V-sync is and just play their games to enjoy them.
    Probably doesn't really matter for casual/singleplayer games, whether it's FPS or otherwise, but for competitive games v-sync (or any other *sync) is one of the first things to disable.

    As for 240 Hz displays, I personally have no idea why one would buy one for anything other than playing competitive games.

