Mesa 23.3 Lands Optional Support For Allowing Game Tearing On Wayland


  • rabcor
    replied
    Originally posted by oiaohm View Post
    rabcor, the idea that working in any way with 3D means you want vsync off is not true. There are cases where you absolutely want vsync on.
    I said a decent chance, not an absolute.

    Originally posted by oiaohm View Post
    For a lot of 3D workflows, the ideal place is halfway between vsync off and vsync on, even before freesync and the like.
    Thank you for reiterating my exact point.



  • oiaohm
    replied
    Originally posted by rabcor View Post
    Lol, why is this a feature in Mesa? It should be a feature in Wayland...
    Here is a good question: name the last GPU to have vsync-less output. It's been over two decades since such a desktop GPU existed. A lot of people will get this wrong and say current GPUs.

    Current GPU tearing is half-done frames; this is why with a current GPU you can have inverted tearing, where the top of the frame is newer than the bottom of the frame under the tear.

    With a vsync-less GPU, the bottom of the tear would always be newer content than the top, and you could also have multiple tears in a single frame from a single buffer.

    You can think of current GPU tearing as: if this buffer is not complete, scan it out anyhow and fill in the missing bits with the past frame.



    Wayland added an option in 2022 to allow applications to say a buffer may be presented with tearing. But since this is really a GPU thing, there is a big question whether this should have just been a DMABUF feature all along.
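    For what it's worth, here is a rough C sketch of how a client uses that 2022 protocol (tearing-control-v1). It assumes an existing wl_surface and a protocol header generated by wayland-scanner; the identifier names follow the usual scanner output, so treat this as a sketch rather than something verified against your headers.

        #include <string.h>
        #include <wayland-client.h>
        #include "tearing-control-v1-client-protocol.h"  /* generated by wayland-scanner */

        static struct wp_tearing_control_manager_v1 *tearing_mgr;

        /* Bind the tearing-control global when the compositor advertises it. */
        static void registry_global(void *data, struct wl_registry *registry,
                                    uint32_t name, const char *interface,
                                    uint32_t version)
        {
            if (strcmp(interface, wp_tearing_control_manager_v1_interface.name) == 0)
                tearing_mgr = wl_registry_bind(registry, name,
                        &wp_tearing_control_manager_v1_interface, 1);
        }

        /* Hint that buffers for this surface may be presented with tearing. */
        static void allow_tearing(struct wl_surface *surface)
        {
            struct wp_tearing_control_v1 *ctl =
                wp_tearing_control_manager_v1_get_tearing_control(tearing_mgr, surface);
            wp_tearing_control_v1_set_presentation_hint(ctl,
                    WP_TEARING_CONTROL_V1_PRESENTATION_HINT_ASYNC);
        }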



    Also, it's interesting when you read into PresentOptionAsyncMayTear that Mesa has just added support for it on bare-metal X11 as well. This should allow an X11 server with vsync forced on by the TearFree option to still let an application present with tearing when it has specifically requested it. This change is not only for Wayland.
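    From the application side, the normal way to ask for this in Vulkan is to request VK_PRESENT_MODE_IMMEDIATE_KHR; my understanding of the Mesa 23.3 change is that the Wayland WSI can now honour that mode with real tearing instead of quietly vsyncing. A minimal, standard selection sketch:

        #include <vulkan/vulkan.h>

        /* Prefer IMMEDIATE (may tear); fall back to FIFO, which is always available. */
        static VkPresentModeKHR pick_present_mode(VkPhysicalDevice phys, VkSurfaceKHR surf)
        {
            VkPresentModeKHR modes[16];
            uint32_t count = 16;
            vkGetPhysicalDeviceSurfacePresentModesKHR(phys, surf, &count, modes);
            for (uint32_t i = 0; i < count; i++)
                if (modes[i] == VK_PRESENT_MODE_IMMEDIATE_KHR)
                    return VK_PRESENT_MODE_IMMEDIATE_KHR;
            return VK_PRESENT_MODE_FIFO_KHR;
        }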

    A lot of CAD users don't particularly like vsync off. There is an advantage to vsync: not over-rendering. CAD is not like a first-person shooter; your GPU can be doing fluid dynamics, material stress calculations, and so on in the background, and time spent rendering what you are looking at is time taken from the GPU for those other things. Sometimes input lag is not your only problem. When making choices about how to alter a design in CAD, how current the stress modelling and the like is matters just as much.

    rabcor, the idea that working in any way with 3D means you want vsync off is not true. There are cases where you absolutely want vsync on.

    For a lot of 3D workflows, the ideal place is halfway between vsync off and vsync on, even before freesync and the like. To avoid over-rendering you need to know when the last vsync was and how long until the next one. The historic problem with software vsync-off implementations is that you lose track of when the vsync was, so you end up rendering a frame between vsyncs that completes so far before the next vsync that it never gets displayed, wasting all that GPU time on output that is never used. A hypothetical sketch of the middle ground is below.
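    The sketch, in C: start rendering as late as possible while still making the next vsync, instead of free-running. The refresh period, margin, and helper names are illustrative, not any real API:

        #include <stdint.h>
        #include <time.h>

        #define PERIOD_NS 16666667  /* 60Hz refresh, for illustration */
        #define MARGIN_NS  2000000  /* safety margin before the deadline */

        static int64_t now_ns(void)
        {
            struct timespec ts;
            clock_gettime(CLOCK_MONOTONIC, &ts);
            return (int64_t)ts.tv_sec * 1000000000 + ts.tv_nsec;
        }

        /* Sleep until just before the next vsync so the frame we render is the
         * one that actually gets scanned out, rather than wasted GPU work. */
        static void wait_for_render_slot(int64_t last_vsync_ns, int64_t est_render_ns)
        {
            int64_t next_vsync = last_vsync_ns + PERIOD_NS;
            int64_t start_at = next_vsync - est_render_ns - MARGIN_NS;
            while (now_ns() < start_at) {
                struct timespec ts = {0, 500000};  /* poll every 0.5ms */
                nanosleep(&ts, NULL);
            }
        }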

    Think about all the 2D menus when using CAD. rabcor, name a valid reason for any of those to render with tearing (there really isn't one, other than stupid software limitations). Most 3D software is not pure 3D; you forgot that. Reduced tearing reduces mental strain, letting the operator of a piece of software maintain focus longer. 2D parts of an interface where tearing provides no advantage should not be rendered with tearing.

    Users and application developers have had to pick between no tearing and tearing, when in reality we need a middle ground where some parts of applications are tear-free and other parts allow tearing where it makes logical sense.



  • rabcor
    replied
    Lol, why is this a feature in Mesa? It should be a feature in Wayland...

    And it shoulda been here back in 2008 too. Lol, yeah. This useless crap of a protocol from 2008 is just now maybe usable sometimes, finally, thanks to Mesa, not the Wayland devs themselves. Bravo...

    Originally posted by ayumu View Post
    The one promise of Wayland to never show imperfect frames... broken.

    They could have focused their efforts on making triple buffering work really well, as well as dynamic refresh such as freesync.

    But no, they instead settle for the usual mediocrity.
    Mediocrity is forcing vsync on systemwide for some arbitrary promise that they didn't even fulfill themselves (not like they invented vsync, not like vsync wasn't already available in X, not like superior alternatives to vsync haven't been created since, no thanks to these Wayland devs who couldn't innovate if their lives depended on it...)

    You show me perfect frames without a major hit to the framerate and a high risk of lots of input lag and I'll be a happy camper. Freesync, for instance, would be OK to have systemwide. Not forced on, mind you; being able to disable features is just good sense. Vsync though? Blegh. It's an OK solution if you're working in 2D, 2D as in videos or 2D games. If you're working in any way with 3D, be that a game or just CAD, there's a decent chance you will want vsync off.
    Last edited by rabcor; 17 September 2023, 11:51 PM.



  • user556
    replied
    Until every display fully supports VRR, that stays a nice-to-have only. Allowing tearing is important simply because of the time it takes to process a frame. Remove the processing time and you remove the need for tearing.

    Also, it is only an adjustable user preference after all.
    Last edited by user556; 16 September 2023, 07:57 PM.



  • oiaohm
    replied
    Originally posted by mdedetrich View Post
    oiaohm, I don't know why you are still arguing with me; it's basically a fact that higher FPS/refresh rates lead to better performance in games, and people do care about that in games.
    No. Again, people just read the headlines.
    [Embedded article: Nvidia research, graphs, and infographics on high FPS and the gaming advantages it brings; Linus Tech Tips has also studied the topic.]

    They need the smoothest animations, lowest latency, and the least amount of distracting effects to achieve the best results.
    Proper studies have all come to the same result: higher FPS/refresh rates alone don't in fact lead to the best performance for all players.

    Higher frame rates that introduce tearing cause problems. Tearing is a distracting effect, and how badly distracting effects harm your play differs from human to human; sometimes it's a lot, other times not much.

    mdedetrich, there is a reason for the "G-Sync/Adaptive-Sync/VRR" bits. For the best player performance, the demand for VRR matters more than allowing tearing.

    Remember, this feature being added to Mesa is not about FPS as such; it is about allowing tearing at the same FPS. In gaming, tearing is not universally beneficial.

    Yes, torn output here and there might give a player partial-frame information sooner, but that partial frame can also present conflicting information to the player.

    Remember, not every human handles partial frames of information equally well.

    https://www.techspot.com/article/2546-dlss-3/ DLSS 3 is another case where a higher FPS/refresh rate does not end up giving better latency. So it's not a basic fact that higher FPS/refresh rate means better performance; how you get the higher FPS/refresh rate is very important. There are ways to get a higher FPS/refresh rate that don't really give you better performance, and DLSS 3 is simple to measure and show to be one of them.
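    As a back-of-the-envelope example of why (illustrative numbers, not measurements from the article): native 60 FPS means a new real frame every ~16.7ms. Frame generation can display 120 FPS, but the interpolator has to hold each real frame until the next one arrives, so real frames reach the screen roughly one native frame time later. Displayed FPS doubles while input-to-photon latency stays the same or gets worse, which is exactly what LDAT-style measurement picks up.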

    Tearing is not as simple to measure, and since not every human responds the same way to partial frames of information, it ends up being a per-human thing.

    mdedetrich, basically, stop repeating the myth. The important value is total latency, including the human player's reaction time.

    More FPS/refresh rate is not always better. DLSS and systems like it are the simplest examples proving this false, because Nvidia's LDAT tool and tools like it can measure it directly. The effects of tearing on a human are a lot harder to show.

    How you get more FPS/refresh rate is just as important as having more of it.



  • mdedetrich
    replied
    oiaohm, I don't know why you are still arguing with me; it's basically a fact that higher FPS/refresh rates lead to better performance in games, and people do care about that in games.

    You seem to be bringing up that people are "subjective" to dismiss this, while also arguing for the sake of arguing. Stop it.



  • oiaohm
    replied
    Originally posted by mdedetrich View Post
    I'm sorry, this is wrong. They have done tests with professional CS:GO players, and the ones with higher FPS/refresh-rate monitors consistently got better results. There are diminishing returns, but you start hitting those at 240Hz+/200+ FPS.
    There have been multiple different studies with CS:GO players, and there are tools in CS:GO to measure reaction time. Yes, increasing monitor Hz to 240Hz does increase performance for the majority. But in different studies, some of the top 100 CS:GO players turned out to have their human response time made worse by tearing.

    [Embedded YouTube video of a CS:GO reaction-time test.]

    The range of CS:GO player reaction times is massive. This test shows a ~150ms difference between the bottom and the top, and the spread across the top 100 is wider than that.

    Originally posted by mdedetrich View Post
    You (or your brain) might subjectively not see a difference, but when it comes to reaction times, in reality there most certainly is a difference.
    That is the problem: you don't subjectively notice when your human reflex time has been undermined. Some top-100 CS:GO players have improved their overall reaction time by removing tearing.

    Let's say you set up your computer with all the recommendations for the lowest latency possible, which produces tearing; it is possible your reaction times are totally garbage. Then you configure everything for the lowest latency possible without tearing, and you have reaction times in the range of the top 100 CS:GO players. Remember, if you are affected by one of these human problems, you can be adding +500ms to your reflex time whenever tearing is displayed.

    Increasing monitor Hz has been shown to be good. Increasing FPS at the cost of lots of tearing has been shown to be good for the majority of players, but has also been shown to be bad for a minority of players.

    mdedetrich, testing whether someone is affected by tearing is a little harder. For a tear to happen, a render has to be half done at a vsync; you might get hundreds of frames without any tearing in a configuration that allows tearing.


    Smaller tearing effects help remove distracting effects, helping players maintain focus on winning the game.
    An interesting point when you get into the deeper studies: they noted that the higher the Hz, the smaller the tears become, due to less time between frames. But tearing having adverse effects on players is what this one puts under focus.

    Players improve when using G-Sync/Adaptive-Sync/VRR compared to having screen tearing.

    The reality, like it or not, is that studies have found it is not always better to have tearing, even as a professional gamer. Yes, as a pro gamer you may be investing in VRR monitors so you never have tearing at all.

    I wish I could find the other study that looked into why different CS:GO players were losing focus due to tearing. The results were all kinds of things, from the motor sections of the brain messing up to the vision sections messing up. When you are talking about big money competitions, some CS:GO teams run serious medical checks on this stuff. Losing a big prize-money game because a player saw tearing that adversely affected them is a stupid mistake, since tearing is known to do a number on some people, and you can test for the problem and invest in hardware to avoid it.



  • mdedetrich
    replied
    Originally posted by oiaohm View Post

    This is a presumption. The issue is that what affects the vision sections of your brain is not necessarily something your conscious mind notices. There is a reason why some pro gaming teams do flicker tests to work out whether a person's reaction time is being undermined by flicker.

    Flicker is one of those things that can change human reaction time for the worse. This is what leads to the problem. The same issues start happening with frame generation and other things like it as well.
    I'm sorry, this is wrong. They have done tests with professional CS:GO players, and the ones with higher FPS/refresh-rate monitors consistently got better results. There are diminishing returns, but you start hitting those at 240Hz+/200+ FPS.

    You (or your brain) might subjectively not see a difference, but when it comes to reaction times, in reality there most certainly is a difference.
    Last edited by mdedetrich; 15 September 2023, 08:51 AM.



  • oiaohm
    replied
    Originally posted by mdedetrich View Post
    If you can push/handle a lot of frames, then having tearing is fine, because you will almost never notice it due to how many frames you are exposed to, and with the tearing comes lower input lag (which, for gaming based on reaction times, is always good).
    This is a presumption. The issue is that what affects the vision sections of your brain is not necessarily something your conscious mind notices. There is a reason why some pro gaming teams do flicker tests to work out whether a person's reaction time is being undermined by flicker.

    Flicker is one of those things that can change human reaction time for the worse. This is what leads to the problem. The same issues start happening with frame generation and other things like it as well.



  • oiaohm
    replied
    Originally posted by Weasel View Post
    I was talking about latency. You will be impacted by it whether you can react to 20ms or not, because it adds onto your already high reaction time.
    This is still wrong.
    There are two areas of the brain that can add latency in a big way when it comes to gameplay: human vision processing and human motor control. It's simple, really: seeing something 20ms faster as a human does not help if the flicker causes either of those sections to add back something like 40ms. It's nothing strange for flicker-triggered issues in the human brain to add 500ms of human reaction latency.

    You said "already high reaction time". What if a person has a low reaction time without flicker but a high reaction time due to flicker? Real humans exist with this problem.

    Human reaction time is not a constant; your own brain and training alter what it is. Yes, flicker is one of the things that can make human reaction massively worse. The idea that 20ms of extra screen latency is always worse rests on the false assumptions that human reaction time is constant and that whatever you do to remove that 20ms of screen latency does not itself undermine human reaction time.
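    Put as back-of-the-envelope arithmetic, using the figures from this thread: total response = display latency + human reaction time. With tearing, an affected player gets (x - 20ms) of display latency but (r + 40ms up to r + 500ms) of reaction time; without tearing they get x and r. For anyone whose brain adds even 40ms under flicker, the 20ms of display latency saved by tearing is a net loss.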


    Originally posted by Weasel View Post
    Now, if you get sick from flickering, that's a completely separate topic. It still decreases latency, lol. You just get sick while watching it, so it's probably not worth the tradeoff.
    I am not talking about noticeably feeling sick, and you are wrong: flicker is not a separate topic. I am talking about where flicker in fact undermines human reaction time, as in increasing the human's own latency. A person can be feeling perfectly fine and still have their reaction time intermittently messed up by 500ms. It is possible, and it does happen: for particular humans, the 20ms saved by allowing tearing is totally undermined by how much the flicker affects their response times.





