Hmm, allowing tearing. I would call this a regression feature. Weird.
Wayland Protocols 1.30 Introduces New Protocol To Allow Screen Tearing
-
Originally posted by mantide View Post
I couldn't agree less. It's always easy to blame devs for the two things that you know about, while the devs have 200 things to consider. If you use old hardware, stick to old LTS software. You don't have to blame the people who are trying to create and push bleeding-edge technologies. New software for new hardware; old hardware sticks with old software.
Plus, nobody ever uses vsync on a 240 Hz monitor. That's complete idiocy. There is a thing called adaptive-sync/FreeSync; vsync itself is outdated technology.
Again: new with new, old with old. You can't expect your 5-year-old pickup truck to run on electricity.
- Likes 1
-
Originally posted by cj.wijtmans View Post
Adaptive sync is still a form of vsync; nothing really changes in a video game's rendering pipeline. The game issues a command that ends the frame draw, and the rest is up to the driver.
So from the perspective of a game developer, you shouldn't even add a "Vsync" option. Developers only add it because not many people follow Blur Busters or other detailed sites explaining how this stuff works.
Also, a technicality: at least on Nvidia (not sure about AMD/Intel), G-Sync/FreeSync only adjusts the monitor's refresh rate to match the frame rate. That is the only thing it does. It does not remove tearing in all cases: for example, if the frame rate goes above the maximum refresh rate or below the minimum, you can face tearing again. If G-Sync + Vsync is enabled, Vsync turns on the moment you go above the maximum refresh rate (and in that case we still get an additional one frame of lag).
So how do you remove such issues?
The optimal setting is G-Sync/FreeSync + Vsync + a frame rate limiter set 3 fps below the maximum refresh rate. Why?
Adaptive sync is obviously great.
Vsync engages only when it is needed (e.g. at 120 Hz, frames render in 8.33 ms on average, but if one frame renders in 8.43 ms and the next in 8.23 ms, the second frame would tear into the first; in that case Vsync engages and delays it).
The frame rate limiter reduces how often Vsync has to trigger (at -3 fps, it is really rare for Vsync to engage).
And this is the part every gamer dreams about: a good frame rate limiter option, not "vsync" enforced everywhere by default (you can always enable it in the driver if needed). A well-implemented frame rate limiter can introduce close to no latency while keeping the experience consistent.
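To make the limiter idea concrete, here is a minimal sketch (Python for brevity; the function name, the 120 Hz figure, and the sleep/spin split are illustrative assumptions, not any real engine's implementation) of capping the frame rate 3 fps below the refresh rate:

```python
import time

REFRESH_HZ = 120
TARGET_FPS = REFRESH_HZ - 3       # stay safely inside the VRR range
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~8.55 ms per frame at 117 fps

def limit_frame(frame_start: float) -> float:
    """Wait until the frame budget has elapsed, then return the
    deadline, which becomes the start timestamp of the next frame."""
    next_deadline = frame_start + FRAME_BUDGET
    # Coarse sleep first, then spin for the last ~2 ms so pacing
    # stays consistent despite sleep() jitter (a common limiter trick).
    remaining = next_deadline - time.perf_counter()
    if remaining > 0.002:
        time.sleep(remaining - 0.002)
    while time.perf_counter() < next_deadline:
        pass
    return next_deadline

# Usage: pace a render loop at ~117 fps.
start = time.perf_counter()
for _ in range(5):
    # render_frame() would go here
    start = limit_frame(start)
```

Because each deadline is derived from the previous one rather than from "now", occasional slow frames don't accumulate drift, which is what keeps Vsync from having to engage.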
Windows actually gets this exactly right. In classic windowed mode, we have proper compositing with vsync. But the moment any application goes fullscreen or borderless fullscreen, compositing and vsync turn off. Windows also supports variable refresh rate even in windowed mode and reduces latency in that situation (the Windows compositor is actually aware of adaptive sync).
One major issue for proper adaptive sync in windowed mode is (again) implicit synchronization of buffers at present time. This essentially means that on Wayland, if one component lags (like Blender with a giant scene), your entire desktop lags graphically. This could really ruin the experience in some cases.
Last edited by piotrj3; 21 November 2022, 08:49 PM.
- Likes 3
-
Originally posted by mppix View Post
So KDE degraded picture quality of wayland. Thanks.
Have fun in eternal MR review process hell for your favorite project instead.
- Likes 7
-
Glad to see Wayland making its snail's-pace progress, but I'm still not sold on this being a good thing for gamers to use: before this it was at most one frame of latency, usually less, so IMO the majority of people shouldn't really be using it. But more freedom of choice is always appreciated.
- Likes 1
-
Originally posted by TemplarGR View Post
Yeah, I don't really see the fuss about fractional scaling. Seriously. It never, EVER, worked well even on Windows. It was a hack.
We are talking about shaders in mpv simply no longer working in real time; the desktop can even get sluggish depending on what's happening. This is absolutely absurd. And this isn't even considering the downscaling it then needs to do to get back to a 4K display, which can absolutely crush quality in some cases. You know what Xorg does? It doesn't force me to render everything at 5902x3320.
Even at a 1.6 scaling ratio, my screen is still rendering at 4796x2696. THIS is why fractional scaling matters: many systems are literally unusable without it. For the record, my 580 is perfectly fine and usable at 4K resolution for the majority of things I use it for.
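For illustration, a hedged sketch of where those oversized buffers come from: without a fractional scale protocol, a compositor can only supersample, i.e. have the client render at the next integer scale and then downscale the result. The function names and the 3840x2160 / 1.5x example numbers below are assumptions for illustration, not the commenter's exact setup:

```python
import math

def legacy_buffer(phys_w: int, phys_h: int, scale: float) -> tuple:
    """Buffer size when the client renders at the next integer scale
    and the compositor downscales (pre-fractional-scale behavior)."""
    # Logical size the application lays itself out at.
    log_w, log_h = round(phys_w / scale), round(phys_h / scale)
    s = math.ceil(scale)            # next integer scale, e.g. 1.5 -> 2
    return (log_w * s, log_h * s)   # oversized buffer, downscaled later

def fractional_buffer(phys_w: int, phys_h: int, scale: float) -> tuple:
    """With a fractional scale protocol, the client can render
    directly at the output's native pixel size."""
    return (phys_w, phys_h)

print(legacy_buffer(3840, 2160, 1.5))      # (5120, 2880)
print(fractional_buffer(3840, 2160, 1.5))  # (3840, 2160)
```

In this illustrative case the supersampled buffer has roughly 78% more pixels than the display, which is the extra GPU work (plus a downscale pass) being complained about.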
- Likes 7
-
Half of the people in this thread couldn't care less about other people's need for minimum latency (tens of millions of gamers). The Linux community in its essence.
Oh, and you imagine Linux being a good desktop OS when you cannot provide this out of the box?
Damn, this is disgusting.
- Likes 4