KWin-LowLatency: An Effort To Yield Less Stutter & Lower Latency With The KDE Desktop


  • aksdb
    replied
    Originally posted by Awesomeness View Post

    Wrong. The majority uses Mesa drivers and KWin on Wayland works just fine with them. There are a few regressions but far from "completely broken".
    Interesting. I tried it with Mesa drivers and got black borders around windows, some tray icons not showing properly, and some applications outright crashing.
    So for me it is still a no-go, while everything works fine on X11.



  • oiaohm
    replied
    Originally posted by czz0 View Post
    It's true that Vsync is force-enabled on Windows 10 with no way to disable it, but fullscreen games are supposed to fully bypass the compositor, just as GNOME is supposed to do.
    Try again. Vsync has in fact been forced on in Windows ever since Windows 3.5 with its in-kernel graphics solution.

    If you think you have turned Vsync off under Windows, you have in fact failed. Windows 10 merely removed the bogus, do-nothing Vsync option.

    Originally posted by czz0 View Post
    Professional CSGO players use 240Hz or 144Hz monitors and do not care about tearing, which is almost unnoticeable at those refresh rates anyway.
    Tearing is a lot worse than you think. Even at 240Hz, real tearing, where a not-yet-ready buffer is scanned out multiple times, still looks absolutely horrible.

    Originally posted by czz0 View Post
    You want Vsync completely disabled all the way through, on Windows or Linux.
    You are only saying this because you are not aware that, no matter what you do on Windows, you cannot turn off the Vsync that the NT kernel keeps enabled.

    The last time you could run Windows with Vsync fully off was Windows Me, the last of the 9x series.

    Be very careful that what you are asking for is really what you want. Minimal Vsync is smart: if a buffer is half done and not ready, skip it and display the last fully complete buffer when the vsync comes around. Applications don't really need to know this is going on. You never want Vsync completely off.
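The skip-the-stale-buffer behaviour described in that post is essentially what Vulkan calls mailbox presentation. As a minimal sketch, with the mode values defined locally for illustration rather than taken from a real Vulkan header, a renderer would pick its present mode like this:

```c
#include <stddef.h>

/* Hypothetical present modes, modelled on Vulkan's VkPresentModeKHR.
 * IMMEDIATE: no vsync, tearing possible.
 * MAILBOX:   vsync, but a newer finished frame replaces a queued stale one.
 * FIFO:      classic vsync, every finished frame is queued in order. */
enum present_mode { PRESENT_IMMEDIATE, PRESENT_MAILBOX, PRESENT_FIFO };

/* Prefer mailbox (low latency, no tearing, stale frames skipped);
 * fall back to FIFO, which in Vulkan is guaranteed to be available. */
enum present_mode choose_present_mode(const enum present_mode *available, size_t count)
{
    for (size_t i = 0; i < count; i++)
        if (available[i] == PRESENT_MAILBOX)
            return PRESENT_MAILBOX;
    return PRESENT_FIFO;
}
```

With mailbox-style presentation the application renders as fast as it likes, yet only complete frames ever reach the screen, which is the "half off" behaviour being argued for.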



  • czz0
    replied
    Originally posted by oiaohm View Post

    This is a lie, because there are two different forms of Vsync disable. Even when you turn off Vsync on Windows or OS X, buffers are still presented to the screen on vsync; the OpenGL library lies to the application and tells it to start rendering the next frame instead of waiting for the vsync window.

    When you disable Vsync on the X11 server you do something far worse: you disable vsync in the present code, so you get tearing from hell.

    You don't want an option to turn off Vsync in the compositor. You want an option to lie to the application so that it does not wait for a vsync before it starts rendering, but that is really an OpenGL quirk flag.

    Basically, you don't want Vsync blanket-off, because that means half-rendered buffers being displayed to users, which largely stops you from seeing players lining up to attack you in an FPS.

    Pro players in fact want Vsync half off.
    You want Vsync completely disabled all the way through, on Windows or Linux.

    You do not want Vsync disabled in the game but still running through a desktop compositor with Vsync.

    It's true that Vsync is force-enabled on Windows 10 with no way to disable it, but fullscreen games are supposed to fully bypass the compositor, just as GNOME is supposed to do.

    Tearing is infinitely better than Vsync input lag for competitive FPS games.

    Professional CSGO players use 240Hz or 144Hz monitors and do not care about tearing, which is almost unnoticeable at those refresh rates anyway.



  • oiaohm
    replied
    Originally posted by czz0 View Post
    Vsync off for fast-paced games is not just my preference; Vsync is horrible for any FPS or fast-paced game. Literally 100% of professional competitive FPS players, and the vast majority of regular competitive FPS players, play with Vsync off.
    This is a lie, because there are two different forms of Vsync disable. Even when you turn off Vsync on Windows or OS X, buffers are still presented to the screen on vsync; the OpenGL library lies to the application and tells it to start rendering the next frame instead of waiting for the vsync window.

    When you disable Vsync on the X11 server you do something far worse: you disable vsync in the present code, so you get tearing from hell.

    You don't want an option to turn off Vsync in the compositor. You want an option to lie to the application so that it does not wait for a vsync before it starts rendering, but that is really an OpenGL quirk flag.

    Basically, you don't want Vsync blanket-off, because that means half-rendered buffers being displayed to users, which largely stops you from seeing players lining up to attack you in an FPS.

    Pro players in fact want Vsync half off.
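The two separate "vsync off" switches in that argument can be sketched as a toy model in C. This is illustrative only, not a real API: it just encodes the claim that the swap-interval lie merely stops the game from blocking, while tearing appears only when vsync is disabled at the present stage itself:

```c
#include <stdbool.h>

/* Toy model of the two distinct "vsync off" switches discussed above.
 * Neither the struct nor the functions correspond to a real API. */
struct vsync_config {
    bool swap_interval_zero; /* OpenGL lies: game never waits for vblank  */
    bool present_vsync_off;  /* present/scanout stage no longer waits     */
};

/* The game only stalls on vblank while it still honours the swap interval. */
bool game_blocks_on_vblank(struct vsync_config c)
{
    return !c.swap_interval_zero;
}

/* Tearing only appears once the present stage itself stops waiting. */
bool tearing_visible(struct vsync_config c)
{
    return c.present_vsync_off;
}
```

Under this model, "vsync half off" is swap_interval_zero = true with present_vsync_off = false: the game runs unthrottled while the screen stays tear-free.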



  • czz0
    replied
    Originally posted by schmidtbag View Post
    Speak for yourself. Even though my gaming PC uses X11, I deliberately keep vsync on because I prefer a tear-free experience over input latency. Sure, a hard cap of 60Hz is dumb (assuming your display goes higher) and I agree there should be a [user-friendly] way to disable vsync, but you are heavily exaggerating your preferences as though they're what everyone else prefers.
    I was explicitly talking about gaming, so stop with the strawman argument about Vsync for desktop usage, which I said nothing about.

    Vsync off for fast-paced games is not just my preference; Vsync is horrible for any FPS or fast-paced game. Literally 100% of professional competitive FPS players, and the vast majority of regular competitive FPS players, play with Vsync off.

    No one with a brain will tell you Vsync is okay to have enabled for a game like CSGO or Overwatch.
    Last edited by czz0; 09 May 2019, 05:44 AM.



  • royce
    replied
    More importantly, you won't drive mixed-DPI multi-display setups acceptably out of X11. 4K is still relatively rare, but it is gaining ground the same way 1080p did.



  • Britoid
    replied
    Originally posted by debianxfce View Post

    That is your personal opinion, not a technical fact. The fact is that Wayland is not ready or popular and never will be. Wayland's time has passed in its 10 years; you can use the Xfce desktop on a Pentium III computer.
    You're not going to be gaming on a Pentium III computer.



  • oiaohm
    replied
    Originally posted by debianxfce View Post
    That is your personal opinion, not a technical fact. The fact is that Wayland is not ready or popular and never will be. Wayland's time has passed in its 10 years; you can use the Xfce desktop on a Pentium III computer.
    Wayland is already used in embedded solutions with less CPU and RAM than you get in a Pentium III. Just because it has not reached the mainstream desktop much does not mean we lack good performance figures from its embedded usage.

    Sway is currently functional on weaker hardware than Xfce can run on.



  • oiaohm
    replied
    Originally posted by debianxfce View Post
    Small things do matter and can be a game changer. See the history of the universe.
    The problem is that the performance gain from moving across to Wayland and getting away from X11's failings improves things by more than that small difference.



  • oiaohm
    replied
    Originally posted by debianxfce View Post
    No sense running Xwayland when the X window system rules. Wayland and GNOME 3 will fade away in the future. More and more people realize how bad it is; see Linux news:
    https://www.forbes.com/sites/jasonev...-surprise-you/
    Really, look closer at those benchmarks. Xfce is not that much faster than GNOME 3 on X11, and it mostly isn't saving you power either.

