KWin-LowLatency: An Effort To Yield Less Stutter & Lower Latency With The KDE Desktop


  • RealNC
    replied
    Originally posted by debianxfce View Post
    So you do not need the 120Hz refresh rate in Linux. 60Hz is fine for everything for most of the people. Freesync and LFC is for preventing tearing and no need to buy expensive high refresh rate monitors.
120Hz is not only useful for games. The desktop itself sees a huge improvement. Once you've used a 120Hz desktop, 60Hz compares extremely poorly. Everything is blurry when it moves and feels sluggish and stuttery, be it scrolling in web pages and documents, desktop effect animations, or moving a window on the desktop. There's just no comparison. If you haven't used a 120Hz desktop yourself, you can't really understand.

    Also, the attitude of "60Hz is fine for most people" is, frankly, annoying. If 60Hz is fine for you, then I'm afraid your standards are too low. I've been using high refresh rate monitors since the 90s. 60Hz never was, and never will be "fine." It's only fine if you never actually had anything better.

    Leave a comment:


  • RealNC
    replied
    Originally posted by debianxfce View Post
    Xfce works fine without the compositor as you wrote. Compositors slow down performance, so disable them in gaming computers that you have too.
    I'm not playing games in Linux anymore (I dual-boot into my Wintendo 10 install for that, which is the only thing it's good for). I use my Linux install for work and multimedia, and I don't want it to feel like it's from 1995. I'm afraid that, for me, disabling compositing in 2019 is completely and utterly out of the question.

    In any event, this KWin fork actually solved ALL my issues :-) Perfectly smooth, no stutter or frame skips, no lag. So it's not like I have to switch to something different anymore.
    Last edited by RealNC; 22 May 2019, 06:33 AM.

    Leave a comment:


  • RealNC
    replied
    Originally posted by debianxfce View Post
    Vote with your package manager and install the Xfce desktop. KDE is never ready and the KDE desktop is schizophrenic compared to simple,freely configurable, stable, light and fast Xfce.
    I tried all desktop environments. All had problems. XFCE's problem is its inability to make use of modern, high refresh rate displays:



    It seems it's worse than vanilla KWin. It also doesn't sync its compositing loop with the monitor's vsync, but instead applies an unsynced frame limit, which results in frame drops and stutter. And in this case, that unsynced frame cap seems fixed to 60FPS.

    This is NOT how you do compositing. macOS and Windows do it right. This KWin fork seems to finally get it right.
    Last edited by RealNC; 21 May 2019, 05:16 AM.

    Leave a comment:


  • RealNC
    replied
    I just installed it. It's great. Using the NVIDIA binary driver. No more hiccups or frame drops when playing video. Less latency too. This is how KWin should always have been. It's a serious improvement over vanilla KWin.

    Leave a comment:


  • oiaohm
    replied
    Originally posted by tildearrow View Post
    Really? I thought you couldn't have a sync object for the VBlank interval...
    You can, but only inside the GPU.

    Originally posted by tildearrow View Post
    But actually, what I mean is a method to wait for VBlank without swapping buffers.
    No, you cannot do that from the CPU. This is not a Linux/Wayland/X11 limitation; it's a limitation designed into the GPU.

    The presentation-time protocol is designed the way it is because of this GPU limitation.

    Presentation time exploits the fact that you can get the time at which a buffer was actually displayed.


    Looking through the Weston implementation of presentation time will be of use to you.

    This is where you start hating dynamic refresh rates. With a fixed vsync rate, you can track when buffers are presented and align your timer events off that feedback, effectively choosing to wait for a vsync to pass. Dynamic refresh is pure pain.

    This limitation of not being able to wait for a vsync on the CPU side is true under X.Org without Wayland as well. This is just how OpenGL/Vulkan and GPUs work.

    Leave a comment:


  • oiaohm
    replied
    Originally posted by debianxfce View Post
    You should prove that when Randr is none, it activates tearfree in the xf86-video-amdgpu driver. Because you are not a developer, you can not post link to a code line line that proves your point.
    Originally posted by debianxfce View Post
    I see tearing with this: https://www.testufo.com/stutter
    Invalid test. On Linux, this test disables tearing controls. You can force application-level tearing by turning different features off.

    Originally posted by debianxfce View Post
    So tearfree is not enabled in the xf86-video-amdgpu driver when it is auto. The video is tear free when enabling the xfce compositor.
    The test you pointed to will look just as torn with the Xfce compositor on or off. I am an Xfce desktop user myself.

    Really, I said I am a support person. That means I don't need to put up the source. You are the one claiming to be a software developer, so please show me the exact lines of code. A support person has to know which tests are totally bogus. You know less than a support person.

    I may not be a developer on Wine, but that does not mean I am not a software developer on other things.

    Leave a comment:


  • tildearrow
    replied
    Originally posted by oiaohm View Post

    EGL_CHROMIUM_sync_control is not required. EGL_KHR_sync + egl_swapinterval settings should do the job. Basically standard EGL. There are EGL_NV_SYNC and a few other vendor-specific ones.
    Really? I thought you couldn't have a sync object for the VBlank interval...

    But actually, what I mean is a method to wait for VBlank without swapping buffers.

    Leave a comment:


  • oiaohm
    replied
    Originally posted by debianxfce View Post
    When Randr is none, it is not enabled. See Settings/Display of the Xfce desktop.
    Not true.
    https://github.com/xfce-mirror/xfwm4...307bdaa4dfa7f2
    RandR rotation 0 is handled by GDK, part of GTK. You will see commit after commit addressing RandR issues through GDK. Great fun: a default setting hidden inside a library, in this case GDK.

    Settings/Display of the Xfce desktop is a big fat lie in places. This is what happens when you didn't build the toolkit you are using and it does not in fact tell you everything. Things are done in the background that you are not aware of.

    Really, you are so far out of your depth and only digging yourself deeper.

    Leave a comment:


  • skeevy420
    replied

    Leave a comment:


  • oiaohm
    replied
    Originally posted by debianxfce View Post
    There is tearing when moving windows fast and with the vertical bar tearing web test page when the compositor is disabled and no Tearfree option in xorg.conf. So Tearfree is off in the xf86-video-amdgpu driver.
    Completely wrong for the current version of the driver.

    man amdgpu
    Option "TearFree" "boolean"
    If this option isn't set, the default value of the property is auto, which means that TearFree is on for rotated outputs, outputs with RandR transforms applied and for RandR 1.4 slave outputs, otherwise off.

    This is from the man page with xf86-video-amdgpu 18.1.99.

    Note the bolded part of the amdgpu man page on Debian. The default when xorg.conf does not contain a TearFree option is auto. Do note "rotated": if you send a RandR rotation instruction, the result is on.

    Option "TearFree" "off" need to be in xorg.conf for TearFree to be off otherwise more often than not its in fact on. Old versions of xf86-video-amdgpu the default was off.

    Originally posted by debianxfce View Post
    Vsync is a feature of the GPU card and Tearfree is a feature of the f86-video-amdgpu driver. See: https://en.wikipedia.org/wiki/Screen_tearing#V-sync
    Yep, xf86-video-amdgpu with TearFree ends up using vsync for buffer switching behind your back. This results in a half-on state, where the buffers inside applications are not processed with vsync in mind but the output rendering is.

    Originally posted by debianxfce View Post
    IT support persons are low educated usually. I planned to stop this nonsense but you trolled.
    IT support personnel know to read man pages; you apparently don't. You are less educated than us support personnel, as we learn that things change.

    xf86-video-amdgpu TearFree has changed from a default of off to a default of auto, which mostly equals on. Your video very much looks like Xfce on xf86-video-amdgpu after the change in default.




    Yes, people say that testing/buster with AMD cards renders better than the prior stable. The reason has a lot to do with the fact that TearFree using vsync is now, most of the time, on by default. This is more aligned with how Windows (NT through 10) has been doing it the whole time.
    Last edited by oiaohm; 14 May 2019, 01:57 AM. Reason: Added the Debian man page links and a note about the difference.

    Leave a comment:
