Top Latency-Reducing Optimizations for Linux?


  • #11
    So much hearsay surrounds Xorg vs Wayland performance... We're on Phoronix - does no one have any benchmarks? Latency is king in the context of my thread, but performance is still a top criterion.



    • #12
      Originally posted by make_adobe_on_Linux! View Post
      So much hearsay surrounds Xorg vs Wayland performance... We're on Phoronix - does no one have any benchmarks? Latency is king in the context of my thread, but performance is still a top criterion.
      Just try it out for yourself. Benchmarking input latency is really hard to capture; it's a very subjective "feeling".

      Just fire up a DE in an Xorg session and a Wayland session, then play around with games and apps and see which session "feels" worse for you. It'll be the Wayland session for damn sure.



      • #13
        Originally posted by duby229 View Post
        Just try it out for yourself. Benchmarking input latency is really hard to capture; it's a very subjective "feeling".

        Just fire up a DE in an Xorg session and a Wayland session, then play around with games and apps and see which session "feels" worse for you. It'll be the Wayland session for damn sure.
        Xorg sessions have higher output lag than Wayland sessions. Test case: Grab a window with the mouse and drag it around smoothly.


        Feel is a very bad way of measuring input latency. The bug here shows a person who thought X11 had worse input lag than Wayland. They were right.

        Something to remember: both Wayland compositors and X11 servers can use a hardware-accelerated mouse pointer. An interesting point is that some users complain Wayland is slower because a lot of Wayland compositors default to a software pointer instead of a hardware-accelerated one, so when you move the mouse the pointer does not respond as fast, but it does stay correctly placed on the window edge. Others, as in this issue, have X11 with the hardware cursor enabled, see the mouse pointer pull away from the window edge while dragging a window, and also complain that the interface has input lag. You can flip these settings over, so the Wayland compositor uses a hardware mouse pointer and X11 uses a software one, and watch both groups of people complain just the same, except now the ones who were complaining about X11 will be complaining about Wayland and the reverse as well.

        There are a few rock-and-hard-place problems when it comes to humans and the subjective feeling of latency. "Do you like a hardware mouse pointer or a software mouse pointer?" is a very good question.

        Properly benchmarking input latency is not in fact hard; you just need a high-speed camera and a trigger device for the mouse and keyboard, so you can measure the time from the mouse/keyboard event to the change appearing on screen.

        Of course, if you are only comparing screen output latency, you can get away with a basic camera:
        We test the input lag of the Haier 55E5500U 4K 55" UHDTV. The total input lag measured for this TV is 43ms.
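
        For anyone who wants to try the camera method, here is a minimal sketch of the frame-counting step in Python, assuming OpenCV is available and a clip that shows both a trigger LED and part of the screen. The file name, ROI coordinates and thresholds are made-up placeholders to adjust for the actual footage; this is an illustration of the technique, not a finished tool.
        Code:
        import cv2
        import numpy as np

        VIDEO = "capture_240fps.mp4"                     # hypothetical clip from the high-speed camera
        FPS = 240.0                                      # capture rate of that camera
        LED_ROI = (slice(10, 40), slice(10, 40))         # y/x slices covering the trigger LED
        SCREEN_ROI = (slice(100, 400), slice(200, 800))  # y/x slices covering the screen area

        cap = cv2.VideoCapture(VIDEO)
        led_on_frame = None      # frame where the LED (the input event) lights up
        response_frame = None    # first frame where the screen visibly reacts
        prev_screen = None
        frame_idx = 0

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

            # Trigger: first frame where the LED region gets bright.
            if led_on_frame is None and gray[LED_ROI].mean() > 200:
                led_on_frame = frame_idx

            # Response: first frame after the trigger where the screen region changes noticeably.
            screen = gray[SCREEN_ROI].astype(np.int16)
            if (led_on_frame is not None and prev_screen is not None
                    and response_frame is None
                    and np.abs(screen - prev_screen).mean() > 10):
                response_frame = frame_idx
            prev_screen = screen
            frame_idx += 1

        cap.release()
        if led_on_frame is not None and response_frame is not None:
            frames = response_frame - led_on_frame
            print(f"{frames} frames -> {frames * 1000.0 / FPS:.1f} ms (+/- {1000.0 / FPS:.1f} ms)")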


        Your personal, subjective human feeling of latency is a different beast; you will have personal preferences. duby229, I bet you cannot answer whether you are a hardware-mouse-pointer person or a software-mouse-pointer person. Stupid as it sounds, in surveys of end users about 60% of the population likes a software mouse pointer and about 40% likes a hardware mouse pointer; allowing for the surveys' limited sample sizes and margin of error, that could really be 50/50. The horrible part is that basically no one in the surveys did not care one way or the other.

        Movement of the mouse pointer by hardware or by software is only one factor. Another is whether the mouse cursor blinks when it first moves. With the mouse pointer alone there are 15 documented things that will throw a person's subjective idea of latency well off. That is 2 to the power of 15, or 32768, different combinations of settings to choose from just for the mouse pointer; if the interface you are using is not set to the right combination, you can be convinced it is slow when camera benchmarks will tell you it is fast.

        Anyone who says they can tell what has truly low input latency by subjective means is talking out of their ass, because there are too many little things that will throw human perception off. The only real way to check input latency is with a camera. After that there is a long list of personal settings to get the feeling of low input latency; yes, for some people some of those settings will make camera-measured input latency worse, yet by making input latency actually worse they make it feel faster to them. That is the human brain's subjective nature for you.



        • #14
          Originally posted by oiaohm View Post

          Xorg sessions have higher output lag than Wayland sessions. Test case: Grab a window with the mouse and drag it around smoothly.


          Feel is a very bad way of measuring input latency. The bug here shows a person who thought X11 had worse input lag than Wayland. They were right.

          Something to remember: both Wayland compositors and X11 servers can use a hardware-accelerated mouse pointer. An interesting point is that some users complain Wayland is slower because a lot of Wayland compositors default to a software pointer instead of a hardware-accelerated one, so when you move the mouse the pointer does not respond as fast, but it does stay correctly placed on the window edge. Others, as in this issue, have X11 with the hardware cursor enabled, see the mouse pointer pull away from the window edge while dragging a window, and also complain that the interface has input lag. You can flip these settings over, so the Wayland compositor uses a hardware mouse pointer and X11 uses a software one, and watch both groups of people complain just the same, except now the ones who were complaining about X11 will be complaining about Wayland and the reverse as well.

          There are a few rock-and-hard-place problems when it comes to humans and the subjective feeling of latency. "Do you like a hardware mouse pointer or a software mouse pointer?" is a very good question.

          Properly benchmarking input latency is not in fact hard; you just need a high-speed camera and a trigger device for the mouse and keyboard, so you can measure the time from the mouse/keyboard event to the change appearing on screen.

          Of course, if you are only comparing screen output latency, you can get away with a basic camera:
          We test the input lag of the Haier 55E5500U 4K 55" UHDTV. The total input lag measured for this TV is 43ms.


          Your personal, subjective human feeling of latency is a different beast; you will have personal preferences. duby229, I bet you cannot answer whether you are a hardware-mouse-pointer person or a software-mouse-pointer person. Stupid as it sounds, in surveys of end users about 60% of the population likes a software mouse pointer and about 40% likes a hardware mouse pointer; allowing for the surveys' limited sample sizes and margin of error, that could really be 50/50. The horrible part is that basically no one in the surveys did not care one way or the other.

          Movement of the mouse pointer by hardware or by software is only one factor. Another is whether the mouse cursor blinks when it first moves. With the mouse pointer alone there are 15 documented things that will throw a person's subjective idea of latency well off. That is 2 to the power of 15, or 32768, different combinations of settings to choose from just for the mouse pointer; if the interface you are using is not set to the right combination, you can be convinced it is slow when camera benchmarks will tell you it is fast.

          Anyone who says they can tell what has truly low input latency by subjective means is talking out of their ass, because there are too many little things that will throw human perception off. The only real way to check input latency is with a camera. After that there is a long list of personal settings to get the feeling of low input latency; yes, for some people some of those settings will make camera-measured input latency worse, yet by making input latency actually worse they make it feel faster to them. That is the human brain's subjective nature for you.
          Another wall of bullshit from you.

          The bottom-line fact is that using Wayland feels terrible. If -anybody- doesn't believe that, then all they have to do is fire up a DE in a Wayland session for themselves. In my own testing Wayland is a horrible user experience, definitely -THE- worst user experience of any modern OS by far.

          You can waste your time benchmarking with your camera all you want, but while you're doing that, input lag on Wayland is still so terrible it's horribly unusable.
          Last edited by duby229; 25 May 2020, 10:57 PM.



          • #15
            Originally posted by duby229 View Post

            Just try it out for yourself. Benchmarking input latency is really hard to capture; it's a very subjective "feeling".

            Just fire up a DE in an Xorg session and a Wayland session, then play around with games and apps and see which session "feels" worse for you. It'll be the Wayland session for damn sure.
            It's possible to benchmark - but depending on the time resolution you need, the cost can matter a lot. With a reasonably cheap digital camera capable of 1000 frames/second it's possible to get quite good results. I have had reason to do similar things for other purposes, filming a LED that turns on when the input ("mouse key") is activated and then counting frames until the system response happened.
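
            The time-resolution point is just 1000/fps; a quick illustration of why the camera's frame rate drives the cost (plain arithmetic, nothing here is measured data):
            Code:
            # The camera's frame rate sets the timing resolution of the measurement,
            # which is why a faster (more expensive) camera gives tighter numbers.
            for fps in (60, 240, 1000):
                print(f"{fps:5d} fps capture -> +/- {1000.0 / fps:.2f} ms per counted frame")
            # A gap of 15 frames at 1000 fps is therefore 15 ms, known to about a millisecond.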

            One thing with keyboards is that many keyboards with a 1000 Hz report rate still don't read out the keys with that time resolution - or read out the keys but add key debounce - so it may still take quite a number of USB reports until the actual key press is sent. But the placebo effect remains - the keyboard owner is happy about the responsive keyboard with the fast 1 kHz report rate (and 1 kHz is a magic value for USB because that's the frame synchronization rate in the protocol specification).
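
            You can also see what the device actually delivers, as opposed to what the box says, by timestamping the reports on the evdev side. A minimal sketch, assuming 64-bit Linux and Python; the device node below is a made-up example (pick your own under /dev/input, and you need read permission on it):
            Code:
            import struct

            DEVICE = "/dev/input/event4"        # hypothetical: substitute your mouse/keyboard node
            EVENT_FMT = "llHHi"                 # struct input_event on 64-bit: timeval, type, code, value
            EVENT_SIZE = struct.calcsize(EVENT_FMT)
            EV_SYN, SYN_REPORT = 0x00, 0x00

            last = None
            with open(DEVICE, "rb") as f:
                for _ in range(500):            # look at a few hundred raw events
                    sec, usec, etype, code, value = struct.unpack(EVENT_FMT, f.read(EVENT_SIZE))
                    if etype == EV_SYN and code == SYN_REPORT:   # one SYN_REPORT per device report
                        t = sec + usec / 1e6
                        if last is not None:
                            print(f"report interval: {(t - last) * 1000.0:.3f} ms")
                        last = t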

            There really is a huge amount of placebo effect when it comes to latency. Even world-class gamers test hardware and tell you how one is so much quicker/better - because that manufacturer has printed the 1 kHz figure - while the other hardware might actually be the faster one if really tested.

            We people are quite easily fooled. It's the same thing with weight. Lift a heavy camera and you'll end up nodding "this is good shit - very good quality" compared to the lighter camera made with a plastic shell. At the same time, it's plastic shells that are used in helmets and not magnesium - because magnesium is brittle. But we still buy equipment with a heavy metal plate at the bottom just to trigger the "quality" senses in customers. Our minds are just very bad at making correct decisions - so many senses trick us and override the intelligent part of our brains.



            • #16
              Originally posted by zyxxel View Post

              It's possible to benchmark - but depending on the time resolution you need, the cost can matter a lot. With a reasonably cheap digital camera capable of 1000 frames/second it's possible to get quite good results. I have had reason to do similar things for other purposes, filming a LED that turns on when the input ("mouse key") is activated and then counting frames until the system response happened.

              One thing with keyboards is that many keyboards with a 1000 Hz report rate still don't read out the keys with that time resolution - or read out the keys but add key debounce - so it may still take quite a number of USB reports until the actual key press is sent. But the placebo effect remains - the keyboard owner is happy about the responsive keyboard with the fast 1 kHz report rate (and 1 kHz is a magic value for USB because that's the frame synchronization rate in the protocol specification).

              There really is a huge amount of placebo effect when it comes to latency. Even world-class gamers test hardware and tell you how one is so much quicker/better - because that manufacturer has printed the 1 kHz figure - while the other hardware might actually be the faster one if really tested.

              We people are quite easily fooled. It's the same thing with weight. Lift a heavy camera and you'll end up nodding "this is good shit - very good quality" compared to the lighter camera made with a plastic shell. At the same time, it's plastic shells that are used in helmets and not magnesium - because magnesium is brittle. But we still buy equipment with a heavy metal plate at the bottom just to trigger the "quality" senses in customers. Our minds are just very bad at making correct decisions - so many senses trick us and override the intelligent part of our brains.
              Even if this were all true, it is still sowing doubt and uncertainty. Can't you feel it?



              • #17
                Originally posted by duby229 View Post
                Another wall of bullshit from you.

                The bottom-line fact is that using Wayland feels terrible. If -anybody- doesn't believe that, then all they have to do is fire up a DE in a Wayland session for themselves. In my own testing Wayland is a horrible user experience, definitely -THE- worst user experience of any modern OS by far.

                You can waste your time benchmarking with your camera all you want, but while you're doing that, input lag on Wayland is still so terrible it's horribly unusable.
                By the way, duby229, you cannot really test "Wayland" as such; it is a protocol, not software.

                I have personally tested sway, Weston, GNOME, KDE, arcan (with X11 clients and Wayland clients) and the X11 server stuff.

                For input performance, arcan is one of the best; Wayland clients under it do better than under X11.

                The Clutter master clock supports three synchronization methods in order of preference: (a) Hardware presentation times (b) Swap throttling (c) Fake vsync at 60.00Hz

                This is a good read, because it showed me why there was always a glitch when I was benchmarking GNOME with a camera back then, whether on Wayland or X11, with a 60 Hz screen. The screen's timing by specification was 59.95 frames per second, but mutter in fake-vsync mode was really doing exactly 60 frames per second, so the drift of 0.05 frames per second added up to a glitched frame on that 60 Hz screen every 20 seconds. The 60 Hz, 120 Hz and 240 Hz screen values are in fact rounded-up figures, so you never really want to do exactly 60 fps on a 60 Hz screen, 120 fps on a 120 Hz screen or 240 fps on a 240 Hz screen, because it will glitch. Heck, an exact 60 fps on a 240 Hz screen will glitch. Yes, that glitch will be distracting you every 20 seconds or so.
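
                The 20-second figure falls straight out of the drift arithmetic; here is a quick sketch of it (the refresh values are the ones discussed above, not new measurements):
                Code:
                # Rendering at an exact 60.00 fps against a panel that really refreshes at
                # 59.95 Hz accumulates one whole frame of error every 1 / |60.00 - 59.95| seconds.
                def seconds_per_glitch(source_fps, actual_refresh_hz):
                    drift = abs(source_fps - actual_refresh_hz)   # frames of error gained per second
                    return float("inf") if drift == 0 else 1.0 / drift

                print(seconds_per_glitch(60.00, 59.95))   # 20.0 -> a skipped/repeated frame every 20 s
                print(seconds_per_glitch(60.00, 59.94))   # ~16.7 s for the common 59.94 Hz case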

                A GoPro HERO8 Black at roughly 350 dollars, with its 240 fps capture, is good enough to split the differences in input behaviour between the different Wayland compositors and X11 compositors, because the differences are not all that small, as anyone who has actually tested knows. Yes, they are countable in whole frames.

                Here is a fun fact: a compositing window manager under X11 is always at least one frame slower on output than X11 without a compositing window manager, even when you do compositor bypass; this is designed into the X11 protocol and obeyed by the X.Org X11 server. The Wayland protocol does not have this bug, so compositing under Wayland can be as fast as X11 without a compositing window manager. Yes, arcan supports X11 applications, but it does not obey this limitation.

                That is the thing, duby229: have you ever played with arcan? If you have not, maybe you should. After that you would learn that the X11 X.Org server solution is fairly much crap as well. The truly server-side, GPU-accelerated (shader) window decorations of arcan make a huge difference if you are going to do server-side decorations. Yes, this affects latency on window resizes massively.

                In fact, after taking a fairly high-speed (240-500 fps) video capture of arcan and comparing it to Windows/OS X, you will see defects in the Windows/OS X solutions as well.

                The bar for what we expect needs to be higher than what the X11 X.Org solutions are offering - heck, higher than what MS Windows and OS X are offering - as arcan shows that is truly achievable. Yes, some Wayland compositors can in fact put up a good showing against arcan as well. Please note I said some; to be truthful, most don't.


                The reality here is that a long list of bugs has been located and fixed in the X.Org X11 server through the Wayland process of re-implementing things. So X.Org X11 today is only as good as it is because of the Wayland work, and it can still improve a bit more yet, but some of the X11 protocol design issues cannot be overcome, at least not without breaking or throwing away parts of the protocol. Arcan, for example, supports X11 applications but does not support, and never will support, X11 window managers or X11 compositors. Arcan has thrown away and replaced those sections of the X11 specification.

                Wayland throws away the whole X11 protocol and starts over.
                Arcan throws away large sections of the X11 protocol.
                Xwayland is also, over time, working on throwing away the window manager and X11 compositor sections of the X11 protocol.

                So, interesting, right? For the parts of the X11 protocol they are throwing away, Wayland compositors/Xwayland and arcan are on the same path. There is no other way to fix the X11 protocol at this point than to get rid of sections of it for good.

                The X11 protocol has been on a diet for quite some time as sections of it get nuked out of existence. The time had to come for the X11 window manager and X11 compositor parts at some point.

                Originally posted by make_adobe_on_Linux! View Post
                Even if this were all true, it is still sowing doubt and uncertainty. Can't you feel it?
                Exactly what duby229 is really doing, particularly when he does not name which Wayland compositor he was testing, or when he tested it. Like GNOME on Wayland 2-3 years ago, stuck at an exact 60 fps that is wrong for everything - that was really bad. Interestingly enough, Weston's and GNOME's exact-60-fps mistake with fake vsync comes straight out of the upstream X.Org X11 server, and it's not the only bug like this.



                • #18
                  Originally posted by oiaohm View Post
                  Exactly what duby229 is really doing, particularly when he does not name which Wayland compositor he was testing, or when he tested it. Like GNOME on Wayland 2-3 years ago, stuck at an exact 60 fps that is wrong for everything - that was really bad. Interestingly enough, Weston's and GNOME's exact-60-fps mistake with fake vsync comes straight out of the upstream X.Org X11 server, and it's not the only bug like this.
                  Correct - it was not Wayland I was testing. The world has a huge number of systems where performance still matters - and developers need tools to figure out whether their system is fast enough, and if not, tools to figure out where the time is lost.



                  • #19
                    Higher polling rate mice (like 2000 Hz) can actually increase latency due to CPU interrupt load



                    • #20
                      Originally posted by Moscato View Post
                      Higher polling rate mice (like 2000 Hz) can actually increase latency due to CPU interrupt load
                      Mouse interrupts are not really a factor anyone needs to worry about in the kind of systems we are talking about here. It's waaaaay more important whether you have a network card with a good interrupt coalescing function or not, so the machine doesn't get one interrupt for every single network packet that is sent or received.
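
                      If anyone wants to see where their interrupt load actually comes from, the kernel already exposes the counters. A minimal sketch that samples /proc/interrupts over one second and prints the busiest lines (Linux-only, plain Python, no driver-specific assumptions):
                      Code:
                      import time

                      def read_irq_counts():
                          counts = {}
                          with open("/proc/interrupts") as f:
                              cpus = len(f.readline().split())   # header row: one column per CPU
                              for line in f:
                                  parts = line.split()
                                  if not parts or not parts[0].endswith(":"):
                                      continue
                                  irq = parts[0].rstrip(":")
                                  nums = [int(p) for p in parts[1:1 + cpus] if p.isdigit()]
                                  # total count across CPUs plus the human-readable source description
                                  counts[irq] = (sum(nums), " ".join(parts[1 + len(nums):]))
                          return counts

                      before = read_irq_counts()
                      time.sleep(1.0)
                      after = read_irq_counts()
                      deltas = []
                      for irq, (total, name) in after.items():
                          deltas.append((total - before.get(irq, (0, ""))[0], irq, name))
                      for rate, irq, name in sorted(deltas, reverse=True)[:10]:
                          if rate > 0:
                              print(f"{rate:8d} irq/s  {irq:>6}  {name}")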

                      And the next thing is how much the OS needs to do when the mouse is moved - with older Microsoft Word, you could see the CPU load reach 100% when the mouse was moved over Word, because that mouse move was reflected to thousands of graphical objects to let them figure out whether or not they were affected by the move. All an effect of a solution that didn't do a drill-down from the current mouse position but instead traversed the full tree of parent/child objects for event processing.
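
                      To make that design point concrete, here is a toy sketch (not Word's actual code, just an illustration with invented Widget objects) of the two dispatch strategies: asking every object in the tree versus drilling down only into the child under the pointer:
                      Code:
                      class Widget:
                          def __init__(self, x, y, w, h, children=()):
                              self.x, self.y, self.w, self.h = x, y, w, h
                              self.children = list(children)

                          def contains(self, px, py):
                              return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

                      def broadcast(root, px, py):
                          """Ask every widget in the whole tree: cost grows with the total object count."""
                          hits = [root] if root.contains(px, py) else []
                          for child in root.children:
                              hits += broadcast(child, px, py)
                          return hits

                      def drill_down(root, px, py):
                          """Descend only into the child under the pointer: cost grows with tree depth."""
                          if not root.contains(px, py):
                              return []
                          path = [root]
                          for child in root.children:
                              if child.contains(px, py):
                                  path += drill_down(child, px, py)
                                  break
                          return path

                      # For a window full of cells, drill_down touches a handful of widgets per mouse
                      # move, while broadcast touches every single one of them every time.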

