A Low-Latency Kernel For Linux Gaming

  • #11
    A low-latency kernel does not mean good for gaming.
    It does not mean high performance.

    It means events are guaranteed to trigger with low latency; it doesn't mean the work gets done faster, just that it starts sooner.

    Low latency is great for industrial applications, robotics, physical security systems, medical systems, and such. It's also good for audio production. It's not for gaming.
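
    To make the distinction concrete, here is a minimal sketch (not from the article) of what a latency-sensitive program such as an audio or robotics process typically asks the kernel for; it assumes permission to use SCHED_FIFO (root or a suitable rtprio limit):

    /* Minimal sketch: ask for bounded scheduling latency, not throughput.
     * Assumes permission to use SCHED_FIFO (root or an rtprio rlimit). */
    #include <sched.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void)
    {
        struct sched_param sp;
        memset(&sp, 0, sizeof(sp));
        sp.sched_priority = 80;        /* high real-time priority */

        /* Real-time scheduling: the kernel runs this task ahead of normal
         * ones as soon as it becomes runnable... */
        if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0) {
            perror("sched_setscheduler");
            return 1;
        }

        /* ...and locked memory keeps a page fault from adding an
         * unbounded stall. */
        if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0) {
            perror("mlockall");
            return 1;
        }

        puts("running with SCHED_FIFO priority 80, memory locked");
        /* Neither call makes the program compute anything faster; they only
         * bound how long a wakeup can be delayed. */
        return 0;
    }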

  • #12
    This article seems pointless. With maximum fps over 60 in all games, who cares about the exact number? Minimum frame rate or the already-mentioned frame jitter seems a much more suitable measure of the effect of an rt kernel to me.

  • #13
    Originally posted by del_diablo View Post
    Which is blatantly false. A monitor does not need input lag as severe as 100 ms before it becomes noticeable. All you need is a jitter of 1 ms --> 10 ms --> 1 ms --> 10 ms, and it should be noticeable that the input is quite unsmooth, especially if you are testing an application where the input matters (hardcore Quake FPS, anybody?).
    I want your hardware that jitters between 100 and 1000 frames/s (a 10 ms frame time is 100 fps; a 1 ms frame time is 1000 fps)...

  • #14
    Originally posted by DavidNielsen View Post
    It may be that what people are talking about in terms of improved gaming "performance" is not the framerate (which, unsurprisingly, stays unchanged or degrades a bit with preemption enabled, let alone if one were to install a proper -rt kernel). However, you might experience a situation where overall latency decreases, which might improve responsiveness to input devices, or the overall "smoothness" of the game might feel more right. Call it trading off excess fps for equally low-latency access (who really cares if you are pushing 172 fps if your mouse is jerky, when it could all be smooth at, say, 160 fps).
    That's what I thought when I read "low latency". I expected better response from controls and lower network latency. It would obviously have traded off fps and throughput to achieve this.

  • #15
    Originally posted by ownagefool View Post
    FYI, if we're seriously talking about a 30 ms kernel, I'd suspect we're already way too slow. We're talking about a group of people who raise their USB polling rates from 8 ms to 1-2 ms, use monitors that need to have < 10 ms lag, and play at high FPS (thus meaning < 10 ms delay between frames) with generally 100 Hz+ refresh rates. A 30 ms response time is already way too slow, though I suppose that's a maximum as opposed to an average.
    I believe the latency numbers for rt kernels on good hardware are measured in *microseconds*, say an average < 10 us and a maximum < 30 us on a Core 2 Duo E6300.
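
    For context, numbers like that come from measuring how late a timed wakeup actually arrives; the cyclictest tool from rt-tests does essentially this. A rough sketch of the idea (not the tool itself; the 1 ms interval and loop count are arbitrary):

    /* Rough sketch of a wakeup-latency measurement: sleep to an absolute
     * deadline, then check how far past it we actually woke up. */
    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    #define INTERVAL_NS 1000000L   /* ask to wake every 1 ms */
    #define LOOPS       10000

    static int64_t ns(const struct timespec *t)
    {
        return (int64_t)t->tv_sec * 1000000000LL + t->tv_nsec;
    }

    int main(void)
    {
        struct timespec next, now;
        int64_t lat, max = 0, sum = 0;

        clock_gettime(CLOCK_MONOTONIC, &next);
        for (int i = 0; i < LOOPS; i++) {
            /* advance the absolute deadline by one interval */
            next.tv_nsec += INTERVAL_NS;
            while (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec++;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
            clock_gettime(CLOCK_MONOTONIC, &now);

            lat = ns(&now) - ns(&next);   /* how late the wakeup was */
            sum += lat;
            if (lat > max)
                max = lat;
        }
        printf("avg %.1f us, max %.1f us\n",
               (double)sum / LOOPS / 1000.0, (double)max / 1000.0);
        return 0;
    }

    Run it under SCHED_FIFO on an -rt kernel versus a stock one and the difference tends to show up mainly in the max, not the average.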

  • #16
    Not sure it makes sense to benchmark a low-latency system if you don't have a latency benchmark or understand what latency is.

  • #17
    Originally posted by del_diablo View Post
    Which is blatantly false. A monitor does not need input lag as severe as 100 ms before it becomes noticeable. All you need is a jitter of 1 ms --> 10 ms --> 1 ms --> 10 ms, and it should be noticeable that the input is quite unsmooth, especially if you are testing an application where the input matters (hardcore Quake FPS, anybody?).
    Never mind that once you have gotten your mind into "ready mode" and are in a "flow of actions", the static reaction time is no longer there. Now, the 100 ms median is true if you are waiting for a twitch. But if you are in constant twitching movement, in a state where you have already processed all the information, 100 ms is not your reaction time.
    First off, I wasn't referring to games or computers. The numbers come from the book "Introduction to Human Factors and Ergonomics for Engineers". 100 ms as a reaction time is out of this world. Quite a few things happen between the moment you receive a stimulus and the moment you act on it. The human body has its limits.

  • #18
    Originally posted by ownagefool View Post
    FYI, if we're seriously talking about a 30 ms kernel, I'd suspect we're already way too slow. We're talking about a group of people who raise their USB polling rates from 8 ms to 1-2 ms, use monitors that need to have < 10 ms lag, and play at high FPS (thus meaning < 10 ms delay between frames) with generally 100 Hz+ refresh rates. A 30 ms response time is already way too slow, though I suppose that's a maximum as opposed to an average.
    I think you're mixing things up. Where are we actually too slow? The Linux kernel is much more responsive than Windows, but whether that matters for the things you described, I'm not so sure. When I had stuttering in Skyrim under Windows after the 1.5 patch, I had to limit the FPS and the game became playable again, so there are more important things affecting smoothness than kernel responsiveness. I bet your responsiveness will be messed up with a real-time kernel and games will simply be much worse. Just a bet, but I played a lot with custom kernels in the past and the generic one was usually the best when it comes to gaming.
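
    For what it's worth, a frame limiter helps smoothness because it paces frames to a fixed interval instead of letting them bunch up. A rough sketch of the idea, with 60 fps chosen arbitrarily and render_frame() standing in for the game's real work:

    /* Rough sketch of an fps limiter: pace frames to a fixed 60 Hz interval
     * using absolute deadlines, trading peak fps for even frame spacing.
     * render_frame() is a placeholder for the game's actual work. */
    #include <time.h>

    #define FRAME_NS (1000000000L / 60)   /* ~16.7 ms per frame */

    static void render_frame(void) { /* game work would go here */ }

    int main(void)
    {
        struct timespec next;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (;;) {
            render_frame();

            /* Advance the deadline by exactly one frame and sleep until it,
             * so a fast frame doesn't shorten the next interval. */
            next.tv_nsec += FRAME_NS;
            while (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec++;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        }
    }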

  • #19
    I agree with most of the previous posters: the benchmark only measures one aspect of many, and some of the others are much more important than "pure" fps.
    - input latency
    - input latency jitter
    - fps jitter
    Those three factors would be the most interesting (a rough way of measuring the last one is sketched below).
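
    To make "fps jitter" concrete: one way to quantify it is the spread of frame-to-frame times rather than their average. A rough sketch, assuming per-frame durations have already been collected (the sample numbers below are made up):

    /* Rough sketch: quantify fps jitter as the spread of frame times.
     * frame_ns[] holds made-up per-frame durations in nanoseconds. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* Example data: mostly ~16.7 ms frames with two 33 ms hitches. */
        const double frame_ns[] = { 16.7e6, 16.6e6, 33.4e6, 16.7e6,
                                    16.8e6, 33.3e6, 16.6e6, 16.7e6 };
        const int n = sizeof(frame_ns) / sizeof(frame_ns[0]);

        double sum = 0.0, sq = 0.0;
        for (int i = 0; i < n; i++)
            sum += frame_ns[i];
        double mean = sum / n;

        for (int i = 0; i < n; i++)
            sq += (frame_ns[i] - mean) * (frame_ns[i] - mean);
        double stddev = sqrt(sq / n);

        /* The average fps can look fine while the frame-time spread is
         * what the eye actually notices as stutter. */
        printf("avg %.1f fps, frame-time jitter %.2f ms\n",
               1e9 / mean, stddev / 1e6);
        return 0;
    }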

  • #20
    For everyone complaining about <1ms kernel lag, consider that Xorg with its defaults can cause lag up to 30ms:
    https://bugs.freedesktop.org/show_bug.cgi?id=46578

    The only way to avoid that in the current situation is to run a dedicated X server for your game. Then it can't ignore your client, because it's the only one there.
