A Low-Latency Kernel For Linux Gaming


  • #31
    Originally posted by kraftman View Post
    The sad thing is you're still mixing things up. The stock Linux kernel is much more responsive (exactly in terms of latency) than Windows kernels, so you don't have to use a low-latency one to make it better.
    There's nothing sad about me responding to someone who suggested that the RT kernel has a max latency of around 30ms, and stating that that's too slow for an interactive game. I never claimed those, nor any other figures, were true, so I don't particularly see your point that I'm mixing _anything_ up. Furthermore, I didn't claim that either Linux or Windows was more responsive, only that the numbers quoted were more competitive. I'd hazard a guess that the amount of work required to produce an interactive game may vary between platforms, but I don't particularly know anything about the platforms under the hood, so I wouldn't and didn't suggest either was faster. Perhaps you could read what I've said, and understand the context of the discussion, before you jump the gun in future. Furthermore, if you are going to disagree with me (or rather, in this case, with what you've imagined I've said) then you ought to actually back that up; otherwise you come across as a frothing-at-the-mouth fanboy, which will quickly get you ignored.

    @minuseins - Whilst I'd be all too happy to game at 10ms latency with regards to pretty much all my equipment, when faced with opponents who suggest the human eye can't see over 24fps, I think actually demonstrating that the differences between 100 and 1000 fps are measurable would be beneficial. Today we generally sit with game engines that have 33-50ms tick rates and monitors with around the same delay, and getting anything faster is significantly harder than it was 10 years ago. The industry is going in the wrong direction because of the "good enough" myths, and as someone who'd like to get things faster, I don't think there's any issue with setting the ideal speeds fairly high, as long as it gets us moving in the right direction.

    With regards to latency, modern games generally use very little network traffic; however, you are right that there is quite a difference between 33ms tick rates and 1ms tick rates - you'd be going from roughly 30 updates a second to 1000 on a 32-man server. I'm not sure off the top of my head whether that'd be completely unfeasible, but it would exclude many people from online gaming; as a competitive LAN player, though, the option would be nice. On top of that, input lag is additional to network latency (and server tick rates), as you're responding to the data you've been given. As you previously said, you're already behind, so limiting any further input lag is still beneficial and shouldn't be ignored. Alas, the competitive gamer doesn't get a whole lot of say in the matter, so at best I can cross my fingers and hope they don't artificially limit those OLEDs (if they ever release computer monitors) to stupidly low refresh rates (i.e. 60Hz, which freaking sucks).



    • #32
      Modern games have stabilized around two numbers: 30 FPS (mostly console games) and 60 FPS (PC games). 30 FPS is the lowest acceptable FPS, and it's chosen out of necessity, because it turns out that some people prefer (with their dollars) richer graphics (possible with 33 ms per frame) over a smoother experience.

      As for network games, network latency (ping to the server) is a rather orthogonal aspect to visual latency, because of two factors:

      1) ALL network games [try hard to] predict the server response in advance (by essentially running the same code - on possibly stale/incomplete data - on the clients) and then correct/compensate/lerp after getting authoritative server data. Games would be unplayable if there was a delay of 20-50 ms between you pressing forward and you actually moving forward (see the sketch after this list).

      2) The game tick (i.e. when game entities are updated, when they "think" - which in network client-server games only happens on servers) already happens at a different (often lower) rate than drawing. For user experience, it's much more important to update animations (which are often local to clients), non-game-affecting physics (particles, smoke, etc.), and other "visual" things in a network game, and that happens client-side.
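
      To make point 1 concrete, here is a minimal sketch in C of the predict-then-reconcile idea; the struct layout, the blend factor and the movement numbers are illustrative assumptions, not any particular engine's API:

      Code:
      #include <stdio.h>

      /* Illustrative only: the client runs the same movement code the server
       * runs, immediately, and later blends toward the authoritative state. */
      typedef struct { float x, y; } Vec2;

      typedef struct {
          Vec2 pos;
          Vec2 vel;
      } PlayerState;

      /* Predict locally so the player sees a response on the very next frame. */
      static void predict(PlayerState *s, Vec2 input, float dt)
      {
          s->vel = input;
          s->pos.x += s->vel.x * dt;
          s->pos.y += s->vel.y * dt;
      }

      /* When the (older) authoritative state arrives, don't snap to it:
       * blend toward it so the correction isn't visible as a jump. */
      static void reconcile(PlayerState *local, const PlayerState *server, float blend)
      {
          local->pos.x += (server->pos.x - local->pos.x) * blend;
          local->pos.y += (server->pos.y - local->pos.y) * blend;
      }

      int main(void)
      {
          PlayerState client = { {0, 0}, {0, 0} };
          PlayerState server = { {0, 0}, {0, 0} };
          Vec2 forward = { 1.0f, 0.0f };

          /* The client simulates three 60 Hz frames (~50 ms) before the
           * server's answer to the first input ever arrives. */
          for (int i = 0; i < 3; i++)
              predict(&client, forward, 1.0f / 60.0f);

          /* Authoritative state arrives, slightly behind the prediction. */
          server.pos.x = 0.03f;
          reconcile(&client, &server, 0.1f);

          printf("client x after reconciliation: %.4f\n", client.pos.x);
          return 0;
      }

      The only point is the shape: input is applied locally right away, and the server's word arrives later as a correction, so ping doesn't add to the input-to-photon delay of your own movement.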


      P.S. When talking about games, it's misleading to use FPS. It's better to compare frame time in milliseconds, and for latency tests it's better to compare the standard deviation of that value rather than an "average FPS".
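
      For instance, this tiny C program (the frame times are made up) shows why two runs with the same "average 60 FPS" can feel completely different:

      Code:
      #include <math.h>
      #include <stdio.h>

      int main(void)
      {
          /* Hypothetical frame times in ms; both runs average ~16.7 ms (60 FPS). */
          const double smooth[]  = { 16.6, 16.7, 16.6, 16.8, 16.6, 16.7 };
          const double jittery[] = { 10.0, 30.0, 10.0, 28.0, 11.0, 11.0 };
          const double *runs[]   = { smooth, jittery };
          const char   *names[]  = { "smooth", "jittery" };
          const int n = 6;

          for (int r = 0; r < 2; r++) {
              double mean = 0.0, var = 0.0;
              for (int i = 0; i < n; i++)
                  mean += runs[r][i];
              mean /= n;
              for (int i = 0; i < n; i++)
                  var += (runs[r][i] - mean) * (runs[r][i] - mean);
              var /= n;
              printf("%-7s mean %.1f ms (%.0f FPS), std dev %.1f ms\n",
                     names[r], mean, 1000.0 / mean, sqrt(var));
          }
          return 0;
      }

      Same "average FPS", very different standard deviation - the second run is the one people call stuttery.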



      • #33
        Originally posted by JanC View Post
        I want your hardware that jitters between 100-1000 frames/s...
        What if it's just a custom glxgears demo? 1600fps is fairly easy to reach with that one; then just add a rotatable camera and a script which produces the weird sine curve of reaction time. It would be really annoying to move smoothly.



        • #34
          Originally posted by kraftman View Post
          The sad thing is you're still mixing things up. The stock Linux kernel is much more responsive (exactly in terms of latency) than Windows kernels, so you don't have to use a low-latency one to make it better.
          By the way, talking about "Windows responsiveness" is dumb...

          Let me explain: have you tried doing a search in the registry of Win7 x64? Have you tried saving it?
          A few days ago I used NT Regopt http://www.larshederer.homepage.t-online.de/erunt/

          At the end it shows a box with the results, and I saw that the registry of my newly installed Win7 is above 2 GB...

          M$ does worse with each Windows release.

          I can't wait to play with Linux and Steam ;']



          • #35
            Just my two cents, as a guy with his own thoughts; I didn't study in any of these fields.

            Generally - because some people are like "1ms reaction time must mean 1000fps" - this is generally _not_ true. In a "normal/simple" game engine you have a step/redraw loop that tries to use the system to the max, so the two are coupled to some extent. But it would also be possible to draw a frame 1ms after the input is recorded (this is the lag) and then wait 9ms before fetching input again (which equals 100fps being drawn), during which you can e.g. run other stuff that doesn't need to be "this instant" (animation updates, for instance), so the loop would look something like this:
            input --1ms--> action/redraw completed --9ms--> physics/animation/shadow pre-calculations, ...
            So you need hardware that can push out a frame in less than 1ms (let's say 0.2 ms for updating content and 0.8 ms for drawing), but you then don't have to drive it in a tight loop - rather in a burst mode (and with some logic I am sure you can increase the available render time for at least part of the scene/output, spreading the load further without sacrificing latency). A rough sketch of such a loop is below.
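
            Here is a rough sketch of that burst-style loop in C, assuming a 10 ms (100 FPS) cadence and POSIX clock_nanosleep for the pacing; the work functions are empty placeholders, only the ordering and timing structure are the point:

            Code:
            #define _GNU_SOURCE
            #include <time.h>

            /* Placeholders for the real work. */
            static void poll_input_and_redraw(void) { /* the ~1 ms latency-critical path  */ }
            static void background_work(void)       { /* animations, shadow pre-calc, ... */ }

            static void add_ns(struct timespec *t, long ns)
            {
                t->tv_nsec += ns;
                while (t->tv_nsec >= 1000000000L) {
                    t->tv_nsec -= 1000000000L;
                    t->tv_sec  += 1;
                }
            }

            int main(void)
            {
                const long frame_ns = 10L * 1000000L;   /* 10 ms per frame = 100 FPS */
                struct timespec next;
                clock_gettime(CLOCK_MONOTONIC, &next);

                for (int frame = 0; frame < 1000; frame++) {
                    /* Burst: read input and push the frame out immediately (~1 ms). */
                    poll_input_and_redraw();

                    /* Spend the rest of the 10 ms budget on non-urgent work. */
                    background_work();

                    /* Sleep until the next frame boundary instead of spinning. */
                    add_ns(&next, frame_ns);
                    clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
                }
                return 0;
            }

            The input-to-frame path stays around 1 ms even though only 100 frames a second are produced, which is exactly the decoupling described above.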



            • #36
              Phoronix could learn a thing or two

              Check out techreport.com's GPU testing. They do meaningful benchmarks showing minimum framerates and render latency.

              That's where it really counts for gaming.

              I suspect this kernel would show better minimum fps and less jitter, as others have mentioned.

              I'd also be curious to see this compared to the -ck low-latency kernel patch set.



              • #37
                What a strange thread. A collection of unverifiable statements of fact and logical arguments that challenge the intellect. Low latency is a good thing. High throughput is a good thing. The lower you can make the latency without affecting throughput, the better. In certain cases where throughput is static or completely arbitrary, additional latency-related gains can be made. In cases where latency is arbitrary, throughput-related gains can be made.

                There's a theoretical limit to both latency and throughput. Unless some new magical scheduler algorithm or multifaceted mystical hardware comes about, we're not going to get an order-of-magnitude improvement out of today's commodity hardware. It's my opinion that we should stop trying to squeeze that extra 10% out of an already well-optimized kernel (with available low-latency options), and start focusing on new tech and/or reducing the things in the kernel that cause the latency to begin with. Schedulers are fun and all, but we're not going to improve much if we keep re-implementing the same thing over and over again. Fair queuing (or whatever CFS is doing) was probably the last significant leap we'll see for a while.

                In the meantime, if you want your game to run smoothly when background tasks are running, Linux provides a "nice" way of facilitating this.
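
                For what it's worth, the same thing nice(1) does from the shell (e.g. launching a background compile with nice -n 19) can also be done from code via setpriority(2); a minimal sketch, assuming the background task is the one that should yield:

                Code:
                #include <stdio.h>
                #include <sys/resource.h>

                int main(void)
                {
                    /* Give this (background) process the lowest scheduling priority,
                     * the programmatic equivalent of launching it with `nice -n 19`. */
                    if (setpriority(PRIO_PROCESS, 0, 19) != 0) {
                        perror("setpriority");
                        return 1;
                    }

                    printf("running at nice 19; the game should keep the CPU under load\n");
                    /* ...heavy background work would go here... */
                    return 0;
                }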




                • #38
                  It's too bad that some disparage Xorg so, because, as is common, problems coexist with power: while Xorg may seem an overlapping, disorganized mess, it is nonetheless wide open, full of config power, and still being developed and refined. Wayland may be an improvement, but it must also come at a cost. So whether you feel powerful or stuck is no matter, because mouse movement/response is eminently controllable in Xorg. I used to use xset, but apparently it is no longer enough on its own; the increased power comes with new rules. If you want to own your mouse, especially on an rt/low-latency kernel, check this out: http://tinyurl.com/nzahdtm



                  • #39
                    Originally posted by Paradox Uncreated View Post
                    Latency

                    And frames above 73Hz really aren't useful anyway.
                    Just FTR, that statement may only have meat on the bones if your refresh rate is at or under 73Hz. Back when games allowed extreme user control - and it's still in effect today - setting FPS to match the refresh rate made for extreme smoothness. For example, capping Quake 3 Arena at 120FPS and setting the refresh rate to 120Hz makes tricks possible that are extremely difficult or even impossible at different settings. It is my understanding that this creates a hard sync in timing, and I can assure you, whatever the cause, you can feel it. This of course assumes your monitor is capable of 120Hz.

                    Note - Yeah, I know Q3 is a 20+ year old game, and even those who still play it are mostly on Quake Live, but the effect is the same and, AFAIK, independent of the game, being a natural timing issue.



                    • #40
                      Originally posted by RCL_ View Post
                      Modern games have stabilized around two numbers: 30 FPS (mostly console games) and 60 FPS (PC games). 30 FPS is the lowest acceptable FPS, and it's chosen out of necessity, because it turns out that some people prefer (with their dollars) richer graphics (possible with 33 ms per frame) over a smoother experience.

                      As for network games, network latency (ping to the server) is a rather orthogonal aspect to visual latency, because of two factors:

                      1) ALL network games [try hard to] predict the server response in advance (by essentially running the same code - on possibly stale/incomplete data - on the clients) and then correct/compensate/lerp after getting authoritative server data. Games would be unplayable if there was a delay of 20-50 ms between you pressing forward and you actually moving forward.

                      2) The game tick (i.e. when game entities are updated, when they "think" - which in network client-server games only happens on servers) already happens at a different (often lower) rate than drawing. For user experience, it's much more important to update animations (which are often local to clients), non-game-affecting physics (particles, smoke, etc.), and other "visual" things in a network game, and that happens client-side.


                      P.S. When talking about games, it's misleading to use FPS. It's better to compare frame time in milliseconds, and for latency tests it's better to compare the standard deviation of that value rather than an "average FPS".
                      It used to be 60 for PC. It's tilting more and more towards 120Hz and 144Hz; there are a bunch of such specialized monitors on the gaming peripherals market.

