Benchmarking The Ubuntu "Low-Jitter" Linux Kernel


  • #21
    Which all leads to a single conclusion - we need a test that measures delay in kernel responses, not kernel throughput/raw performance.



    • #22
      @Paradox

      Please create an automatic test for this jitter measurement. We have the doom3 source, so it should be very easy (a rough sketch of the idea follows the steps below):

      1. Insert timing calls before and after each frame.
      2. Keep track of the max and the average.
      3. At the end of a timedemo, calculate max - avg.
      4. Print that difference both as microseconds and as a percentage of the average frame time. "Jitter for timedemo1469 was 1500 usec, or 15%".
      5. ???
      6. PROFIT
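
      A minimal sketch of what those steps could look like, assuming a generic clock_gettime() timer rather than doom3's own timing code; the now_usec() helper and the nanosleep() stand-in for the engine's frame function are purely illustrative, not actual doom3 source:

      /* Hypothetical sketch of the per-frame jitter measurement described above
       * -- not actual doom3 code. nanosleep() stands in for rendering a frame;
       * in the real engine the timing calls would wrap each frame of the timedemo. */
      #include <stdint.h>
      #include <stdio.h>
      #include <time.h>

      static uint64_t now_usec(void)
      {
          struct timespec ts;
          clock_gettime(CLOCK_MONOTONIC, &ts);
          return (uint64_t)ts.tv_sec * 1000000ull + (uint64_t)ts.tv_nsec / 1000;
      }

      int main(void)
      {
          const unsigned frames = 200;
          uint64_t max_frame = 0, total = 0;

          for (unsigned i = 0; i < frames; i++) {
              uint64_t start = now_usec();

              /* Stand-in for the engine's frame function: ~16-22 ms per "frame". */
              struct timespec fake_frame = { 0, (16 + (long)(i % 7)) * 1000000L };
              nanosleep(&fake_frame, NULL);

              uint64_t dt = now_usec() - start;   /* duration of this frame */
              total += dt;
              if (dt > max_frame)
                  max_frame = dt;
          }

          uint64_t avg    = total / frames;
          uint64_t jitter = max_frame - avg;      /* worst frame vs. the average */

          printf("Jitter for this run was %llu usec, or %.1f%%\n",
                 (unsigned long long)jitter, 100.0 * (double)jitter / (double)avg);
          return 0;
      }

      On older glibc this may need -lrt to link clock_gettime(); newer ones don't.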



      • #23
        Wrong Benchmarks

        Phoronix Test Suite is awesome sauce, of course, but it's no good if you run the wrong tests.

        In the future, when testing latency, these are the benchmarks that would be most relevant:
        1. Add Cyclictest from the Realtime Linux wiki to the PTS (if not already present). This is the standard Linux latency benchmark.
        2. For game benchmarks, report the minimum FPS, median, and standard deviation instead of just the average FPS. Most importantly, you want to know the single longest frame time, and the 5th percentile would be useful as well (a small sketch of computing these follows this list).
        3. Throughput benchmarks should only be included as an afterthought, if at all. They don't measure the important variable.
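
        A minimal sketch of how those statistics could be computed, assuming the per-frame times have already been recorded somewhere (the hard-coded frame_ms values are made up purely for illustration); note that the 5th-percentile FPS corresponds to the 95th-percentile frame time:

        /* Hypothetical sketch, not PTS code: given per-frame times in
         * milliseconds, report min FPS, median, standard deviation, the worst
         * single frame, and the 5th-percentile FPS instead of just the average. */
        #include <math.h>
        #include <stdio.h>
        #include <stdlib.h>

        static int cmp_double(const void *a, const void *b)
        {
            double x = *(const double *)a, y = *(const double *)b;
            return (x > y) - (x < y);
        }

        int main(void)
        {
            /* Made-up frame times in ms; in practice these come from the game's log. */
            double frame_ms[] = { 16.6, 16.9, 16.4, 33.1, 16.7, 17.0, 16.5, 16.8 };
            size_t n = sizeof(frame_ms) / sizeof(frame_ms[0]);

            double sum = 0.0, sq = 0.0, worst = 0.0;
            for (size_t i = 0; i < n; i++) {
                sum += frame_ms[i];
                sq  += frame_ms[i] * frame_ms[i];
                if (frame_ms[i] > worst)
                    worst = frame_ms[i];
            }
            double mean   = sum / n;
            double stddev = sqrt(sq / n - mean * mean);

            qsort(frame_ms, n, sizeof(double), cmp_double);
            double median = frame_ms[n / 2];
            /* The 95th-percentile frame time gives the 5th-percentile FPS. */
            double p95 = frame_ms[(size_t)(0.95 * (double)(n - 1))];

            printf("avg FPS %.1f, min FPS %.1f (worst frame %.1f ms)\n",
                   1000.0 / mean, 1000.0 / worst, worst);
            printf("median frame %.1f ms, stddev %.2f ms, 5th-percentile FPS %.1f\n",
                   median, stddev, 1000.0 / p95);
            return 0;
        }

        It needs -lm to link; the single 33 ms spike in the example barely moves the average but dominates the min FPS and the worst-frame figure, which is the whole point.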



        • #24
          Originally posted by Paradox Uncreated View Post
          And as I said elsewhere, if it comes down to a choice between latency/max jitter and performance, 0.2ms (200uS) is where I stop caring.
          So you play your games at 5000 FPS? (0.2 ms per frame works out to 1/0.0002 s = 5000 frames per second.)



          • #25
            Two situations here:

            1) You have 30 FPS in game X, but from time to time you see lags (usually when lots of things happen == the most important parts of gameplay).

            2) You have 25 FPS in game X, all smooth, i.e. from any given frame to the next the same amount of time elapses.

            Which one do you choose?

            Well, that depends on the type of game. Solitaire.... But generally, games that require immediate action based on sound and video benefit from scenario 2).

            That is why good (hardware) benchmarking sites are moving away from reporting plain FPS and present some mathematical derivations, like medians, averages, or min/max.



            • #26
              Originally posted by Paradox Uncreated View Post
              On c64 I remember you had to do clever stuff with interrupts to do smooth scrolling. What has happened since then? Someone please update me!
              Software blitting, due to insanely high CPU speeds and video memory throughput :-P It started around the time of the 486 and ended again when 3D accelerators came along.



              • #27
                I would really like to know which benchmarks you run to compare kernels. I just played a few levels using dhewm3 on a below-midrange card (GT 630, Kepler) and still got 60 fps with a kernel using the Ubuntu default config. I did not notice input lag or anything of the sort. As Doom 3 is very rarely played online (4 players max with the default game, 8 with the addon), maybe you could say something about Quake Live - but there the gfx card does not need to be as fast, and I doubt the kernel config is so extremely important. Right now the default Ubuntu config means a 250 Hz timer setting, which is 4 times faster than a TFT's refresh rate. What changes do you make to your kernel?



                • #28
                  Originally posted by przemoli View Post
                  Two situations here:

                  1) You have 30 FPS in game X, but from time to time you see lags (usually when lots of things happen == the most important parts of gameplay).

                  2) You have 25 FPS in game X, all smooth, i.e. from any given frame to the next the same amount of time elapses.

                  Which one do you choose?

                  Well, that depends on the type of game. Solitaire.... But generally, games that require immediate action based on sound and video benefit from scenario 2).

                  That is why good (hardware) benchmarking sites are moving away from reporting plain FPS and present some mathematical derivations, like medians, averages, or min/max.
                  But the benchmark does include a standard error. And, looking at the full results, it does show that the low-jitter kernel usually reduces it, but the difference is very slight. On Prey it seems to make the largest difference, smoothing things out by 0.06 FPS while eating 0.36 FPS of average framerate. On the other hand, with Xonotic at Ultra quality it makes everything worse, both the framerate and its smoothness.



                  • #29
                    Originally posted by GreatEmerald View Post
                    But the benchmark does include a standard error. And, looking at the full results, it does show that the low-jitter kernel usually reduces it, but the difference is very slight. On Prey it seems to make the largest difference, smoothing things out by 0.06 FPS while eating 0.36 FPS of average framerate. On the other hand, with Xonotic at Ultra quality it makes everything worse, both the framerate and its smoothness.
                    We want deviation measured from one frame to the next within a single benchmark run, not from one run to the next (a small sketch of the difference follows below).
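
                    A minimal sketch of that distinction, with made-up frame times: the benchmark's standard error describes how the average FPS varies from run to run, while the figure computed below describes how much consecutive frame times vary within a single run:

                    /* Hypothetical sketch: standard deviation of consecutive
                     * frame-time differences within one run, i.e. frame-to-frame
                     * jitter, not run-to-run standard error. Values are made up. */
                    #include <math.h>
                    #include <stdio.h>

                    int main(void)
                    {
                        double frame_ms[] = { 16.6, 16.9, 16.4, 33.1, 16.7, 17.0 };
                        int n = sizeof(frame_ms) / sizeof(frame_ms[0]);

                        double sum = 0.0, sq = 0.0;
                        for (int i = 1; i < n; i++) {
                            double delta = frame_ms[i] - frame_ms[i - 1]; /* frame-to-frame change */
                            sum += delta;
                            sq  += delta * delta;
                        }
                        double mean   = sum / (n - 1);
                        double stddev = sqrt(sq / (n - 1) - mean * mean);

                        printf("frame-to-frame deviation: %.2f ms\n", stddev);
                        return 0;
                    }

                    An average taken over whole runs hides the single 33 ms spike in this example entirely, which is exactly the point being made here.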



                    • #30
                      @Paradox Uncreated

                      I don't fully get what you mean by jitter. When you load assets for the first time you certainly get a delay - but that has got nothing to do with the kernel. Even on my SSD I get slightly different benchmark results when running a timedemo for the first time. How on earth do you get your values?!

