Looking At An Early Performance Regression In Linux 5.13 - Scheduler Related


  • Looking At An Early Performance Regression In Linux 5.13 - Scheduler Related

    Phoronix: Looking At An Early Performance Regression In Linux 5.13 - Scheduler Related

    Since the Linux 5.13 merge window began settling down and especially now with 5.13-rc1 out the door, I've been ramping up performance testing of the Linux 5.13 kernel. So far I've been seeing one area where the kernel is regressing, and it stems from the scheduler changes this cycle...


  • #2
    Thanks a lot Michael

    • #3
      Luckily the Linux community has Michael; without him there would be unnoticed regressions.

      • #4
        This needs a post on LKML or a bug report at https://bugzilla.kernel.org/

        • #5
          Man, I feel like the Linux community would really benefit from having something like this run automatically. It would probably cost too much to run as full CI, but maybe benchmarks could automatically be run on every release candidate?
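          A rough sketch of that idea, assuming the Phoronix Test Suite is already installed and batch mode has been configured once with phoronix-test-suite batch-setup. The kernel.org releases.json feed, the state-file path, and the two test profiles are only illustrative choices, and the script just runs the benchmarks; it assumes the new -rc kernel has already been built, installed, and booted on the test box:

          #!/usr/bin/env python3
          # Sketch: kick off a PTS benchmark batch whenever a new mainline -rc appears.
          import json
          import os
          import subprocess
          import urllib.request

          RELEASES_URL = "https://www.kernel.org/releases.json"           # kernel.org release feed
          STATE_FILE = os.path.expanduser("~/.last-benchmarked-kernel")   # remembers the last version tested
          TESTS = ["pts/build-linux-kernel", "pts/hackbench"]             # example scheduler-sensitive profiles

          def latest_mainline_version():
              # The newest mainline entry is usually the current -rc tag.
              with urllib.request.urlopen(RELEASES_URL) as resp:
                  data = json.load(resp)
              for rel in data["releases"]:
                  if rel["moniker"] == "mainline":
                      return rel["version"]
              raise RuntimeError("no mainline release listed")

          def already_benchmarked(version):
              try:
                  return open(STATE_FILE).read().strip() == version
              except FileNotFoundError:
                  return False

          def run_benchmarks(version):
              # TEST_RESULTS_NAME / TEST_RESULTS_IDENTIFIER are PTS batch-mode
              # variables used to label the result file.
              env = dict(os.environ,
                         TEST_RESULTS_NAME="linux-" + version,
                         TEST_RESULTS_IDENTIFIER="linux-" + version)
              subprocess.run(["phoronix-test-suite", "batch-benchmark", *TESTS],
                             env=env, check=True)
              with open(STATE_FILE, "w") as f:
                  f.write(version)

          if __name__ == "__main__":
              version = latest_mainline_version()
              if not already_benchmarked(version):
                  run_benchmarks(version)

          Dropped into a daily cron job on each test machine, something like that would cover the "every release candidate" case without the cost of per-commit CI.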

          • #6
            I still don't understand why the Linux Foundation has millions in funding and cannot give something to Michael for his amazing work finding all these problems!

            • #7
              Originally posted by iskra32:
              Man, I feel like the Linux community would really benefit from having something like this run automatically. It would probably cost too much to run as full CI, but maybe benchmarks could automatically be run on every release candidate?
              Michael, that comment has me curious about how long these tests took you, whether a higher core count system would run them faster, and, based on those answers, why weekly, biweekly, or monthly tests aren't done. With the number of issues I know you've caught through your benchmarks, it seems like it would be a no-brainer for them to do some sort of regularly scheduled benchmarking.

              • #8
                Sometimes I have to wonder what kind of unit tests the developers run before they commit these types of changes. I realize that many times it is a real pain in the butt, but that doesn't excuse either not running them or having poorly suited unit tests. Is there a way we can petition the Linux Foundation to employ Michael for this valuable work he does?

                • #9
                  Originally posted by iskra32:
                  Man, I feel like the Linux community would really benefit from having something like this run automatically. It would probably cost too much to run as full CI, but maybe benchmarks could automatically be run on every release candidate?
                  I used to run my kernel benchmarks on various systems daily with PTS + Phoromatic at LinuxBenchmarking.com, but ultimately it was too expensive and there was no corporate support, so I had to quit the effort.
                  Michael Larabel
                  https://www.michaellarabel.com/

                  • #10
                    Originally posted by skeevy420:

                    Michael, that comment has me curious about how long these tests took you, whether a higher core count system would run them faster, and, based on those answers, why weekly, biweekly, or monthly tests aren't done. With the number of issues I know you've caught through your benchmarks, it seems like it would be a no-brainer for them to do some sort of regularly scheduled benchmarking.
                    Higher core count systems help with running things faster, but not with all bugs... You really need a mix of systems; the CPPC frequency invariance issue, for example, was spotted in part by its impact on smaller systems. That matters especially with many of the 'corporate' developers testing mostly on high core count servers and less so on smaller or desktop systems.

                    But yes, it would be trivial to turn more daily/weekly testing back on if I had the resources.
                    Michael Larabel
                    https://www.michaellarabel.com/
