Linux 4.2 vs. 4.3 Kernel Benchmarks On Other Systems


  • Linux 4.2 vs. 4.3 Kernel Benchmarks On Other Systems

    Phoronix: Linux 4.2 vs. 4.3 Kernel Benchmarks On Other Systems

    Last week I delivered some Linux 4.3 Git kernel benchmarks on Intel Skylake comparing it to Linux 4.2 stable. However, for those not yet on Intel's latest generation of processors, here are some Linux 4.2 vs. Linux 4.3 benchmarks with older hardware...


  • #2
    You should add support for perf or profiling to the test suite, then add an option to compare where most time was spent when running the tests and to identify where the regressions were introduced.
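
    As a rough, hypothetical sketch of that idea (not part of PTS; the wrapper and script names are made up), a benchmark command could be run under perf stat and its counters saved per kernel, so the same workload measured on 4.2 and 4.3 can be diffed:

    #!/usr/bin/env python3
    # Hypothetical wrapper (not part of PTS): run a benchmark command under
    # perf stat and return its counters as a dict, so the same workload
    # measured on two kernels can be compared side by side.
    import csv
    import subprocess
    import sys

    def perf_stat(cmd):
        """Run cmd under 'perf stat -x,' (CSV output) and parse the counters."""
        result = subprocess.run(
            ["perf", "stat", "-x", ",", "--"] + cmd,
            stdout=subprocess.DEVNULL,   # the benchmark's own output is not needed
            stderr=subprocess.PIPE,      # perf stat writes its report to stderr
            text=True,
        )
        counters = {}
        for row in csv.reader(result.stderr.splitlines()):
            # Rows look like: value,unit,event,...  Skip comment lines and
            # counters reported as <not counted> or <not supported>.
            if len(row) < 3 or row[0].startswith("#"):
                continue
            try:
                counters[row[2]] = float(row[0])
            except ValueError:
                continue
        return counters

    if __name__ == "__main__":
        # Example (script name made up): python3 perf_wrap.py ./run-benchmark.sh
        print(perf_stat(sys.argv[1:]))

    Diffing the dicts from a 4.2 boot and a 4.3 boot would show which counters (cache misses, context switches, and so on) moved along with the regression.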



    • #3
      Originally posted by mlau:
      You should add support for perf or profiling to the test suite, then add an option to compare where most time was spent when running the tests and to identify where the regressions were introduced.
      Oh, I'd already be happy if he just went ahead and fixed the regression. *lol*



      • #4
        Originally posted by mlau:
        You should add support for perf or profiling to the test suite, then add an option to compare where most time was spent when running the tests and to identify where the regressions were introduced.
        There's automated bisecting support, though it isn't used for most articles since it still takes a fair amount of time to run. But perf could also be added as a module to PTS if I had the time or support/sponsors.
        Michael Larabel
        https://www.michaellarabel.com/
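
        As a general illustration of scripted bisection over a kernel tree (this is not how PTS implements it; the workload script, baseline figure, and threshold below are assumptions), a git bisect run helper might look roughly like this:

        #!/usr/bin/env python3
        # Hypothetical 'git bisect run' helper for a kernel performance regression.
        # Not PTS code; run-workload.sh, the baseline, and the threshold are made up.
        # Usage: git bisect start v4.3-rc1 v4.2 && git bisect run ./bisect_step.py
        import subprocess
        import sys

        BASELINE_SECONDS = 42.0   # time measured on the known-good kernel (assumed)
        TOLERANCE = 0.05          # treat anything >5% slower than baseline as "bad"

        def benchmark_seconds():
            # Placeholder: time the workload with GNU time; a real setup would first
            # install and reboot into the kernel built from the commit under test.
            out = subprocess.run(["/usr/bin/time", "-f", "%e", "./run-workload.sh"],
                                 capture_output=True, text=True, check=True)
            return float(out.stderr.strip().splitlines()[-1])  # GNU time reports on stderr

        if __name__ == "__main__":
            build = subprocess.run("make -j$(nproc) bzImage modules", shell=True)
            if build.returncode != 0:
                sys.exit(125)                 # exit 125 tells git bisect to skip this commit
            slowdown = (benchmark_seconds() - BASELINE_SECONDS) / BASELINE_SECONDS
            sys.exit(1 if slowdown > TOLERANCE else 0)  # non-zero marks the commit "bad"

        git bisect then walks the commit range on its own, rebuilding and re-measuring at every step, which is why it still takes a fair amount of time even when fully scripted.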



        • #5
          Originally posted by mlau:
          You should add support for perf or profiling to the test suite, then add an option to compare where most time was spent when running the tests and to identify where the regressions were introduced.
          Ta-da, example: http://www.phoronix.com/scan.php?pag...f-module&num=1
          Michael Larabel
          https://www.michaellarabel.com/



          • #6
            Originally posted by Michael:
            There's automated bisecting support, though it isn't used for most articles since it still takes a fair amount of time to run. But perf could also be added as a module to PTS if I had the time or support/sponsors.
            An interesting fact about LLVM is that it has infrastructure for precisely this sort of thing. The infrastructure was originally Apple-internal: a cluster holding literally around a thousand LLVM builds, plus automatic machinery for bisecting through those old builds to find where a particular regression was introduced, once it was determined to be of interest. Apple recently opened this up, publishing the tools that perform the bisection and putting all the LLVM builds onto an Amazon machine.
            I find this all really neat: both that the tools exist and are used this way, and that Amazon virtual machines can host, continually update, and run a thousand binaries for public use.
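
            A tiny sketch of the general idea (the snapshot names and the is_regressed() check are made up): because the builds already exist, each bisection step is just fetch-and-test, so a plain binary search over the ordered list of snapshots is enough:

            #!/usr/bin/env python3
            # Hypothetical sketch of bisecting over pre-built compiler snapshots
            # instead of rebuilding each revision; the snapshot IDs and the
            # regression test are made up for illustration.
            def first_bad(snapshots, is_regressed):
                """Return the first snapshot for which is_regressed() is True.
                Assumes snapshots are ordered oldest to newest, the first one is
                good, the last one is bad, and the regression persists once introduced."""
                lo, hi = 0, len(snapshots) - 1     # invariant: lo is good, hi is bad
                while lo + 1 < hi:
                    mid = (lo + hi) // 2
                    if is_regressed(snapshots[mid]):
                        hi = mid                   # regression already present at mid
                    else:
                        lo = mid                   # mid still behaves like the baseline
                return snapshots[hi]

            def is_regressed(snapshot_id):
                # Placeholder: download the pre-built toolchain for snapshot_id,
                # build and run the reproducer with it, compare against a baseline.
                raise NotImplementedError

            if __name__ == "__main__":
                builds = ["clang-build-%04d" % n for n in range(1000)]  # ~a thousand builds
                # print(first_bad(builds, is_regressed))  # only O(log n) fetch-and-test steps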
