This 250k milestone covers just the testing done for LinuxBenchmarking.com and not the benchmark results produced for Phoronix.com articles themselves over the past 11+ years with our other systems. This LinuxBenchmarking.com initiative started at the end of last year but really got underway this year when building the basement server room and getting more than 50 systems running for this daily Linux performance benchmarking -- 56 systems at present. As of writing this post, 250,112 benchmark results have been uploaded, but that's a number that increases every few minutes.
The capacity of this basement server room is now nearly maxed out with four 42U server racks being full (there are two more 4U systems in the front office I need to rack up in the remaining slots). There are now four 20 Amp circuits feeding all of these systems, and the power bills have become insane as a result, even with Phoromatic handling the powering on/off of systems (via Wake-on-LAN), etc., to make efficient use of them.
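Wake-on-LAN itself is a simple protocol: a "magic packet" of six 0xFF bytes followed by the target's MAC address repeated 16 times, broadcast over UDP. As a rough illustrative sketch (not Phoromatic's actual implementation), sending one from Python looks like this:

```python
import socket

def make_magic_packet(mac: str) -> bytes:
    """Build a WoL magic packet: 6 bytes of 0xFF followed by the
    target MAC address repeated 16 times (102 bytes total)."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet on the local network segment."""
    packet = make_magic_packet(mac)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, port))
```

The target system's BIOS/UEFI and NIC must have Wake-on-LAN enabled for this to actually power a machine on.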
Plus there are laptops, NUCs, and other small form factor devices all around... Any additional systems will likely be put in my main office area or elsewhere.
These systems run benchmarks daily in their respective groups for tracking the upstream development code of prominent open-source projects to look for performance regressions, improvements over time, etc. Since starting at the end of last year, more than 250k individual benchmarks have been completed -- with more than one thousand individual benchmarks being run every day, this 250k milestone would have been crossed many days ago had it not been for the Linux 4.2 kernel getting messed up on many of the systems.
All of this automated benchmarking is done by the Phoronix Test Suite with Phoromatic controlling the test scheduling, centralized management of systems, collection of results, etc. Phoronix Test Suite / Phoromatic is open-source and can be extended via commercial agreements and custom engineering services to make it fully tailored for running benchmarking labs within other organizations, etc.
A lot of interesting tests happen daily across these different test schedules. The Phoronix Test Suite and Phoromatic make everything easy except for the rare system hardware/software failure, etc.
Aside from performance changes, the data has uncovered odd kernel/graphics behavior and other functional regressions over time too. Unfortunately, since I do all of this myself and LinuxBenchmarking.com itself is not a source of revenue, there's only so much time in a day while juggling everything else.
One of the main PTS/Phoromatic items on my TODO list is improving the graph rendering and parsing speed -- particularly when viewing the results for the larger test groups over the course of the past 6+ months, there can sometimes be a stall due to the immense amount of data being rendered. I also have plans for some UI and graph visual improvements, among other items.
Stop by LinuxBenchmarking.com to see all of the latest data for the different daily performance trackers. If you're an organization looking to deploy the Phoronix Test Suite and/or Phoromatic for carrying out your Linux benchmarking, contact us for commercial support to discuss options.
If you're just an end-user or someone interested in the work being done for tracking Linux performance, finding regressions, etc, please consider making a PayPal tip, Bitcoin tip, Phoronix Premium subscription, or other forms of support to help with this costly effort -- in terms of hardware, the increasing electrical costs, making the desired improvements to the LinuxBenchmarking.com site, etc -- for providing more open-source benchmarks. Thanks for your support of the Phoronix mission over the past 11+ years of enriching the Linux hardware experience. Code contributions are also welcome, with our GPL benchmarking software being hosted in the Phoronix-Test-Suite GitHub repository.