Over at openbenchmarking.org/linux I have been experimenting with some improvements to OpenBenchmarking.org for the past few weeks. What's shown at the user-interface level is quite simple: it serves as just a directory of Linux hardware (right now the public version only shows motherboards and processors, but the graphics card category will be enabled next). It's similar to the OpenBenchmarking.org Performance Index in that it lists all detected hardware components of a certain type, discovering new hardware through the crowd-sourced benchmarking process. From there you can explore each product, find benchmark results, etc.
The openbenchmarking.org/linux hardware directory takes advantage of some new code-paths in determining hardware model numbers, product series, etc. Of course, it's all automated and self-learning. Once complete, it will be folded back into the main OpenBenchmarking.org area to provide greater search and hardware indexing capabilities.
There are some visible differences, though, when reaching a product page, such as for the ASUS Crosshair IV Formula motherboard. Among the new information exposed from my "experimental playground" that's not currently shown when conducting a similar motherboard search on the main OpenBenchmarking.org web-site: comparable motherboards (matched on the motherboard chipset and other shared PCI devices), a cleaner presentation of the Linux distributions where the given product has been tested, and some of the detected on-board devices.
Users can subsequently click on one of the detected integrated devices (e.g. a motherboard's USB controller), which will then reveal the Linux distributions where that specific ASIC has been used, other motherboards also using the given ASIC, and the detected Linux driver(s) for that particular component.
The lspci information and example dmesg outputs for the given controller/device are also available from this page. It's all automatically handled by OpenBenchmarking.org.
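To illustrate how on-board devices might be indexed from that crowd-sourced lspci data, here's a minimal sketch (my own Python, not the actual OpenBenchmarking.org code) that pulls the PCI class and vendor:device IDs out of `lspci -nn` output; the regex and field names are assumptions for the example:

```python
import re

# Matches one line of `lspci -nn` output, e.g.:
#   00:1a.0 USB controller [0c03]: Intel Corporation 82801I ... [8086:2937] (rev 02)
LSPCI_NN = re.compile(
    r'^(?P<slot>\S+)\s+'
    r'(?P<cls>.+?)\s+\[(?P<class_id>[0-9a-f]{4})\]:\s+'
    r'(?P<name>.+?)\s+\[(?P<vendor>[0-9a-f]{4}):(?P<device>[0-9a-f]{4})\]'
)

def parse_lspci_nn(output):
    """Return one dict per detected PCI device: slot, class, name, vendor/device ID."""
    devices = []
    for line in output.splitlines():
        m = LSPCI_NN.match(line.strip())
        if m:
            devices.append(m.groupdict())
    return devices
```

Keying each device on the `vendor:device` pair is what lets the same ASIC be recognized across many different motherboards.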
The CPU area also displays new information nicely. Comparable CPUs (in terms of performance, by compositing all of the Phoronix Test Suite results for each processor) are shown, and there's also more detailed CPU information: the CPU core/thread counts, L2 cache size, special instruction sets (e.g. SSE4, SSE5, AVX, XOP, FMA4), whether AES encryption is supported in hardware, if an Energy Performance Bias is supported, and the hardware virtualization method supported. Popular motherboards for the given CPU are also shown, as well as the Linux distributions the CPU has been tested with. The /proc/cpuinfo output is also available. Everything is generated automatically from the data captured by the Phoronix Test Suite when opting to submit the results and system hardware/software information to OpenBenchmarking.org. An example is the brand new AMD FX-8150 Bulldozer Eight-Core CPU.
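As a rough illustration of how those CPU capabilities can be derived from a submitted /proc/cpuinfo, here's a hypothetical parser; the particular flag selection and the returned field names are my own assumptions, not how OpenBenchmarking.org actually does it:

```python
def cpu_features(cpuinfo_text):
    """Extract capabilities from an x86 /proc/cpuinfo dump.

    Reads the `flags` line and reports AES-NI support, the hardware
    virtualization method (Intel VT-x = vmx, AMD-V = svm), and common
    SIMD instruction set extensions.
    """
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith('flags'):
            flags = set(line.split(':', 1)[1].split())
            break
    return {
        'aes': 'aes' in flags,
        'virtualization': 'vmx' if 'vmx' in flags
                          else ('svm' if 'svm' in flags else None),
        'simd': sorted(f for f in flags
                       if f in {'sse4_1', 'sse4_2', 'avx', 'xop', 'fma4'}),
    }
```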
This already offers more information than what's easily reachable from the main OpenBenchmarking.org search area right now, but it will all be integrated once the interfaces are settled. Besides keeping these new features in the /linux area while they are still in development, running them separately from the main site also helps me determine the external API requirements for OpenBenchmarking.org as I finish the public API for this trove of Linux hardware and software data.
Part of the next phase of the OpenBenchmarking.org conquest becomes even more interesting. With most people likely to visit such a Linux hardware directory when shopping for new Linux hardware or when experiencing a hardware functionality/performance issue, a new OpenBenchmarking.org component will be coming to provide greater assistance. When viewing a product (like the Intel Core i5 2500K), new information will also be pulled in on outstanding Linux bug-reports covering the hardware and/or the driver it uses. Important commits from the Linux kernel relevant to that product, with the corresponding version where each change was introduced, are also to be displayed.
This new capability will be turned on soon (as in perhaps this week; I'm just working on some more enhancements first), but for any graphics card, motherboard, CPU, PCI device, or hard drive (perhaps other components too) it will add in this extra information found about the product and/or the device driver it utilizes. The BugZilla databases of popular Linux distributions and upstream software projects will be polled automatically by OpenBenchmarking.org, depending upon the type of hardware component. OpenBenchmarking.org will automatically attempt to find any relevant open bug reports, without displaying every single one, such as issues that may be trivial or irrelevant to most users.
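For a sense of what such polling could look like, below is a sketch against Bugzilla's REST interface, assuming a modern instance that exposes the /rest endpoint; the severity cut-off, helper names, and the kernel.org base URL are illustrative, not the actual OpenBenchmarking.org logic:

```python
from urllib.parse import urlencode

# Assumed noise threshold: reports at these severities are not surfaced.
IGNORED_SEVERITIES = {'trivial', 'minor', 'enhancement'}

def bug_query_url(base_url, product_name):
    """Build a Bugzilla REST quicksearch URL for a hardware component."""
    params = urlencode({'quicksearch': product_name, 'status': 'OPEN'})
    return '%s/rest/bug?%s' % (base_url.rstrip('/'), params)

def relevant_bugs(bugs):
    """Drop trivial/irrelevant reports from a decoded list of bug dicts."""
    return [b for b in bugs
            if b.get('severity', '').lower() not in IGNORED_SEVERITIES]
```

The URL from `bug_query_url` would be fetched and JSON-decoded before its `bugs` list is passed through `relevant_bugs`.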
The Git repository of the Linux kernel is monitored as well, using new OpenBenchmarking.org infrastructure that attempts to identify commits that implement new functionality for the given hardware/software, correct serious bugs, or make other noteworthy changes. This allows the user to see Linux version requirements, i.e. what they would need for an ideal level of support. Conquering mailing lists is next, to offer functionality similar to this smart crawling of Git repositories and bug reports. Monitoring the Git repositories of Mesa and the DDX drivers is also on my road-map; that should be easy as well and is coming shortly.
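A minimal sketch of that kind of Git crawling, assuming a local kernel checkout and plain `git log --grep` plus `git describe --contains` for mapping a commit to the release that first shipped it (the helper names are hypothetical):

```python
import subprocess

def find_hardware_commits(repo_path, keyword):
    """Search a kernel checkout's history for commits mentioning a device."""
    out = subprocess.check_output(
        ['git', '-C', repo_path, 'log', '--oneline', '--grep', keyword],
        text=True)
    return parse_oneline_log(out)

def parse_oneline_log(text):
    """Split `git log --oneline` output into (sha, subject) pairs."""
    commits = []
    for line in text.splitlines():
        if line.strip():
            sha, _, subject = line.partition(' ')
            commits.append((sha, subject))
    return commits

def tag_from_describe(describe_output):
    """Reduce `git describe --contains` output (e.g. 'v3.2-rc1~14^2') to the tag."""
    return describe_output.strip().split('~')[0].split('^')[0]

def first_release_with(repo_path, sha):
    """Name the first tagged kernel release that contains a given commit."""
    out = subprocess.check_output(
        ['git', '-C', repo_path, 'describe', '--contains', sha], text=True)
    return tag_from_describe(out)
```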
Having this bug/commit information correlated to hardware components isn't solely for the user's benefit of being able to see rough version requirements or how Linux support has matured for a given driver and piece of hardware. OpenBenchmarking.org and the Phoronix Test Suite are already well integrated, designed to be extremely extensible, and have all interfaces abstracted. Pairing that architecture with existing features, such as automatically finding regressions within code-bases like the Linux kernel (the Phoronix Test Suite can be used to actively monitor for regressions, as is done by CodeWeavers/Wine) and even tracking changes in power consumption and other functional areas, opens up new opportunities. This new data from parsing bug reports and commit histories can likely be used to speed up the auto-bisecting process.
There are already hundreds of thousands of test runs and over one million component configurations recorded by OpenBenchmarking.org. By pairing this growing data-set with these other pools of information (Git, BugZilla, mailing lists), I'm also hoping OpenBenchmarking.org will be able to organically narrow down bugs based upon detected test result differences and how the hardware/software combinations vary with their matching issues. Since the Phoronix Test Suite / OpenBenchmarking.org captures the dmesg, xorg.conf, and other important system information, this should be quite feasible in the very near future. On the opposite side, I'm also working toward providing auto-recommended optimization strategies for users.
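As a toy illustration of that kind of correlation, the following sketch flags a component when the results recorded with it deviate noticeably from the overall mean for one test profile; the data shape and the ten-percent threshold are assumptions for the example, not anything OpenBenchmarking.org actually uses:

```python
from collections import defaultdict
from statistics import mean

def suspect_components(results, threshold=0.10):
    """Flag components whose presence correlates with a large result shift.

    `results` is a list of (component, score) pairs for a single test
    profile; a component is suspect when the mean of its scores deviates
    from the overall mean by more than `threshold` (fractional).
    """
    overall = mean(score for _, score in results)
    by_component = defaultdict(list)
    for component, score in results:
        by_component[component].append(score)
    return sorted(c for c, scores in by_component.items()
                  if abs(mean(scores) - overall) / overall > threshold)
```

In practice the interesting case is one varying component (say, a kernel version) standing out while everything else in the configuration is held constant.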
This also comes into play with another long-standing OpenBenchmarking.org action item of mine: crowd-sourced benchmarking (in a faintly similar manner to how Folding@home works) where your system's excess compute time could be donated to a particular project to run functional/performance tests in an automated manner to help uncover regressions or other anomalies.
With OpenBenchmarking.org becoming more aware of how Git repositories relate to certain hardware/software components, plus smart parsing of mailing lists, another item on my TODO list is support in the soon-to-be-finished Phoromatic-on-OpenBenchmarking.org implementation for automated benchmarking of new patch-sets as they hit a mailing list. Rather than just running Phoromatic on a timed basis or per code commit, Phoromatic could monitor any mailing list and then, via hooks set in the Phoromatic account, apply the patch(es) to a code-base and run a given set of test profiles/suites. Regressions could be caught sooner, before they even hit the tree and while they are still going through initial code review.
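A rough sketch of such a mailing-list hook: a heuristic patch detector plus a `git am` apply-test-rollback cycle. The function names and the generic test command are hypothetical; the real workflow would presumably invoke the Phoronix Test Suite's own test profiles/suites instead:

```python
import subprocess

def contains_patch(message_body):
    """Heuristically decide whether a list message carries a patch."""
    return any(line.startswith(('diff --git', '--- a/', 'Index: '))
               for line in message_body.splitlines())

def benchmark_patch(repo_path, patch_file, test_cmd):
    """Apply a patch with `git am`, run a test command, then roll back.

    Returns True when the test command exits successfully; the tree is
    reset to its prior state either way so the next patch starts clean.
    """
    subprocess.check_call(['git', '-C', repo_path, 'am', patch_file])
    try:
        return subprocess.call(test_cmd, cwd=repo_path) == 0
    finally:
        subprocess.check_call(
            ['git', '-C', repo_path, 'reset', '--hard', 'HEAD~1'])
```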
Well, that's just some of what's coming up in the near future. Due to these new features and other upcoming work, as I mentioned on Twitter, I'm not ruling out the possibility that OpenBenchmarking.org could be pulling in even more traffic than Phoronix.com in 2012. I'm also aspiring to make creating a PTS/OpenBenchmarking.org test profile to illustrate a bug (whether performance related or not) one of the most valuable contributions an end-user can make toward solving an issue. Phoronix Test Suite 4.0 will also be delivered in H2'2012 with more ground-breaking work, not just for Linux but for any operating system.
Feedback and other thoughts are always welcome. I will also be at the Ubuntu Developer Summit next week in Orlando. I'll also be around Germany and Austria next month for any meetings.