The Quest Of Finding Linux Compatible Hardware
-
Wow, that sounds really valuable, Michael. An empirically generated HCL (Hardware Compatibility List) for arbitrary distros and configurations.
Of course, the success of the system depends upon three things in particular that I can identify:
(1) People using the system extensively enough that all relevant hardware has been tested. If there aren't enough data points, then it just becomes another search engine that happily says "0 results found".
(2) How heavily can users rely upon the results provided by other users? Maybe there is a misconfiguration, custom software, or a special piece of hardware (e.g. vendor customizations on top of an IHV base chip, especially with laptop sound systems) that can skew the results. And of course any information that is expected to come from the user, rather than the operating system, is suspect: users are often wrong about what their hardware actually is, whether intentionally or by accident (see the sketch after this list).
(3) How much detail does the system go into about the things we care about? For example, a gamer will want to know whether graphics card X running driver Y can run game Z with reasonable performance and no rendering artifacts. Just because someone running graphics card X says "it works nice on Ubuntu" doesn't mean they were using the same driver, or that said driver will be able to run the game you care about. As a particular example, finding a graphics card / driver combo able to run games like Savage 2, Heroes of Newerth, or Unigine games would be an interesting exercise, in a hypothetical future world where some Mesa drivers can render these games correctly. Right now it's very cut-and-dried: use a Radeon HD 4000 series or later, or an Nvidia G80 or later, with the respective proprietary drivers. But in the future it may not be as obvious, if r600g starts to support the extensions these games require.
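To illustrate point (2): hardware identity should come from the OS, not from the user. A minimal sketch of that idea, assuming a Linux box with pciutils installed; the parsing here is illustrative, not any existing PTS code:

```python
import re
import subprocess

def detected_gpus():
    """Identify GPUs from the operating system (via lspci) instead of
    trusting what the user *thinks* their hardware is."""
    out = subprocess.check_output(["lspci", "-nn"], text=True)
    gpus = []
    for line in out.splitlines():
        # GPU lines look like:
        # "01:00.0 VGA compatible controller [0300]: ... [10de:06c0] (rev a3)"
        if "VGA compatible controller" in line or "3D controller" in line:
            match = re.search(r"\[([0-9a-f]{4}:[0-9a-f]{4})\]", line)
            pci_id = match.group(1) if match else "unknown"
            gpus.append((pci_id, line.split(": ", 1)[-1]))
    return gpus

if __name__ == "__main__":
    for pci_id, desc in detected_gpus():
        print(pci_id, desc)
```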
The value of such a database would be related to the depth of the data gathered, imho. Very general comments like "Works" (as in other HCLs) are almost completely useless in the case of complex hardware like GPUs, where you can say "Works" to some degree if the card can start X -- and that's true for every card supported by vesa. And sometimes performance isn't the most important issue, either: I don't care if I can get 500 fps in a "benchmark" of Unigine if all the effects aren't rendered and half the textures are black. Driver implementations can, and very often do, process API calls incorrectly without crashing or otherwise reporting an error, even if the resulting image is partially or completely messed up. Incorrect rendering is as damning as, if not more damning than, bad performance. Automatically detecting correct rendering (or, alternatively, relying upon the user to report rendering issues) should become a crucial input for reporting results to the database.
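And as a first stab at automating the rendering check itself, something as crude as a frame diff against a known-good reference would already catch the black-textures case. A sketch, assuming NumPy and Pillow are available; the file names are hypothetical frame dumps of the same scene:

```python
import numpy as np
from PIL import Image

def render_error(reference_path, captured_path, threshold=0.02):
    """Compare a captured frame against a known-good reference render.

    Returns (rmse, passed): root-mean-square error over normalised RGB
    values, and whether it stays under the threshold. Black textures or
    missing effects produce a large RMSE even when the frame rate is fine.
    """
    ref = np.asarray(Image.open(reference_path).convert("RGB"), dtype=np.float64) / 255.0
    cap = np.asarray(Image.open(captured_path).convert("RGB"), dtype=np.float64) / 255.0
    if ref.shape != cap.shape:
        return 1.0, False  # resolution mismatch: count it as a failed render
    rmse = float(np.sqrt(np.mean((ref - cap) ** 2)))
    return rmse, rmse < threshold

# e.g. rmse, ok = render_error("reference.png", "captured.png")
```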
-
Originally posted by allquixotic: (1) People using the system extensively enough that all relevant hardware has been tested. If there aren't enough data points, then it just becomes another search engine that happily says "0 results found".
I continue to be amazed at what I already find on the system... Very little hardware yields 0 results. Heck, I even ended up finding Sandy Bridge information dating back to mid-December!
Originally posted by allquixotic: As a particular example, finding a graphics card / driver combo able to run games like Savage 2, Heroes of Newerth, or Unigine games would be an interesting exercise.
Write test profiles for HoN and Savage and you can easily find out.
Originally posted by allquixotic: The value of such a database would be related to the depth of the data gathered, imho. Very general comments like "Works" (as in other HCLs) are almost completely useless in the case of complex hardware like GPUs, where you can say "Works" to some degree if the card can start X -- and that's true for every card supported by vesa.
The only manual user data is the save name, identifier, and description entered in the Phoronix Test Suite. No other manual data is asked of the user.
Originally posted by allquixotic: And sometimes performance isn't the most important issue, either: I don't care if I can get 500 fps in a "benchmark" of Unigine if all the effects aren't rendered and half the textures are black... Automatically detecting correct rendering (or, alternatively, relying upon the user to report rendering issues) should become a crucial input for reporting results to the database.
The Phoronix Test Suite already supports this: http://www.phoronix.com/vr.php?view=14380 and there will only be more qualitative tests going forward.
-
Originally posted by Nobu: Well, sometimes you have to be careful about what network adapter or printer/scanner you get, but I guess that's not the kind of hardware Michael was referring to.
The rule for printers seems to be: buy HP, Epson, or any printer that properly supports HP PCL or PostScript, just like "picking NVidia" for graphics cards. But you've got to do a bit of research before buying to avoid the lemons in that space.
Scanners... now that's still a minefield (and this includes support on multifunction devices). Most of the Epson MF devices seem to be supported "okay", as with HP's (though there ARE devices from both brands where support hasn't happened...). Canon's stuff is hit or miss (part of the CanoScan LiDE line is WELL supported, part of it is in the expensive paperweight/doorstop category... same goes for their printers... sigh...)
-
Originally posted by RealNC: The quest for Linux-compatible hardware is actually quite simple:
* Get any sound card except X-Fi.
* Get an NVidia graphics card.
That's all there is to it
Compatible with proprietary drivers, though? You may be right in some ways, though in other ways that isn't true. For instance, even with fglrx I can still use xrandr and the open-source GUIs which utilize it, while AFAIK you still cannot with nVidia's proprietary driver. If you don't care about standards as I do, so be it, but standards and openness give you more options and empower you and the community, so those kinds of things are better to support.
-
Originally posted by Michael: I continue to be amazed at what I already find on the system... Very little hardware yields 0 results. Heck, I even ended up finding Sandy Bridge information dating back to mid-December!
Originally posted by Michael: Write test profiles for HoN and Savage and you can easily find out.
Originally posted by Michael: The only manual user data is the save name, identifier, and description entered in the Phoronix Test Suite. No other manual data is asked of the user.
I'm Joe User, and I'm a Linux gamer. To figure out whether I can play the hottest new games coming out on Linux, I just have to go to your website and do a search for video card / graphics driver combos that support my game. The definition of "support" will vary between users, so maybe there should be different test profiles with different in-game settings. The complexities abound:
- Maybe the games I'm interested in only work with the binary driver? I'd like to know that up-front.
- Maybe a certain game only works on minimum detail settings, or using an optional renderer written to an older OpenGL spec? I'd still like to see the graphics card / driver pair that produced a running game included in the results.
- Rarely, the minimum detail settings will employ techniques that cause rendering issues or crashes, but the higher detail settings use newer paths (e.g. GLSL) that work correctly! It'd be great to know this, too.
We need this level of detail, because the OpenGL specification does an extremely poor job of creating a contract between the device driver implementation and the application. Bugs aside, there's still the matter of (mis)interpretation of the spec, and you see issues crop up all the time, even with the binary drivers on Windows and an army of fastidious testers. We need to know exactly which games work under which drivers against which cards. Having the app developer specify "system requirements" is grossly inadequate on Linux; it's bad enough as it is on Windows. Throw in the spotty, fickle support of OpenGL in Mesa, for those of us who (god forbid) value freedom, and you've got a lot of data to mine.
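To make that concrete, here's a rough sketch of the kind of record and query such a database could expose -- every name in it is hypothetical, not an existing OpenBenchmarking.org schema:

```python
from dataclasses import dataclass

@dataclass
class GameResult:
    game: str                # e.g. "Heroes of Newerth"
    gpu: str                 # e.g. "Radeon HD 4850"
    driver: str              # e.g. "fglrx 10.12" or "r600g (Mesa 7.10)"
    settings: str            # e.g. "minimum detail" or "high detail (GLSL path)"
    avg_fps: float
    renders_correctly: bool  # from automated or user-reported checks

def playable(results, game, min_fps=30.0):
    """Every GPU/driver/settings combo that runs `game` acceptably:
    fast enough *and* rendering correctly, per the points above."""
    return [r for r in results
            if r.game == game and r.avg_fps >= min_fps and r.renders_correctly]
```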
Originally posted by Michael: The Phoronix Test Suite already supports this: http://www.phoronix.com/vr.php?view=14380 and there will only be more qualitative tests going forward.
So, what's your strategy for rolling this all out? Here's what I'd do:
- Test Development: Push open-source contributors, gamer enthusiasts, application developers and hardware vendors to develop real-world application and game tests. Writing tests benefits everyone. It benefits users because it provides the foundation for fleshing out the data set. It benefits application / game developers because it makes users aware of their software, while also letting potential customers see how their software runs (or doesn't run) on relevant hardware. It benefits hardware vendors because they can show off the hottest hardware or recent driver improvements with real-world tests that users care about.
- Test Run Submission: This is the bread and butter of the project; it has to be. Having good tests is not useful unless they are run on a very broad range of hardware and software configurations. This not only exposes potential issues with the applications tested, but may also expose driver issues or limitations of certain hardware. Test runs should be done primarily by enthusiasts and end-users. For end-users, it needs to be easy for them to volunteer to execute a large number of tests in an automated fashion. Maybe you should develop a way for someone to just turn on PTS at night when they go to sleep, and have it download tests to execute from a centralised work queue, managed by Phoronix personnel, which is basically a list of the most desired test runs to be done (see the sketch after this list)? Think of what BOINC does with distributed computing. Apply this simplicity to distributed testing.
- Database Harvesting: This is where end-users and companies use your search forms to gather useful product evaluation data based on the collected results. This component will naturally accrue a user-base to the extent that the data returned is useful. So you don't really need to do anything to advertise or encourage people to use this: it'll be like Google, with people using it all the time once they realize how good it is. But you have to go through the first two points to get to that level.
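Here's the sketch promised above: a hypothetical overnight client that polls a central work queue and hands each job to the Phoronix Test Suite. The queue URL and JSON format are made up; the PTS invocation assumes batch mode was already configured via batch-setup:

```python
import json
import subprocess
import urllib.request

# Hypothetical central queue of the most-desired test runs, maintained
# by Phoronix personnel; the URL and response format are made up here.
QUEUE_URL = "https://example.org/pts-work-queue.json"

def fetch_work_queue():
    """Download the prioritised list of test profile names (assumed to
    be a plain JSON array like ["pts/unigine-heaven", "pts/nexuiz"])."""
    with urllib.request.urlopen(QUEUE_URL) as resp:
        return json.load(resp)

def run_overnight(max_tests=5):
    """Run the top queued tests unattended, delegating the actual
    benchmarking (and result upload) to the Phoronix Test Suite."""
    for test in fetch_work_queue()[:max_tests]:
        subprocess.run(["phoronix-test-suite", "batch-benchmark", test])

if __name__ == "__main__":
    run_overnight()
```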
Hope I gave you some ideas...
-
Originally posted by nobody: don't leave out coreboot support
Of course, I hate motherboard makers already for their destruction of CPU socket standards, so I don't have very high hopes for BIOS standards.
-
Originally posted by allquixotic:
- Maybe the games I'm interested in only work with the binary driver? I'd like to know that up-front.
- Maybe a certain game only works on minimum detail settings, or using an optional renderer written to an older OpenGL spec? I'd still like to see the graphics card / driver pair that produced a running game included in the results.
- Rarely, the minimum detail settings will employ techniques that cause rendering issues or crashes, but the higher detail settings use newer paths (e.g. GLSL) that work correctly! It'd be great to know this, too.
You can basically figure that out by auto-parsing the test results: the levels of performance for each driver, which drivers are most commonly in use, etc. There will also eventually be a RESTful external API [one of many things on my ever-growing TODO list] for external analysis of said data as well.
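As a rough sketch of what external analysis against such an API might look like once it exists (the endpoint and JSON shape below are purely hypothetical, since the API hasn't been published):

```python
import json
import urllib.request

# Hypothetical endpoint and schema; the real RESTful API is still on
# the TODO list, so nothing here reflects a published interface.
API = "https://openbenchmarking.org/api/results?test=pts/unigine-heaven"

def most_common_drivers():
    """Tally which drivers appear most often in the returned results --
    the sort of auto-parsing described above."""
    with urllib.request.urlopen(API) as resp:
        results = json.load(resp)
    counts = {}
    for r in results:
        counts[r["driver"]] = counts.get(r["driver"], 0) + 1
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
```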
Originally posted by allquixotic: Throw in the spotty, fickle support of OpenGL in Mesa, for those of us who (god forbid) value freedom, and you've got a lot of data to mine.
Originally posted by allquixotic: Test Development: Push open-source contributors, gamer enthusiasts, application developers and hardware vendors to develop real-world application and game tests. Writing tests benefits everyone. It benefits users because it provides the foundation for fleshing out the data set. It benefits application / game developers because it makes users aware of their software, while also letting potential customers see how their software runs (or doesn't run) on relevant hardware. It benefits hardware vendors because they can show off the hottest hardware or recent driver improvements with real-world tests that users care about.
Originally posted by allquixotic: Test Run Submission: This is the bread and butter of the project; it has to be. Having good tests is not useful unless they are run on a very broad range of hardware and software configurations. This not only exposes potential issues with the applications tested, but may also expose driver issues or limitations of certain hardware. Test runs should be done primarily by enthusiasts and end-users. For end-users, it needs to be easy for them to volunteer to execute a large number of tests in an automated fashion.
Originally posted by allquixotic: Maybe you should develop a way for someone to just turn on PTS at night when they go to sleep, and have it download tests to execute from a centralised work queue, managed by Phoronix personnel, which is basically a list of the most desired test runs to be done? Think of what BOINC does with distributed computing. Apply this simplicity to distributed testing.
Hits to test profile pages, test results, most common test results, the number of times a test/suite is cloned, etc. are all tracked. From that information it can auto-determine what's popular. Nothing on OpenBenchmarking.org requires manual maintenance; even the list of distributions to select from in these search areas is auto-generated based upon the most popular distributions it sees in use over recent days/weeks.
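A sketch of that sort of auto-generation, assuming a hypothetical feed of (timestamp, distribution) pairs harvested from uploaded results; none of this is the actual OpenBenchmarking.org implementation:

```python
from collections import Counter
from datetime import datetime, timedelta

def popular_distributions(submissions, days=14, top_n=10):
    """Build the distribution dropdown from what was actually seen lately.

    `submissions` is a hypothetical iterable of (timestamp, distro) pairs
    harvested from uploaded results, e.g. (datetime(2011, 1, 20), "Ubuntu 10.10").
    No manual curation: whatever shows up most in the window wins.
    """
    cutoff = datetime.now() - timedelta(days=days)
    recent = (distro for when, distro in submissions if when >= cutoff)
    return [distro for distro, _ in Counter(recent).most_common(top_n)]
```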