The Quest Of Finding Linux Compatible Hardware


  • #16
    Originally posted by Nobu View Post
    Well, sometimes you have to be careful about what network adapter or printer/scanner you get, but I guess that's not the kind of hardware Michael was referring to.
    I'd say you should be less concerned with network adapters these days and more with which printers and scanners you get. Especially the scanners.

    The rule for printers seems to be HP, Epson, or any printer properly supporting HP PCL or PostScript, just like "picking NVidia" for graphics cards. But you've got to do a bit of research before buying to avoid the lemons in that space.

    Scanners... now that's still a minefield (and this includes support on multifunction devices). Most of the Epson MF devices seem to be supported "okay", as are HP's (though there ARE devices from both brands that support never materialized for). Canon's stuff is hit or miss: part of the CanoScan LiDE line is WELL supported, part of it is in the expensive-paperweight/doorstop category. Same goes for their printers... sigh...

    Comment


    • #17
      Originally posted by RealNC View Post
      The quest for Linux-compatible hardware is actually quite simple:

      * Get any sound card except X-Fi.
      * Get an NVidia graphics card.

      That's all there is to it
      Compatible with open source drivers? No; the OSS drivers for AMD cards are much more compatible, i.e. many more features are functional than with nVidia cards. That means with Linux bundles/distros shipping the open source drivers, nVidia will not be more compatible out of the box.
      Compatible with proprietary drivers, though? You may be right in some ways, though in others it isn't true. For instance, even with fglrx I can still use xrandr and the open source GUIs that utilize it, while AFAIK you still cannot with nVidia's proprietary driver. If you don't care about standards as I do, so be it, but standards and openness give you more options and empower you and the community, so those kinds of things are better to support.

      Comment


      • #18
        Originally posted by Michael View Post
        I continue to be amazed at what I already find on the system... Very few pieces of hardware yield 0 results. Heck, I even ended up finding Sandy Bridge information dating back to mid-December!
        Then hopefully the existing result set is enough to help kickstart future adoption.
        Originally posted by Michael View Post
        Write test profiles for HoN and Savage and you can easily find out.
        This sounds like fun! I'll see what I can't scrounge up with my modder friends in the S2 community...

        Originally posted by Michael View Post
        The only manual data the user inputs from the Phoronix Test Suite is the save name, identifier, and description. No other manual data is asked of the user.
        With that particular quote, I was referring more to the searchable fields in the database, which of course, have to correspond to results collected from tests. I'm not sure if application compatibility is going to be a priority, but I think it should be. Here's a simple example.

        I'm Joe User, and I'm a Linux gamer. To figure out whether I can play the hottest new games coming out on Linux, I just have to go to your website, and do a search for video card / graphics driver combos that support my game. The definition of "support" will vary between users, so maybe there should be different test profiles with different in-game settings. The complexities abound:
        • Maybe the games I'm interested in only work with the binary driver? I'd like to know that up-front.
        • Maybe a certain game only works on minimum detail settings, or using an optional renderer written to an older OpenGL spec? I'd still like to see the graphics card / driver pair that produced a running game included in the results.
        • Rarely, the minimum detail settings will employ techniques that cause rendering issues or crashes, but the higher detail settings use newer paths (e.g. GLSL) that work correctly! It'd be great to know this, too.

        We need this level of detail, because the OpenGL specification does an extremely poor job of creating a contract between the device driver implementation and the application. Bugs aside, there's still the matter of (mis)interpretation of the spec, and you see issues crop up all the time, even with the binary drivers on Windows and an army of fastidious testers. We need to know exactly which games work under which drivers against which cards. Having the app developer specify "system requirements" is grossly inadequate on Linux; it's bad enough as it is on Windows. Throw in the spotty, fickle support of OpenGL in Mesa, for those of us who (god forbid) value freedom, and you've got a lot of data to mine.
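        The per-game, per-driver lookup described above can be sketched as a toy query over compatibility records. This is an illustrative data model I made up, not the actual OpenBenchmarking.org schema; the games, cards, and driver names are placeholders:

```python
# Illustrative sketch only: field names and records are invented, not the
# real OpenBenchmarking.org schema.
from dataclasses import dataclass

@dataclass
class CompatResult:
    game: str
    gpu: str
    driver: str      # e.g. "nvidia-binary" vs. "mesa-r600"
    settings: str    # detail preset the game was run at
    ran_ok: bool     # did the game actually render correctly?

def supported_configs(results, game):
    """Return (gpu, driver, settings) combos where the game actually ran."""
    return [(r.gpu, r.driver, r.settings)
            for r in results if r.game == game and r.ran_ok]

results = [
    CompatResult("HoN", "GeForce GTX 460", "nvidia-binary", "high", True),
    CompatResult("HoN", "Radeon HD 5770", "mesa-r600", "low", True),
    CompatResult("HoN", "Radeon HD 5770", "mesa-r600", "high", False),
]
print(supported_configs(results, "HoN"))
```

        The point of keeping the settings preset in each record is exactly the bullets above: the same card/driver pair can pass at "low" and fail at "high", and a search should surface both facts.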

        Originally posted by Michael View Post
        The Phoronix Test Suite already supports this: http://www.phoronix.com/vr.php?view=14380 and there will only continue to be more qualitative tests going forward.
        Sounds like you need help writing meaningful tests! OpenBenchmarking / PTS might provide a good platform for creating good tests, but without a lot of actually useful tests, followed by a lot of users running said tests to collect data, search results will be spotty / outdated.

        So, what's your strategy for rolling this all out? Here's what I'd do:
        • Test Development: Push open-source contributors, gamer enthusiasts, application developers and hardware vendors to develop real-world application and game tests. Writing tests benefits everyone. It benefits users because it provides the foundation for fleshing out the data set. It benefits application / game developers because it makes users aware of their software, while also letting potential customers see how their software runs (or doesn't run) on relevant hardware. It benefits hardware vendors because they can show off the hottest hardware or recent driver improvements with real-world tests that users care about.
        • Test Run Submission: This is the bread-and-butter of the project; it has to be. Having good tests is not useful unless they are run on a very broad range of hardware and software configurations. This not only exposes potential issues with the applications tested, but also may expose driver issues or limitations of certain hardware. Test runs should be done primarily by enthusiasts and end-users. For end-users, it needs to be easy for them to volunteer to execute a large number of tests in an automated fashion. Maybe you should develop a way for someone to just turn on PTS at night when they go to sleep, and it will download tests to execute from a centralised work queue, managed by Phoronix personnel, which is basically a list of the most desired test runs to be done? Think of what BOINC does with distributed computing. Apply this simplicity to distributed testing.
        • Database Harvesting: This is where end-users and companies use your search forms to gather useful product evaluation data based on the collected results. This component will naturally accrue a user-base to the extent that the data returned is useful. So you don't really need to do anything to advertise or encourage people to use this: it'll be like Google, with people using it all the time once they realize how good it is. But you have to go through the first two points to get to that level.
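        The BOINC-style "night shift" client proposed above can be sketched roughly as follows. The test identifiers are invented and run_test() is a stand-in, since the real client would shell out to the Phoronix Test Suite and talk to a Phoronix-managed queue:

```python
# Hypothetical sketch of a BOINC-like distributed-testing client; the test
# identifiers are invented and run_test() is a stand-in for invoking the
# Phoronix Test Suite and uploading the result.
from collections import deque

def run_test(test_id):
    # Placeholder for something like: phoronix-test-suite benchmark <test_id>
    return "result-for-" + test_id

def drain_queue(queue, max_runs):
    """Pull up to max_runs jobs, highest-priority first, e.g. overnight."""
    completed = []
    while queue and len(completed) < max_runs:
        completed.append(run_test(queue.popleft()))
    return completed

# The server would keep this queue ordered by the most-desired test runs.
queue = deque(["pts/nexuiz", "pts/openarena", "pts/unigine-heaven"])
print(drain_queue(queue, max_runs=2))
```

        The design point is that the user only opts in and picks a stopping condition; which tests actually run is decided centrally, just as BOINC decides which work units a volunteer crunches.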

        Hope I gave you some ideas...

        Comment


        • #19
          Originally posted by nobody View Post
          don't leave out coreboot support
          Agreed, proprietary BIOSes really need overthrowing. Maybe motherboard makers will start adopting standards for BIOSes as well, so that it's easier to put Coreboot on them, assuming no such standards meaningfully exist today.

          Of course, I hate motherboard makers already for their destruction of CPU socket standards, so I don't have very high hopes for BIOS standards.

          Comment


          • #20
            Originally posted by allquixotic View Post

            • Maybe the games I'm interested in only work with the binary driver? I'd like to know that up-front.
            • Maybe a certain game only works on minimum detail settings, or using an optional renderer written to an older OpenGL spec? I'd still like to see the graphics card / driver pair that produced a running game included in the results.
            • Rarely, the minimum detail settings will employ techniques that cause rendering issues or crashes, but the higher detail settings use newer paths (e.g. GLSL) that work correctly! It'd be great to know this, too.
            You can basically figure that out by auto-parsing the test results and the levels of performance for each driver, what the most common drivers in use are, etc. There will also eventually be a RESTful external API [one of many things on my ever-growing TODO list] for external analysis of said data.

            Originally posted by allquixotic View Post
            Throw in the spotty, fickle support of OpenGL in Mesa, for those of who (god forbid) value freedom, and you've got a lot of data to mine.
            Even prior to its launch, OpenBenchmarking.org already has around 300MB of such information to play with.

            Originally posted by allquixotic View Post
            Test Development: Push open-source contributors, gamer enthusiasts, application developers and hardware vendors to develop real-world application and game tests. Writing tests benefits everyone. It benefits users because it provides the foundation for fleshing out the data set. It benefits application / game developers because it makes users aware of their software, while also letting potential customers see how their software runs (or doesn't run) on relevant hardware. It benefits hardware vendors because they can show off the hottest hardware or recent driver improvements with real-world tests that users care about.
            Yep, and with OpenBenchmarking.org any registered user can upload their own tests and suites (or build one from the web interface) and have them available to any PTS 3.0+ user via the package-management-like system for how everything is handled. It's no longer a matter of getting a test profile pushed to me and then included in the next release; they're instantly available as long as they don't break the test profile spec in a particular version of PTS.


            Originally posted by allquixotic View Post
            Test Run Submission: This is the bread-and-butter of the project; it has to be. Having good tests is not useful unless they are run on a very broad range of hardware and software configurations. This not only exposes potential issues with the applications tested, but also may expose driver issues or limitations of certain hardware. Test runs should be done primarily by enthusiasts and end-users. For end-users, it needs to be easy for them to volunteer to execute a large number of tests in an automated fashion
            Judging by http://global.phoronix.com/ in recent months, it should be no problem getting users to push their results when upgrading to PTS3.

            Originally posted by allquixotic View Post
            Maybe you should develop a way for someone to just turn on PTS at night when they go to sleep, and it will download tests to execute from a centralised work queue, managed by Phoronix personnel, which is basically a list of the most desired test runs to be done? Think of what BOINC does with distributed computing. Apply this simplicity to distributed testing.
            I've already developed a way to do this quite a while ago as part of Phoromatic. The next-generation Phoromatic is also running atop OpenBenchmarking.org and is compliant with its API.

            Hits to test profile pages, test results, most common test results, the number of times a test/suite is cloned, etc., are all tracked. From that information it can auto-determine what's popular. Nothing on OpenBenchmarking.org requires manual maintenance; even the list of distributions to select from in these search areas is auto-generated based upon the most popular distributions it sees in use over the course of recent days/weeks.
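            As an illustration only (not Michael's actual algorithm), a popularity ranking over the tracked events he lists might look like this; the weights and counts are arbitrary assumptions:

```python
# Illustrative only: the weights are arbitrary assumptions, not the actual
# OpenBenchmarking.org popularity algorithm.
def popularity(events):
    """events maps test name -> (page hits, results uploaded, clones)."""
    weights = (1, 5, 10)  # assume a clone signals more intent than a hit
    scores = {name: sum(w * n for w, n in zip(weights, counts))
              for name, counts in events.items()}
    return sorted(scores, key=scores.get, reverse=True)

events = {
    "pts/openarena": (500, 40, 3),      # score 500 + 200 + 30 = 730
    "pts/compress-gzip": (200, 90, 1),  # score 200 + 450 + 10 = 660
}
print(popularity(events))
```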
            Michael Larabel
            http://www.michaellarabel.com/

            Comment


            • #21
              Here's another exclusive preview

              Basically showing off an example of what happens when you're looking up a specific product on OpenBenchmarking.org... Current prototype; pardon some of the reviews and global matches listed not being a perfect match, as I am still tuning the search algorithms.

              Michael Larabel
              http://www.michaellarabel.com/

              Comment


              • #22
                P.S. Just how easy it is to benchmark off of OpenBenchmarking.org will be revealed in a video, probably next week... With Phoronix Global it's already as easy as running something like phoronix-test-suite benchmark 123213213213, but with OpenBenchmarking.org it's made even easier.
                Michael Larabel
                http://www.michaellarabel.com/

                Comment


                • #23
                  Originally posted by RealNC View Post
                  The quest for Linux-compatible hardware is actually quite simple:

                  * Get any sound card except X-Fi.
                  * Get an NVidia graphics card.

                  That's all there is to it
                  Actually, I won't use any audio card if it does not have hardware mixing in Linux. So if I want to stay in the cost-effective space, I need to use Creative almost exclusively.

                  I have gone from an Audigy 2 ZS to the X-Fi, as the sound quality is tons better. I like being able to play back sound from multiple apps at the same time, and hardware mixing is currently the only way to do that unless you want to use a CPU-hogging PulseAudio setup.

                  Of course if you require 3d with performance there is no choice but nvidia imo.

                  Comment


                  • #24
                    Originally posted by MNKyDeth View Post
                    I have gone from an Audigy 2 ZS to the X-Fi, as the sound quality is tons better. I like being able to play back sound from multiple apps at the same time, and hardware mixing is currently the only way to do that unless you want to use a CPU-hogging PulseAudio setup.
                    Nah, works just fine here without PulseAudio. And on everyone else's system too. I used to require hardware mixing back when I had a 486 at 33 MHz, and even *then* it wasn't *that* big of a deal. Nowadays, it's totally irrelevant. Creative still tries to convince people it's important, though, even though benchmarks clearly show that software mixing uses sub-1% CPU utilization.

                    Just a relic from the past.

                    Comment


                    • #25
                      Originally posted by RealNC View Post
                      The quest for Linux-compatible hardware is actually quite simple:

                      * Get any sound card except X-Fi.
                      * Get an NVidia graphics card.

                      That's all there is to it
                      I call BS: my graphics server at work is using an nVidia Quadro NVS 420, Mandriva 2010.0 PowerPack, and a Dell 3008WFP monitor running at 2560x1600 physical / 3200x2400 virtual.

                      It REFUSES to respect the virtual screen size, and I have to jump through hoops to make any of the window managers usable because of that failure. I have not had that trouble with any previous graphics card, counting nine different machines, four different video-card vendors, and many releases of RedHat, Krud, Fedora, CentOS, SuSE, and Mandriva over the last fifteen years.

                      And don't tell me not to use virtual: for the kind of work I do (very high-res GIS), I need all the screen area I can get.

                      Comment


                      • #26
                        This will be very interesting

                        For example, I would like to find out which 1Gbit ethernet card has the lowest CPU usage in network-heavy tests. And then the same divided by card price (or would that be better multiplied?).
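                        On the divide-or-multiply question: CPU usage and price are both lower-is-better numbers, so multiplying them keeps the combined score lower-is-better. A toy sketch with made-up cards and figures:

```python
# Made-up cards and numbers, purely to show the ranking arithmetic.
cards = [
    # (name, % CPU during a network-heavy test, price in USD)
    ("Card A", 12.0, 25.0),
    ("Card B", 8.0, 60.0),
    ("Card C", 15.0, 15.0),
]

def rank_by_value(cards):
    """Sort by cpu_usage * price; both are lower-is-better, so the
    product is too: cheap cards that also offload well come first."""
    return sorted(cards, key=lambda c: c[1] * c[2])

for name, cpu, price in rank_by_value(cards):
    print(name, cpu * price)
```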

                        Comment


                        • #27
                          You rock!

                          There's only one thing I can think of to properly describe what you're talking about:
                          FUCKING AWESOME!

                          Comment


                          • #28
                            Originally posted by Yfrwlf View Post
                            If you don't care about standards as I do so be it, but standards and openness give you more options and empower you and the community, so those kinds of things are better to support.
                            Oh, man! I've just been talking about this with one of my clients. Took the words right out of my mouth. Open standards are the guarantee of freedom. Right now, that freedom comes at a cost not everyone is willing to pay, but they're definitely the best way to go in the long run.

                            Comment


                            • #29
                              A little exaggerated

                               I've been a sys admin using Linux for over 13 years, and I've VERY seldom encountered hardware that was not supported, and I've dealt with countless name-brand workstation and server machines. If you are building your own systems, as I think this article implies, then yes, you will come across problems. But most users are probably purchasing systems from vendors like HP, Acer, Dell, etc. In that case, you have a very good chance of everything just working, since there are a LOT of users out there testing the same systems. There are many more variables when you are building your system from scratch; not every single device can be tested.

                              In regards to graphics cards, stick with nVidia which has excellent support. There is both a proprietary and open source driver available. Stay away from ATI which has a long history of issues with Linux.

                              Comment


                              • #30
                                Originally posted by apexwm View Post
                                In regards to graphics cards, stick with nVidia which has excellent support.
                                And your remarks are why we need such a database of support. Not all NVidia parts have "excellent support". If you buy a laptop with Optimus tech in it, you'll get nothing but the Intel GPU supported on that machine... ever... and that's with NVidia. It's about the same story as the X300 Mobile parts that ATI fielded. Big fat joke for Linux.

                                There is both a proprietary and open source driver available.
                                Whoo... That's a completely accurate statement, and abjectly worthless to most people looking for answers for Linux graphics support.

                                1) Neither driver supports Optimus technology GPUs.
                                2) The FOSS driver doesn't support everything and is still very much in its infancy.
                                3) While the closed driver works well in many cases, it can still give you fits when it does things an application doesn't expect from it, and it's had issues from time to time.

                                Stay away from ATI which has a long history of issues with Linux.
                                That's a mixed bag, really. Back when Doom3 came out, it was a bit of a push. For some configurations, it worked well. I know; I had a high-end ATI card with a dual-screen setup for work when I was doing stock market software. There was some variability in things over release versions, and each person has their pet peeve about the driver (streaming video being one of the more consistent ones for most people...).

                                It should be noted that the FOSS drivers for AMD are actually in better shape and perform well with more stuff than the NVidia ones. And, moreover, ATI gave out info freely to the Linux community as far back as the Rage PRO. I know; I have some of that info from when I was doing Utah-GLX work. Same with the Rage128. It was when they went to the Radeon that the lawyers and that sort got in the way of things and cut us off for that long period before we got info and help again from them.

                                So... your info's not quite as good as you think it is.

                                Comment
