Linux Can Deliver A Faster Gaming Experience Than Mac OS X

  • Linux Can Deliver A Faster Gaming Experience Than Mac OS X

    Phoronix: Linux Can Deliver A Faster Gaming Experience Than Mac OS X

    Earlier this week on Phoronix were new benchmarks of Ubuntu Linux vs. Mac OS X using a new Apple MacBook Pro with an Intel Core i5 CPU and an NVIDIA GeForce GT 330M graphics processor. Looking at the test results overall, it ended up being a competitive race between these two Microsoft Windows competitors. In some areas, like OpenCL computational performance, Apple's operating system commanded a sizable lead. In other areas, like OpenGL graphics performance, Ubuntu Linux backed by NVIDIA's official but proprietary driver was in control. Here's an additional set of tests showing the measurable leads of NVIDIA on Linux over Mac OS X with Apple's NVIDIA driver.

    http://www.phoronix.com/vr.php?view=15543

  • #2
    It would also be interesting to see Wine comparisons between the systems; after all, both of them rely on it a whole lot for gaming.



    • #3
      I wonder if NVIDIA actively helps with development of Apple's drivers, or if they are just given documentation and Apple does the rest.



      • #4
        It's because Apple only cares about gay men, not gaymen.



        • #5
          Originally posted by damg View Post
          I wonder if NVIDIA actively helps with development of Apple's drivers, or if they are just given documentation and Apple does the rest.
          NVIDIA develops their own drivers, but Apple controls the graphics stack. It's like Direct3D on Windows, where Microsoft provides the stack and IHVs implement a specific interface to communicate with the actual hardware.
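          The division of labour described here can be sketched abstractly. All names below are hypothetical illustrations, not Apple's or Microsoft's actual interfaces:

```python
from abc import ABC, abstractmethod

# The OS vendor owns the graphics stack and defines the driver interface...
class DisplayDriver(ABC):
    @abstractmethod
    def submit_commands(self, buf: bytes) -> None:
        """Push a finished command buffer to the hardware."""

# ...while the IHV only implements that interface for its own chips.
class NvidiaDriver(DisplayDriver):
    def submit_commands(self, buf: bytes) -> None:
        print(f"pushing {len(buf)} bytes to the GPU")

def os_graphics_stack(driver: DisplayDriver) -> None:
    # Scheduling, state tracking, and the public API stay OS-controlled;
    # only the final hardware submission goes through the IHV's code.
    driver.submit_commands(b"\x00" * 64)

os_graphics_stack(NvidiaDriver())  # prints "pushing 64 bytes to the GPU"
```

          The point of the split is that the OS can change or optimize the stack above the interface without the IHV's involvement, which is why the same GPU can perform differently under Apple's stack and NVIDIA's own Linux stack.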



          • #6
            Maybe the Mac's graphics stack is slower, but at least they have native Steam :'((



            • #7
              While Ubuntu clearly outperforms OS X on all these benchmarks, it's also worth pointing out that in most of them the better results probably won't be visible, because the frame rates are far higher than the monitor's refresh rate. The most demanding game seems to be Nexuiz, where OS X doesn't reach the 30fps threshold at any resolution. All the other games stay above 60fps, sometimes by such a huge margin it's almost a joke. A lot of LCD monitors are locked at a 60Hz refresh rate.
              That's for average fps, but something that can hurt perceived performance is the minimum fps: if a game frequently dips into sub-30fps unplayable territory under load, it can be more frustrating than a constant 45-50fps experience. Even the lowest of the low (non-Nexuiz) fps here is ~70, so with these games it'd be hard to argue that Ubuntu gives a noticeably smoother experience. Actually, that makes me curious about the quality/AA settings used in these benchmarks. Automatically maxed out?
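              The average-vs-minimum distinction is easy to show numerically. The frame samples below are hypothetical, not from the article's benchmarks:

```python
# Two hypothetical runs: one steady, one with a higher average but deep dips.
# Values are instantaneous per-frame fps samples (made-up data).
steady = [48, 50, 52, 49, 51, 50]   # constant ~50fps, never unplayable
spiky  = [95, 90, 20, 95, 25, 95]   # higher average, but dips below 30fps

def avg_fps(samples):
    """Mean of the per-frame fps samples."""
    return sum(samples) / len(samples)

print(avg_fps(steady))          # 50.0
print(avg_fps(spiky))           # 70.0 -- "faster" on a bar chart
print(min(steady), min(spiky))  # 48 vs 20 -- the spiky run feels worse
```

              On an average-only graph the spiky run wins 70 to 50, yet it is the one that stutters into unplayable territory.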



              • #8
                Originally posted by Wheels View Post
                Actually, that makes me curious about the quality/AA settings used in these benchmarks. Automatically maxed out?
                I don't think any of these tests make use of AA by default.

                BTW, the minimum fps you see in the graphs isn't the absolute minimum framerate the game hit, but rather the minimum average framerate across all the test resolutions (since the graphs only show averages). The current PTS doesn't record minimum and maximum framerates for each test run, only the average. It would be cool if it did, though; as you said, that's a much better measure of playability than average fps in cases where that number is above 60fps.

                It's also a shame that there are no nouveau numbers in those graphs.
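                The distinction between the chart's "minimum" and a true per-frame minimum can be sketched like this (all numbers are hypothetical, not real PTS results):

```python
# Hypothetical per-frame fps samples at three test resolutions.
runs = {
    "1024x768":  [120, 115, 40, 125],
    "1280x1024": [100, 95, 35, 105],
    "1920x1080": [80, 75, 25, 85],
}

# What the graphs show: one average per resolution.
averages = {res: sum(s) / len(s) for res, s in runs.items()}

chart_minimum = min(averages.values())            # lowest *average* (66.25)
true_minimum = min(min(s) for s in runs.values()) # worst actual frame (25)

print(averages)
print(chart_minimum)  # the "minimum" a reader sees on the graph
print(true_minimum)   # the dip PTS never recorded
```

                Here the graph's minimum is a comfortable 66fps while the game actually dipped to 25fps, which is exactly the gap devius is pointing out.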



                • #9
                  Originally posted by Wheels View Post
                  While Ubuntu clearly outperforms OS X on all these benchmarks, it's also worth pointing out that in most of them the better results probably won't be visible, because the frame rates are far higher than the monitor's refresh rate. The most demanding game seems to be Nexuiz, where OS X doesn't reach the 30fps threshold at any resolution. All the other games stay above 60fps, sometimes by such a huge margin it's almost a joke. A lot of LCD monitors are locked at a 60Hz refresh rate.
                  That's for average fps, but something that can hurt perceived performance is the minimum fps: if a game frequently dips into sub-30fps unplayable territory under load, it can be more frustrating than a constant 45-50fps experience. Even the lowest of the low (non-Nexuiz) fps here is ~70, so with these games it'd be hard to argue that Ubuntu gives a noticeably smoother experience. Actually, that makes me curious about the quality/AA settings used in these benchmarks. Automatically maxed out?
                  If you've ever played an online FPS, you'll know that anything below 80fps is not acceptable. The 60Hz figure is what doctors once determined, but it turned out to still be eye-straining (on CRTs), so it was later raised to 70Hz. The best non-eye-straining refresh rates started at 85Hz. Same for fps: 60 is acceptable, 45 and lower unplayable. And of course you remember the original recommendation of 24fps, which is absolute horror; cinema people fake fast action by blending several fast frames into one 24fps frame so it "looks" fast (if you pause, such a frame looks totally unsharp).


                  In fact my HD4770 system with an Athlon II X4 630 reaches ONLY 60fps on the open-source radeon driver (at full HD, though) in OpenArena, and it is much less playable than the nvidia 8300 chipset system with the proprietary driver that I'm typing from now (not at home): 120fps+.

                  We could talk for 100 pages about how an LCD's vertical retrace is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters. You need two systems side by side to compare. Of course some people are SO slow that they cannot distinguish 30 and 60 fps. It's highly personal and reaction-based.



                  • #10
                    Originally posted by crazycheese View Post
                    The best non-eye-straining refresh rates started at 85Hz.
                    You're talking about screen refresh, not game fps. The thing is that CRTs are much faster at clearing individual images (they have lower persistence) than LCDs, so while it's true that a 60Hz refresh makes your eyes bleed on a CRT, the same refresh rate on an LCD is totally acceptable. This, however, isn't entirely related to the perception of smoothness in an animation.

                    Originally posted by crazycheese View Post
                    Same for fps: 60 is acceptable, 45 and lower unplayable.
                    This very much depends on your expectations and experience. Back in the good ol' DOS days, anything above 15fps was considered playable. People back then had a much higher tolerance for low framerates and, unless it dipped to 5fps, it didn't really affect gameplay that much. Of course, the higher the framerate, the more realistic the motion is going to seem. I think that's the main issue. Saying that something running at 45fps is unplayable is a bit of an exaggeration. It may not feel very realistic, but it's not unplayable.

                    Originally posted by crazycheese View Post
                    And of course you remember the original recommendation of 24fps, which is absolute horror; cinema people fake fast action by blending several fast frames into one 24fps frame so it "looks" fast (if you pause, such a frame looks totally unsharp).
                    That's a trick that adds back the sense of realistic, smooth motion even at low framerates, and it makes perfect sense because nobody is going to watch a movie frame by frame. It's the feeling of smoothness that matters most, not the actual framerate.

                    Originally posted by crazycheese View Post
                    We could talk for 100 pages about how an LCD's vertical retrace is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters. You need two systems side by side to compare. Of course some people are SO slow that they cannot distinguish 30 and 60 fps. It's highly personal and reaction-based.
                    Well, if the LCD is only able to display 60fps and the GPU outputs 85, then you will only see 60fps. You can't beat physics. The problem is probably related to the lower dips in framerate, since these vary quite a lot depending on the complexity of the scene, and that, in turn, affects the perceived smoothness. You're also right that the difference between something like 50fps and 100fps is noticeable even with a 60Hz screen refresh, but that's probably because a GPU struggling to output 50fps will dip way lower at times, and that affects the smoothness of the motion.



                    • #11
                      Originally posted by crazycheese View Post
                      We could talk for 100 pages about how an LCD's vertical retrace is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters. You need two systems side by side to compare. Of course some people are SO slow that they cannot distinguish 30 and 60 fps. It's highly personal and reaction-based.
                      60fps is plenty for anyone, even in fast-paced online games.
                      If you're lucky enough to get a stable 60fps, you're in good shape to play: that's a 16ms frame time, and I don't think humans can make decisions in less than 16ms...

                      Although maybe I'm just slow, because I can't distinguish between 60fps and 120fps.
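                      The 16ms figure is simple arithmetic: frame time in milliseconds is 1000 divided by the framerate. A quick sketch covering the rates argued about in this thread:

```python
# Frame time in milliseconds for common framerates: 1000 ms / fps.
for fps in (24, 30, 45, 60, 85, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps works out to ~16.7 ms per frame; 120 fps to ~8.3 ms.
```

                      So the jump from 60fps to 120fps only shaves about 8ms off each frame, which is the window people in this thread disagree about being able to perceive.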



                      • #12
                        "60fps ought to be enough for everyone"



                        • #13
                          Originally posted by devius View Post
                          I don't think any of these tests make use of AA by default.

                          BTW, the minimum fps you see in the graphs isn't the absolute minimum framerate the game hit, but rather the minimum average framerate across all the test resolutions (since the graphs only show averages). The current PTS doesn't record minimum and maximum framerates for each test run, only the average.
                          Well, that's some useful info.

                          Originally posted by crazycheese View Post
                          If you've ever played an online FPS, you'll know that anything below 80fps is not acceptable. The 60Hz figure is what doctors once determined, but it turned out to still be eye-straining (on CRTs), so it was later raised to 70Hz. The best non-eye-straining refresh rates started at 85Hz. Same for fps: 60 is acceptable, 45 and lower unplayable.
                          60Hz may be headache-inducing on the CRTs in my computer classes, but at home I've found it's not a big deal on my LCD monitor or laptop unless there are a lot of horizontal stripes or something.
                          And no, I don't tend to play a lot of online FPSes. Maybe that makes a difference, but to me, having more frames served to the monitor than it can actually display is just a waste that doesn't improve the experience. Like I said, a lot of LCDs are limited to 60 frames no matter what your graphics card can put out; but if the game drops lower than that frequently, it's going to be a less-than-ideal experience, so the more its fps can stay off the floor, the better.

                          In fact my HD4770 system with an Athlon II X4 630 reaches ONLY 60fps on the open-source radeon driver (at full HD, though) in OpenArena, and it is much less playable than the nvidia 8300 chipset system with the proprietary driver that I'm typing from now (not at home): 120fps+.
                          Is that because you get frequent fps dips on the open-source driver?

                          We could talk for 100 pages about how an LCD's vertical retrace is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters.
                          Wouldn't that mean that shooters aren't playable on most LCD monitors "in practice" because of the refresh rate limit?



                          • #14
                            Apple's stack probably sucks because they use LLVM in there somewhere. Every time I've seen LLVM comparisons, on Phoronix and elsewhere, LLVM sucks and completely fails to live up to the hype that it makes things magically faster. The ONLY thing it does faster is produce valid binaries faster than GCC when compiling C/C++. But so what? Would you rather spend a few more seconds compiling in order to create faster-executing binaries, or build quick-and-dirty binaries that are poorly optimized? What's a few seconds on a multi-core build server? Unless we're building binaries in-place on mobile devices, build time just doesn't matter, as long as it isn't unmanageable. Even the ugliest builds I've ever seen -- things like OpenOffice, the whole Mozilla suite, or the Linux kernel with everything built as a module -- can be easily tackled with a small cluster of build servers running Icecream.



                            • #15
                              Originally posted by allquixotic View Post
                              Apple's stack probably sucks because they use LLVM in there somewhere. Every time I've seen LLVM comparisons, on Phoronix and elsewhere, LLVM sucks and completely fails to live up to the hype that it makes things magically faster. The ONLY thing it does faster is produce valid binaries faster than GCC when compiling C/C++. But so what? Would you rather spend a few more seconds compiling in order to create faster-executing binaries, or build quick-and-dirty binaries that are poorly optimized? What's a few seconds on a multi-core build server? Unless we're building binaries in-place on mobile devices, build time just doesn't matter, as long as it isn't unmanageable. Even the ugliest builds I've ever seen -- things like OpenOffice, the whole Mozilla suite, or the Linux kernel with everything built as a module -- can be easily tackled with a small cluster of build servers running Icecream.
                              Going by the latest benchmarks, LLVM is pretty much on par with GCC.

                              http://www.phoronix.com/scan.php?pag...gcc_atom&num=1

                              Of course, LLVM also has additional pluses, like Clang's more expressive diagnostics compared to GCC, and you're not tied to the limitations of GPL licensing.

