Linux Can Deliver A Faster Gaming Experience Than Mac OS X

  • Linux Can Deliver A Faster Gaming Experience Than Mac OS X

    Phoronix: Linux Can Deliver A Faster Gaming Experience Than Mac OS X

    Earlier this week on Phoronix were new benchmarks of Ubuntu Linux vs. Mac OS X using a new Apple MacBook Pro with an Intel Core i5 CPU and an NVIDIA GeForce GT 330M graphics processor. Looking at the test results overall, it ended up being a competitive race between these two Microsoft Windows competitors. In some areas, like OpenCL computational performance, Apple's operating system commanded a sizable lead. In other areas, like OpenGL graphics performance, Ubuntu Linux backed by NVIDIA's official but proprietary driver was in control. Here's an additional set of tests showing the measurable leads of NVIDIA's Linux driver over Mac OS X with Apple's NVIDIA driver.

  • #2
    It would also be interesting to see Wine comparisons between the two systems; after all, both of them rely on it a whole lot for gaming.

    • #3
      I wonder if NVIDIA actively helps with development of Apple's drivers, or if they are just given documentation and Apple does the rest.

      • #4
        It's because Apple only cares about gay men, not gaymen.

        • #5
          Originally posted by damg:
          I wonder if NVIDIA actively helps with development of Apple's drivers, or if they are just given documentation and Apple does the rest.
          NVIDIA develops its own drivers, but Apple controls the graphics stack. This is like Direct3D on Windows, where Microsoft provides the stack and IHVs implement a specific interface to communicate with the actual hardware.
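
          As a rough sketch of that split (all names below are made up for illustration; this is not Apple's or Microsoft's actual driver ABI): the OS vendor defines and owns the stack-facing interface, and the IHV only ships the hardware-facing implementation behind it.

              // Hypothetical illustration only: the OS owns the graphics stack and
              // defines the driver interface; the IHV implements it for its hardware.
              #include <cstddef>
              #include <cstdint>
              #include <iostream>

              // Interface defined by the OS vendor (the "stack" side).
              struct GpuDriver {
                  virtual ~GpuDriver() = default;
                  virtual void submitCommands(const std::uint8_t* data, std::size_t size) = 0;
                  virtual void present() = 0;
              };

              // Implementation supplied by the IHV (e.g. NVIDIA) for its own GPUs.
              class NvidiaDriver final : public GpuDriver {
              public:
                  void submitCommands(const std::uint8_t*, std::size_t size) override {
                      std::cout << "vendor driver: translating " << size << " bytes of commands\n";
                  }
                  void present() override {
                      std::cout << "vendor driver: flipping the framebuffer\n";
                  }
              };

              // The OS-owned stack only ever talks to the interface, never to the hardware.
              void osStackDrawFrame(GpuDriver& driver) {
                  const std::uint8_t commands[64] = {};
                  driver.submitCommands(commands, sizeof(commands));
                  driver.present();
              }

              int main() {
                  NvidiaDriver driver;        // selected at runtime based on the installed GPU
                  osStackDrawFrame(driver);   // application -> OS stack -> IHV driver
              }

          The practical consequence is that overall performance depends on both halves, which is part of why the same NVIDIA hardware can behave differently under OS X and Linux.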

          • #6
            Maybe the Mac's graphics stack is slower, but at least they have native Steam :'((

            • #7
              While Ubuntu clearly outperforms OS X in all these benchmarks, it's also worth pointing out that in most of them the better results probably won't be visible, because the frame rates are far higher than the monitor's refresh rate. The most demanding game seems to be Nexuiz, where OS X doesn't reach the 30fps threshold at any resolution. All the other games stay above 60fps, sometimes by such a huge margin it's almost a joke. A lot of LCD monitors are locked at a 60Hz refresh rate.
              That's for average fps, but something that can hurt perceived performance is the minimum fps; a game that frequently dips into sub-30fps unplayable territory under load can be more frustrating than a constant 45-50fps experience. Even the lowest of the low (non-Nexuiz) fps here is ~70, so with these games it'd be hard to argue that Ubuntu gives a noticeably smoother experience. Actually, that makes me curious about quality/AA settings used in these benchmarks. Automatically maxed out?
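
              To make the average-vs-minimum point concrete, here's a toy calculation with invented frame times (not data from the article): two runs can have similar average fps while one of them has dips that would feel far worse.

                  // Toy example with made-up frame times: the average fps can look fine
                  // while the minimum (worst frame) tells a very different story.
                  #include <algorithm>
                  #include <cstddef>
                  #include <iostream>
                  #include <vector>

                  struct FpsStats { double average; double minimum; };

                  FpsStats statsFromFrameTimes(const std::vector<double>& frameTimesMs) {
                      double totalMs = 0.0;
                      double worstMs = 0.0;
                      for (double t : frameTimesMs) {
                          totalMs += t;
                          worstMs = std::max(worstMs, t);
                      }
                      return { 1000.0 * frameTimesMs.size() / totalMs, 1000.0 / worstMs };
                  }

                  int main() {
                      std::vector<double> steady(100, 16.7);  // ~60 fps on every single frame
                      std::vector<double> spiky(100, 12.0);   // mostly ~83 fps...
                      for (std::size_t i = 0; i < spiky.size(); i += 20)
                          spiky[i] = 80.0;                    // ...but with 12.5 fps hitches

                      FpsStats a = statsFromFrameTimes(steady);
                      FpsStats b = statsFromFrameTimes(spiky);
                      std::cout << "steady: avg " << a.average << " fps, min " << a.minimum << " fps\n";
                      std::cout << "spiky:  avg " << b.average << " fps, min " << b.minimum << " fps\n";
                  }

              Both runs average around 60-65fps, but the second one spends part of its time at 12.5fps, which is exactly the kind of dip that average-only graphs hide.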

              • #8
                Originally posted by Wheels:
                Actually, that makes me curious about quality/AA settings used in these benchmarks. Automatically maxed out?
                I don't think any of these tests make use of AA by default.

                BTW, the minimum fps that you see in the graphs isn't the absolute minimum framerate that the game hit, but rather the minimum average framerate (since the graphs only show averages) across all the test resolutions. The current PTS doesn't record minimum and maximum framerates for each test run, only the average. It would be cool if it did, though, since, as you said, that's a much better measure of playability than average fps when the average is above 60fps.

                It's also a shame that there are no nouveau numbers in those graphs.

                • #9
                  Originally posted by Wheels:
                  While Ubuntu clearly outperforms OS X in all these benchmarks, it's also worth pointing out that in most of them the better results probably won't be visible, because the frame rates are far higher than the monitor's refresh rate. The most demanding game seems to be Nexuiz, where OS X doesn't reach the 30fps threshold at any resolution. All the other games stay above 60fps, sometimes by such a huge margin it's almost a joke. A lot of LCD monitors are locked at a 60Hz refresh rate.
                  That's for average fps, but something that can hurt perceived performance is the minimum fps; a game that frequently dips into sub-30fps unplayable territory under load can be more frustrating than a constant 45-50fps experience. Even the lowest of the low (non-Nexuiz) fps here is ~70, so with these games it'd be hard to argue that Ubuntu gives a noticeably smoother experience. Actually, that makes me curious about quality/AA settings used in these benchmarks. Automatically maxed out?
                  If you have ever played online FPS games, you will know that anything below 80fps is not acceptable. The 60Hz figure is what doctors once settled on, but it turned out to still strain the eyes (on CRTs), so it was later raised to 70Hz. Still, refresh rates that don't strain the eyes start at 85Hz. The same goes for fps: 60 is acceptable, 45 and lower is unplayable. And of course you remember the original 24fps recommendation, which is absolute horror; cinema people smooth any fast action by blending several fast frames into a single 24fps frame so it "looks" fast (if you pause, such a frame is completely blurred).


                  In fact, my HD 4770 system with an Athlon II X4 630 reaches ONLY 60fps with the open-source radeon driver (at full HD, though) in OpenArena, and it is much less playable than the NVIDIA 8300 chipset system with the proprietary driver that I'm typing on right now (not at home), which gets 120fps+.

                  We can talk for 100 pages about how LCD vertical refresh is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters. You need two systems side by side to be able to compare. Of course, some people are SO slow that they cannot distinguish 30 from 60fps. It's highly personal and reaction-based.

                  • #10
                    Originally posted by crazycheese:
                    Still, refresh rates that don't strain the eyes start at 85Hz.
                    You are talking about screen refresh rate, not game fps. The thing is that CRTs are much faster at clearing individual images (they have lower persistence) than LCDs, so while it is true that a 60Hz refresh makes your eyes bleed on a CRT, the same refresh rate on an LCD is totally acceptable. This, however, isn't entirely related to the perceived smoothness of an animation.

                    Originally posted by crazycheese:
                    The same goes for fps: 60 is acceptable, 45 and lower is unplayable.
                    This very much depends on your expectations and experience. Back in the good ol' DOS days, anything above 15fps was considered playable. People back then had a much higher tolerance for low framerates and, unless it dipped to 5fps, it didn't really affect gameplay that much. Of course, the higher the framerate, the more realistic the motion is going to seem. I think that's the main issue. Saying that something running at 45fps is unplayable is a bit of an exaggeration. It may not feel very realistic, but it's not unplayable.

                    Originally posted by crazycheese:
                    And of course you remember the original 24fps recommendation, which is absolute horror; cinema people smooth any fast action by blending several fast frames into a single 24fps frame so it "looks" fast (if you pause, such a frame is completely blurred).
                    That's a trick that adds back the sense of smooth, realistic motion even at low framerates, and it makes perfect sense, because nobody is going to watch a movie frame by frame. It's the feeling of smoothness that matters most, not the actual framerate.

                    Originally posted by crazycheese:
                    We can talk for 100 pages about how LCD vertical refresh is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters. You need two systems side by side to be able to compare. Of course, some people are SO slow that they cannot distinguish 30 from 60fps. It's highly personal and reaction-based.
                    Well, if the LCD is only able to show 60fps and the GPU renders 85, then you will only see 60fps; you can't beat physics. The problem is probably related to the dips in framerate, since these tend to vary quite a lot depending on the complexity of the scene, and that, in turn, is going to affect the perceived smoothness. Also, you're right that the difference between something like 50fps and 100fps is noticeable even with a 60Hz screen refresh, but that's probably more because a GPU struggling to output 50fps will likely dip way lower at times, and that is going to affect the smoothness of the motion.
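
                    As a back-of-the-envelope illustration of that last point (the frame times below are invented, not measurements): with a 60Hz display, every frame that spans more than one ~16.7ms refresh interval gets held on screen for extra refreshes, and those repeats are what you perceive as stutter, no matter how high the average fps is.

                        // Rough vsync model with made-up frame times: a frame that overruns the
                        // ~16.7 ms refresh interval stays on screen for more than one refresh.
                        #include <algorithm>
                        #include <cmath>
                        #include <iostream>
                        #include <vector>

                        int main() {
                            const double refreshMs = 1000.0 / 60.0;  // one 60 Hz refresh interval

                            // Mostly fast frames with a couple of slow ones mixed in.
                            std::vector<double> frameTimesMs =
                                {10.0, 11.0, 10.0, 40.0, 11.0, 10.0, 12.0, 35.0, 11.0, 10.0};

                            int refreshesUsed = 0;
                            int repeatedFrames = 0;
                            for (double t : frameTimesMs) {
                                // Each frame is held for as many refresh intervals as its
                                // render time spans, and always for at least one.
                                int held = std::max(1, static_cast<int>(std::ceil(t / refreshMs)));
                                refreshesUsed += held;
                                if (held > 1) ++repeatedFrames;
                            }

                            std::cout << frameTimesMs.size() << " rendered frames shown over "
                                      << refreshesUsed << " refreshes; " << repeatedFrames
                                      << " were held for more than one refresh (visible hitching)\n";
                        }

                    With those numbers, ten rendered frames cover fourteen refresh slots and two of them are visibly repeated, even though the average works out to roughly 62fps; it's the slow frames, not the peak throughput, that hurt the perceived smoothness.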
