Linux Can Deliver A Faster Gaming Experience Than Mac OS X


  • kgonzales
    replied
    Originally posted by chris200x9 View Post
    it's because apple only cares about gay men, not gaymen.
    Congrats, you're a homophobe. That's exactly what the Linux community needs: more bigots.


  • Dukenukemx
    replied
    Not as many.


  • curaga
    replied
    Surprisingly many, actually.


  • Dukenukemx
    replied
    Would have loved to have seen a Wine test comparison, or just common commercial games between the two OSes. Let's be realistic: who really plays these open-source games?


  • deanjo
    replied
    Originally posted by allquixotic View Post
    Apple's stack probably sucks because they use LLVM in there somewhere. Every time I've seen LLVM comparisons, on Phoronix and elsewhere, LLVM sucks and completely fails to live up to the hype that it makes things magically faster. [...]
    Going by the last benchmarks, LLVM is pretty much on par with GCC.

    Of course LLVM also has additional pluses, like Clang's more expressive diagnostics compared to GCC's, and you're not tied to the limitations of GPL licensing.


  • allquixotic
    replied
    Apple's stack probably sucks because they use LLVM in there somewhere. Every time I've seen LLVM comparisons, on Phoronix and elsewhere, LLVM sucks and completely fails to live up to the hype that it makes things magically faster. The ONLY thing it does faster is produce valid binaries more quickly than GCC when compiling C/C++. But so what? Would you rather spend a few more seconds compiling in order to create faster-executing binaries, or build quick-and-dirty binaries that are poorly optimized? What's a few seconds on a multi-core build server? Unless we're building binaries in place on mobile devices, build time just doesn't matter, as long as it isn't unmanageable. Even the ugliest builds I've ever seen -- things like OpenOffice, the whole Mozilla suite, or the Linux kernel with everything built as a module -- can be easily tackled with a small cluster of icecreamed build servers.


  • Wheels
    replied
    Originally posted by devius View Post
    I don't think any of these tests make use of AA by default.

    BTW, that minimum fps that you see in the graphs isn't the absolute minimum framerate that the game hit, but rather the minimum average framerate (since the graphs only show that) from all the test resolutions. The current PTS doesn't record minimum and maximum framerates for each test run, only average.
    Well, that's some useful info.
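    A rough sketch of that distinction (Python; every number below is made up, and the PTS behaviour is just as described above): the "minimum" in the graphs is the lowest of the per-resolution averages, not the slowest frame of any single run.

    # Hypothetical per-resolution averages and one hypothetical run's frame times.
    avg_fps_per_resolution = {
        "1024x768": 92.4,
        "1280x1024": 71.8,
        "1920x1080": 48.3,
    }
    frame_times_ms = [18.0, 19.5, 45.0, 21.0, 17.5, 60.0, 20.0]  # one made-up 1920x1080 run

    reported_minimum = min(avg_fps_per_resolution.values())  # what the graph labels "minimum"
    absolute_minimum = 1000.0 / max(frame_times_ms)           # slowest single frame of that run

    print(f"minimum average fps across resolutions: {reported_minimum:.1f}")  # 48.3
    print(f"absolute minimum fps within one run:    {absolute_minimum:.1f}")  # 16.7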

    Originally posted by crazycheese View Post
    If you've ever played an online FPS, you'll know that anything below 80fps is not acceptable. The 60Hz figure is what doctors once arrived at, but it turned out to still be eye-straining (on CRTs), so it was later raised to 70Hz. Still, the lowest non-eye-straining refresh rate started at 85Hz. Same for fps: 60 is acceptable, 45 and lower unplayable.
    60Hz may be headache-inducing on the CRTs in my computer classes, but at home I've found it's not a big deal on my LCD monitor or laptop unless there are a lot of horizontal stripes or something.
    And no, I don't tend to play a lot of online FPSes. Maybe that makes a difference, but to me, having more frames served to the monitor than it can actually display is just waste that doesn't improve the experience. Like I said, a lot of LCDs are limited to 60 frames no matter what your graphics card can put out; but if the game drops below that frequently, it's going to be a less than ideal experience, so the more its fps can stay off the floor, the better.

    In fact my HD4770 system with an Athlon II X4 630 reaches ONLY 60 fps on the open-source radeon driver (full HD though) in OpenArena, and it is much less playable than the nvidia chipset 8300 system with the proprietary driver that I'm typing from now (not at home) - 120fps+.
    Is that because you get frequent fps sags on the open source driver?

    We can talk 100 pages about how LCD Vtrace is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters.
    Wouldn't that mean that shooters aren't playable on most LCD monitors "in practice" because of the frame rate limit?


  • curaga
    replied
    "60fps ought to be enough for everyone"


  • blackshard
    replied
    Originally posted by crazycheese View Post
    We can talk 100 pages about how LCD Vtrace is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters. You need two systems to be able to compare. Of course some people are SO slow that they cannot distinguish 30 and 60 fps. It's highly personal and reaction-based.
    60fps is more than enough for anyone, even in fast-paced online games.
    If you're lucky enough to get a stable 60fps, you're in good shape to play: that's roughly 16 ms per frame (see the arithmetic below), and I don't think humans can make decisions in less than 16 ms...

    Although maybe I'm just slow, because I can't distinguish between 60 fps and 120 fps.
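    For what it's worth, the arithmetic behind that 16 ms figure is just frame time = 1000 ms / fps; a quick sketch in Python:

    # Frame time in milliseconds for a few common frame rates.
    for fps in (30, 45, 60, 85, 120):
        print(f"{fps:>3} fps -> {1000.0 / fps:.1f} ms per frame")
    # 60 fps is ~16.7 ms per frame and 120 fps is ~8.3 ms, so doubling the
    # frame rate buys back only about 8 ms per frame.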


  • devius
    replied
    Originally posted by crazycheese View Post
    Still, the lowest non-eye-straining refresh rate started at 85Hz.
    You are talking about screen refresh and not game fps. The thing is that CRTs are much faster at clearing individual images (they have lower persistence) than LCDs, so while it is true that a 60Hz screen refresh makes your eyes bleed on a CRT, the same refresh rate on an LCD is totally acceptable. This, however, isn't entirely related to the perception of smoothness of an animation.

    Originally posted by crazycheese View Post
    Same for fps: 60 is acceptable, 45 and lower unplayable.
    This very much depends on your expectations and experience. Back in the good ol' DOS days anything above 15fps was considered playable. People back then had a much higher tolerance for lower framerates and, unless it dipped to 5fps, it didn't really affect gameplay that much. Of course, the higher the framerate, the more realistic the motion is going to seem. I think that's the main issue. Saying that something running at 45fps is unplayable is a bit of an exaggeration. It may not feel very realistic, but it's not unplayable.

    Originally posted by crazycheese View Post
    And of course you remember the original recommendation of 24fps, which is absolute horror; cinema people handle any fast action by blending several fast frames into one 24fps frame so it "looks" fast (if you pause, such a frame looks totally blurred).
    That's a trick that adds back the sense of realistic and smooth motion even with low framerates and it makes perfect sense because nobody is going to watch a movie frame by frame. It's the feeling of smoothness that matters most and not the actual framerate.
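    A tiny sketch of that blending trick (Python with NumPy; the frame data is entirely hypothetical): several fast frames are averaged into one 24fps output frame, which is why a paused frame of fast action looks smeared.

    import numpy as np

    HIGH_RATE = 96                      # hypothetical capture rate (fps)
    OUTPUT_RATE = 24                    # cinema output rate (fps)
    BLEND = HIGH_RATE // OUTPUT_RATE    # 4 captured frames per output frame

    def blend_frames(frames):
        """Average a stack of frames (each a 2D array) into one blurred frame."""
        return np.mean(frames, axis=0)

    # Fake a bright object moving one column per captured frame.
    frames = np.zeros((BLEND, 1, 8))
    for i in range(BLEND):
        frames[i, 0, i] = 1.0

    print(blend_frames(frames))  # the object is smeared across 4 columns at 0.25 each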

    Originally posted by crazycheese View Post
    We can talk 100 pages about how LCD Vtrace is limited to 60 frames anyway, but in practice anything below 85fps is not playable in FPS shooters. You need two systems to be able to compare. Of course some people are SO slow that they cannot distinguish 30 and 60 fps. It's highly personal and reaction-based.
    Well, if the LCD is only able to output 60fps and the GPU 85, then you will only see 60fps. You can't beat physics. The problem is probably related to the lower dips in framerate, since these tend to vary quite a lot depending on the complexity of the scene. That, in turn, is going to affect the perceived smoothness. Also, you're right that the difference between something like 50fps and 100fps is noticeable even with a screen refresh of 60Hz, but that's probably more related to the fact that a GPU struggling to output 50fps will probably dip way lower at times, and that is going to affect the smoothness of the motion.
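    To make that concrete, here's a rough sketch (Python, made-up frame times): two runs with the same average fps but very different dips, plus the 60Hz display cap.

    REFRESH_HZ = 60                           # typical LCD refresh rate

    steady = [16.7] * 60                      # frame times (ms): perfectly even, ~60fps
    spiky = [10.0] * 50 + [50.0] * 10         # roughly the same average, but with stalls

    def run_stats(frame_times_ms):
        avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
        worst_fps = 1000.0 / max(frame_times_ms)
        displayed_fps = min(avg_fps, REFRESH_HZ)  # a vsync'd 60Hz panel shows at most 60
        return avg_fps, worst_fps, displayed_fps

    for name, run in (("steady", steady), ("spiky", spiky)):
        avg, worst, shown = run_stats(run)
        print(f"{name}: avg {avg:.0f} fps, worst dip {worst:.0f} fps, displayed at most {shown:.0f} fps")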
