Ubuntu 9.04 vs. Mac OS X 10.5.6 Benchmarks


  • #31
    Oh. And microbenchmarks are only worth so much and can be very misleading.

    For example, the SQLite benchmark can be easily fooled by the file system.

    If SQLite requests an fsync() and the file system says it did the fsync()... but doesn't actually sync the contents of the file to the hard drive, then yes, I can easily make something that performs 1000x as fast as Linux does with ext3.
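
    Roughly, the pattern at issue looks like the following C sketch (durable_write is a hypothetical helper, not SQLite code): the caller only sees what fsync() returns, not what the file system actually did with the drive's write cache.

        /* Write some bytes and ask the file system to make them durable.
         * A file system that acknowledges fsync() without flushing the
         * drive's write cache makes this look much faster than it is. */
        #include <fcntl.h>
        #include <unistd.h>

        int durable_write(const char *path, const void *buf, size_t len)
        {
            int fd = open(path, O_WRONLY | O_CREAT, 0644);
            if (fd < 0)
                return -1;
            if (write(fd, buf, len) != (ssize_t)len || fsync(fd) != 0) {
                close(fd);
                return -1;
            }
            /* fsync() returning 0 only means the file system *claims*
             * the data is stable on disk. */
            return close(fd);
        }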



    • #32
      And another thing: this benchmarking suite is ALPHA software. It could very well be skewing results in the upward direction. It seems very unrealistic that two different Unix-like OSes running on the same hardware could show such a huge difference in the instructions-per-second benchmarks.



      • #33
        Originally posted by drag View Post
        Oh. And microbenchmarks are only worth so much and can be very misleading.

        For example, the SQLite benchmark can be easily fooled by the file system.

        If SQLite requests an fsync() and the file system says it did the fsync()... but doesn't actually sync the contents of the file to the hard drive, then yes, I can easily make something that performs 1000x as fast as Linux does with ext3.
        Actually, when it comes to SQLite on OS X, SQLite sends an F_FULLFSYNC request to the kernel to ensure that the bytes are actually written through to the drive platter. This causes the kernel to flush all buffers to the drives and the drives to flush their track caches. So if anything, OS X should be slower.
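
        F_FULLFSYNC is a real fcntl(2) command in OS X's fcntl.h. As a hedged sketch, the wrapper below (full_sync is hypothetical, not SQLite source) shows the kind of call involved, with a fallback to plain fsync() where the command isn't supported:

            /* Ask OS X to flush kernel buffers AND the drive's track
             * cache, which is stronger (and slower) than plain fsync(). */
            #include <fcntl.h>
            #include <unistd.h>

            int full_sync(int fd)
            {
            #ifdef F_FULLFSYNC
                if (fcntl(fd, F_FULLFSYNC) == 0)
                    return 0;
                /* fall back if the file system rejects the request */
            #endif
                return fsync(fd);
            }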



        • #34
          Originally posted by thefirstm View Post
          And another thing: this benchmarking suite is ALPHA software. It could very well be skewing results in the upward direction. It seems very unrealistic that two different Unix-like OSes running on the same hardware could show such a huge difference in the instructions-per-second benchmarks.
          The suite itself does not determine the benchmark results; it is simply a front end that runs the individual benchmarks and parses the results they produce.
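
          As a rough illustration (the helper, command, and output format here are all made up, not actual Phoronix Test Suite code), a front end of this kind reduces to "run the benchmark binary, parse the number it prints":

              /* Launch an external benchmark command and parse the single
               * figure it prints; the suite only reports that value. */
              #include <stdio.h>

              double run_benchmark(const char *cmd)
              {
                  double result = -1.0;
                  FILE *p = popen(cmd, "r");    /* benchmark does the real work */
                  if (p == NULL)
                      return -1.0;
                  if (fscanf(p, "%lf", &result) != 1)
                      result = -1.0;            /* unparseable output */
                  pclose(p);
                  return result;
              }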



          • #35
            A correct test would be ...

            To format the Mac Mini with Ubuntu 9.04, and then run the benchmarks again.
            Now chart the results!
            Then nobody can bitch about an unfair test (hardware-wise, that is).

            Cheers



            • #36
              Originally posted by yotambien View Post
              That it is a known fact doesn't change a thing.
              It doesn't change the results of the test, no, but that doesn't make the point any less important. If OS X had some small change in a particular version that created a huge performance regression, that would be no less relevant to note. What does that fact tell you? That the test is skewed in a sense, because it's not quite representative of the "normal performance" you'd find in perhaps any other version. So while it does mean that that specific version contains regressions, which is important, it's not the best test for really gauging the "capability" of two competing operating systems in the long term.

              Basically, be as aware as possible of what is going on, so you can get a comparison that is as "fair" as possible. All software needs critics.

              But yes, try the same test using Nvidia or ATI graphics, or basically on a machine with common hardware that isn't niche to OS X, and that might be a much fairer test. Might be. The problem is, if OS X and Linux were never really geared to run on each other's hardware, and the code/drivers were never optimized for it, there might not be any way to have a truly "fair" test. Fair is always to some degree an illusion, but at least you can try your best to get close. ^^



              • #37
                Originally posted by kwag View Post
                To format the Mac Mini with Ubuntu 9.04, and then run the benchmarks again.
                Now chart the results!
                Then nobody can bitch about an unfair test (hardware-wise, that is).

                Cheers
                Yes you can. For one thing, there's never "totally fair"; you can only get "closer to fair", so there will *always* be a way to bitch. But in this case: if OS X has really good Intel video drivers because the developers spent a lot of time optimizing them for that OS, and Linux for whatever reason has crappy drivers (which may be improving rapidly in quality and capability) that are still behind OS X's Intel driver performance and features, and you then compare both OSes using graphics tests, obviously the OS X tests will win.

                So now it comes down to: should we run tests on Linux's known best hardware, or OS X's known best hardware, or try to find a happy middle ground somehow? That's pretty difficult to do. I mean, I don't even know if there are ANY *standard* Nvidia or ATI cards that work with OS X; I thought they were basically all "the Mac version". They shouldn't have any major differences, but who knows; the fact that they aren't mainstream cards means the Linux drivers might not have cared much about them, and thus they might suck.

                But yeah, I'd be curious to see ATI and Nvidia tests, as well as tests with Fedora 11 to see if some of the regressions have been fixed in it.



                • #38
                  Originally posted by susikala View Post
                  Doesn't that computer have a stock x86 CPU?

                  ---

                  Also, from Phoronix's site description: 'Phoronix is the leading technology website for GNU/Linux and Solaris hardware news'.

                  I don't see any reason to post benchmarks of proprietary garbage here, honestly... there are enough Mac/Windows sites for that.
                  "If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not your enemy, for every victory gained you will also suffer a defeat. If you know neither the enemy nor yourself, you will succumb in every battle."

                  This is not war, but the quote applies to most of everyday life. You're dumb for excluding the competition because they didn't tell you how they tied their shoelaces.



                  • #39
                    Originally posted by Yfrwlf View Post
                    Yes you can. For one thing, there's never "totally fair"; you can only get "closer to fair", so there will *always* be a way to bitch. But in this case: if OS X has really good Intel video drivers because the developers spent a lot of time optimizing them for that OS, and Linux for whatever reason has crappy drivers (which may be improving rapidly in quality and capability) that are still behind OS X's Intel driver performance and features, and you then compare both OSes using graphics tests, obviously the OS X tests will win.

                    So now it comes down to: should we run tests on Linux's known best hardware, or OS X's known best hardware, or try to find a happy middle ground somehow? That's pretty difficult to do. I mean, I don't even know if there are ANY *standard* Nvidia or ATI cards that work with OS X; I thought they were basically all "the Mac version". They shouldn't have any major differences, but who knows; the fact that they aren't mainstream cards means the Linux drivers might not have cared much about them, and thus they might suck.

                    But yeah, I'd be curious to see ATI and Nvidia tests, as well as tests with Fedora 11 to see if some of the regressions have been fixed in it.
                    OK, first of all, the "Mac Edition" versions of the hardware are EFI-ready. Once the video is initialized, the card functions identically to any PC card. That is why you can use the plain-jane Nvidia driver when running Windows on a Mac. Any Nvidia/ATI card, given that it has a large enough BIOS ROM, can be flashed to its Mac counterpart and vice versa. Spec out a PC with the exact same hardware as a Mac, install the same version of Linux on both, and they will give you identical results. The only hardware difference between a PC and a modern Mac is the EFI capability, and EFI on PCs is still generally only found on server/workstation-class boards.

                    Sure, you may get close to comparable performance using blobs on Linux. Part of that is because you would be getting optimized GL stacks instead of dealing with a generic "one-size-fits-all" Mesa, and blobs bypass much of X. But then again, you would be going outside the "purity" of Linux. Open-source video solutions, frankly, suck for the most part at anything demanding that benefits from an optimized stack. They are supposed to be a "just works out of the box" solution; they are not, however, a "works out of the box and totally kicks ass in performance" solution. If you want maximum performance, you go with closed-source blobs instead of a generic solution.

                    OS X is not worried about an "open" video solution for its products. It uses blobs for the best performance, because when you look at its primary competitor, Windows, all they care about is performance as well, as does 90+% of the world when it comes to desktop usage.

                    Bottom line: put pure OS X against a "pure" Ubuntu on the same hardware and OS X will come out on top in most areas the average desktop user will use. No matter what the reasons are, the results still stand. The hardware does not contain any "specially designed hardware just for Apple"; these are off-the-shelf components.
                    Last edited by deanjo; 13 May 2009, 12:24 AM.



                    • #40
                      Originally posted by deanjo View Post
                      Actually, when it comes to SQLite on OS X, SQLite sends an F_FULLFSYNC request to the kernel to ensure that the bytes are actually written through to the drive platter. This causes the kernel to flush all buffers to the drives and the drives to flush their track caches. So if anything, OS X should be slower.
                      SQL is threaded to high heaven. OS X is a microkernel OS; it loves threads, it's built from threads. A microkernel OS will always destroy a giant-kernel OS at SQL and graphics.
                      You'd need Debian GNU/Hurd to even play in the same ballpark, and that barely exists. Linux and the Unixes will likely evolve into microkernel-like OSes just because IBM, Nvidia, ATI, AMD, etc. all need them to be like that so OpenCL doesn't turn into a turd on Linux. IBM probably already has it that way for custom Cell implementations that it doesn't share with the general public.

                      But judging from the size of Linux kernels these days, they seem to be making them more and more monolithic for now. I think once the kernel reaches about 250 MB compiled, with drivers built in for everything, they will stop and turn around.
                      Last edited by Hephasteus; 13 May 2009, 01:52 AM.

