
Mac OS X 10.5 vs. Ubuntu 8.10 Benchmarks


  • #71
    Originally posted by deanjo View Post
    Xorg has been needed until just recently. Mesa DOES play a huge part in performance as well. Why do you think blobs bypass MUCH of X? The drivers that Ubuntu uses DO use Mesa for the Intel chipset. Mesa is not used on OS X.
    Simply put, Enemy Territory runs faster for me under Linux than under XP. It could be due to drivers, but that's why I think Mesa has no big influence on binary blobs (and NVIDIA does many things their own way).

    Those are their names, even when I spell darwin in lower case too.

    As would be expected of an OS that has many years' head start on an OS that was not marketed for servers until recently.
    So we can say that Mac OS needs years to catch up with Linux. I should add "in the server area", but you didn't say before what you meant.

    Go right ahead. As far as the firewall goes, most people have an external firewall anyway, whether it's through a router or a modem. Glad you keep up with the times.
    Many people don't. You can say what you want, but a firewall disabled by default is a little lame. Sorry for going a little off topic...

    FYI, Thetargos brought up X, and I commented and corrected him on a few things. So read the bloody thread before you start accusing me of trolling. It seems you cannot keep up with the conversation.

    Not a fanboy at all; I use pretty much every OS equally on a daily basis. They all have their weaknesses and strengths.
    I'm sometimes quite lazy, but you said that Linux needs years to catch up, and that wasn't just a correction. Btw, we can mention many arguments, but it usually leads to a flame war. I can agree that Quartz is probably a much more modern way than X to do some things, but as you said, every OS has its own advantages and disadvantages.


    • #72
      Originally posted by kraftman View Post
      Simply put, Enemy Territory runs faster for me under Linux than under XP. It could be due to drivers, but that's why I think Mesa has no big influence on binary blobs (and NVIDIA does many things their own way).
      You'd best look at implementations of drivers on the same OS to draw that conclusion, say fglrx vs radeonhd vs radeon, and then take snapshots of the screens and do a differential comparison of them on the same frame.
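
      (For illustration only: a pixel-exact "differential comparison" of two captured frames can be sketched in a few lines of C. The sketch below is an assumption, not anything posted in this thread; it presumes both screenshots were saved as binary PPMs of the same size and skips PPM comment handling.)

      /* ppmdiff.c -- count pixels that differ between two binary PPM screenshots.
       * Build: cc ppmdiff.c -o ppmdiff      Usage: ppmdiff frameA.ppm frameB.ppm */
      #include <stdio.h>
      #include <stdlib.h>

      static unsigned char *load_ppm(const char *path, int *w, int *h)
      {
          FILE *f = fopen(path, "rb");
          int maxval;
          if (!f) { perror(path); exit(1); }
          if (fscanf(f, "P6 %d %d %d", w, h, &maxval) != 3) {
              fprintf(stderr, "%s: not a binary PPM\n", path); exit(1);
          }
          fgetc(f);                               /* the single byte after the header */
          size_t n = (size_t)*w * *h * 3;         /* 3 bytes (RGB) per pixel */
          unsigned char *px = malloc(n);
          if (!px || fread(px, 1, n, f) != n) {
              fprintf(stderr, "%s: short read\n", path); exit(1);
          }
          fclose(f);
          return px;
      }

      int main(int argc, char **argv)
      {
          int w1, h1, w2, h2;
          long i, npix, diff = 0;
          if (argc != 3) { fprintf(stderr, "usage: %s a.ppm b.ppm\n", argv[0]); return 1; }
          unsigned char *a = load_ppm(argv[1], &w1, &h1);
          unsigned char *b = load_ppm(argv[2], &w2, &h2);
          if (w1 != w2 || h1 != h2) { fprintf(stderr, "size mismatch\n"); return 1; }

          npix = (long)w1 * h1;
          for (i = 0; i < npix; i++)
              if (a[i*3] != b[i*3] || a[i*3+1] != b[i*3+1] || a[i*3+2] != b[i*3+2])
                  diff++;                         /* any RGB component differs */
          printf("%ld of %ld pixels differ\n", diff, npix);
          return 0;
      }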

      So we can say that Mac OS needs years to catch up with Linux. I should add "in the server area", but you didn't say before what you meant.
      Shouldn't have to, as the topic has nothing to do with server editions of either OS, and neither did the tests. If Apple wants to play in the server market then, yes, there are areas where they are behind as well. But that isn't a primary concern for them. OS X Server is there to manage OS X networks. And it does that EXTREMELY well.

      Many people don't. You can say what you want, but a firewall disabled by default is a little lame.
      In YOUR opinion. Others view having to disable firewalls and make exceptions in them just to do simple file sharing as lame.

      I'm sometimes quite lazy, but you said that Linux needs years to catch up, and that wasn't just a correction.
      Take the bloody comment in context. Read before you type.

      It's a pretty massive enhancement, not a complete rewrite, that's for sure, with more focus being put on OpenCL, Grand Central, and HD video acceleration (at least it wasn't when I left Apple a couple of months back, and the HD acceleration can be found on the new MacBook line). Still, Linux will be playing catch-up for a few years after 10.6 comes out.
      You started firing guns before you even thought about the comment.
      which in the end you eventually agree with:

      Btw, we can mention many arguments, but it usually leads to a flame war. I can agree that Quartz is probably a much more modern way than X to do some things, but as you said, every OS has its own advantages and disadvantages.
      Last edited by deanjo; 13 November 2008, 06:02 PM.


      • #73
        Originally posted by deanjo View Post
        You started firing guns before you even thought about the comment, which in the end you eventually agree with:
        Someone else started it. I thought you wouldn't continue. And even if I agree, I just don't like the way you said it.

        Take the bloody comment in context. Read before you type.
        It's not this comment that I was talking about...

        You'd best look at implementations of drivers on the same OS to draw that conclusion, say fglrx vs radeonhd vs radeon, and then take snapshots of the screens and do a differential comparison of them on the same frame.
        Why not compare the NVIDIA binary driver under Linux with the NVIDIA binary driver under Windows XP? It's more objective in my opinion, and if I get a slightly better result on Linux, Mesa isn't slowing anything down in this case, as I mentioned before. You can be sure that screenshots using those binary drivers will look the same. It looks like you sometimes don't know what I'm talking about.

        OS X Server is there to manage OS X networks. And it does that EXTREMELY well.
        I'd love to hear more about that and about Quartz, but you didn't say anything special.


        • #74
          Originally posted by kraftman View Post
          Someone else started it. I thought you wouldn't continue. And even if I agree, I just don't like the way you said it.
          Read the whole thread topic. What does it say?

          It's not this comment that I was talking about...
          That is the exact quote you quoted.

          Why not compare the NVIDIA binary driver under Linux with the NVIDIA binary driver under Windows XP? It's more objective in my opinion, and if I get a slightly better result on Linux, Mesa isn't slowing anything down in this case, as I mentioned before. You can be sure that screenshots using those binary drivers will look the same. It looks like you sometimes don't know what I'm talking about.
          And what would that prove, using two drivers that do not use Mesa?

          You might want to read up here:

          [link to a Phoronix article]

          I'd love to hear more about that and about Quartz, but you didn't say anything special.
          I posted a link on Quartz and its subsystems. If you can't be bothered to read it there, then why bother?
          Last edited by deanjo; 13 November 2008, 07:22 PM.


          • #75
            Originally posted by deanjo View Post
            Read the whole thread topic. What does it say?
            That is the exact quote you quoted.
            I thought about that: "Still, Linux will be playing catch-up for a few years after 10.6 comes out." I quoted the other comment because I was answering part of it.

            And what would that prove, using two drivers that do not use Mesa?
            You said before:
            Mesa DOES play a huge part in performance as well. Why do you think blobs bypass MUCH of X?
            Maybe it will prove that you're wrong?

            I posted a link on Quartz and its subsystems. If you can't be bothered to read it there, then why bother?
            Sorry, but I don't see it... Btw, I'm not interested in Apple's sweet talk. I'd prefer to see the real advantages of Quartz over X.


            • #76
              Originally posted by kraftman View Post
              I thought about that: "Still, Linux will be playing catch-up for a few years after 10.6 comes out." I quoted the other comment because I was answering part of it.
              That is the only area where I even mentioned OS X being better.


              Maybe it will prove that you're wrong?
              Testing two non-Mesa drivers against each other on different OSes doesn't prove squat when it comes to finding out whether Mesa is the limiting factor or not.

              Sorry, but I don't see it... Btw, I'm not interested in Apple's sweet talk. I'd prefer to see the real advantages of Quartz over X.
              Look here:

              [links to Apple developer documentation on Quartz]
              For a reasonable layman's guide, albeit one missing some of the 10.5 changes:
              [link to a review of Mac OS X 10.4: "Apple's latest OS X release (10.4) is about to hit the streets. Tiger brings a …"]


              Keep in mind as well that compositing didn't have to be disabled on the Mac. It doesn't affect games there as it does on Linux, where it slows down OpenGL games and such.
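
              (For reference, the usual way compositing gets disabled on the Linux side is an xorg.conf switch like the snippet below; this is an illustrative example, not taken from the benchmark setup.)

              Section "Extensions"
                  Option "Composite" "Disable"
              EndSection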


              • #77
                I think some comments on the graphics libraries and drivers used on both systems are pertinent... And maybe it's worth illustrating a bit more how the graphics subsystems work on both platforms.

                First: Mesa vs. Binary Blobs

                Mesa is an Open Source implementation (a keyword for the other concepts exposed hereafter) of the OpenGL specification. That implicitly means that Mesa is NOT necessarily the same as OpenGL (just as Wine is not Windows, but an implementation of the WinAPI under Unix), and it provides some of the functionality OpenGL does. OpenGL is very bureaucratic and is governed by a series of board members (the ARB), who agree on a basic set of features and functionality the API and library have to provide. OpenGL is deployed on systems through the use of an ICD (Installable Client Driver), which usually provides the whole bundle of OpenGL functionality. By consensus, this functionality is provided by a system library (libGL.so under Unix systems and OpenGL32.dll under Windows) and a driver (the ICD) which makes use of that library. More often than not, the drivers also include their own optimized implementation of the API as part of the ICD (i.e., their own libGL.so or OpenGL32.dll [or similar]) in order to provide optimized OpenGL rendering.

                What is in fact happening whenever you compare the Open Drivers (Intel, ATi, Matrox, 3dfx, etc.) under Linux/FreeBSD/OpenSolaris against any of the binary blobs (nVidia/ATi) is that you are actually comparing apples to oranges: the blobs are indeed an ICD, while the Open Drivers use Mesa as their OpenGL implementation. nVidia, ATi/AMD, Intel, SGI, Sun, Apple, Microsoft (they abandoned the ARB before Vista's release [and the OpenGL 2.0 specification release] IIRC, but I believe they're back), etc. are actual members of the Architecture Review Board for OpenGL. What does this mean? Simply that the degree of performance you can expect from the implementation of any of these ARB members would be orders of magnitude above Mesa (which is not an ARB member [though I believe there are special considerations towards Mesa]). Then there is the degree of optimization that driver coders can implement in a driver using Mesa (remember that Mesa, even though it implements the whole OpenGL spec, may not be as optimized as an ARB member's ICD).
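
                (To make the ICD-vs-Mesa distinction concrete, here is a minimal sketch, added for illustration and assuming a system with GLUT installed: it asks whichever implementation currently sits behind libGL.so to identify itself. Mesa-backed drivers normally report "Mesa" in the GL_VERSION/GL_RENDERER strings, while the blobs report their own ICD. GLUT is used only because glGetString() needs a current GL context.)

                /* glwho.c -- report which OpenGL implementation backs libGL.so.
                 * Build: cc glwho.c -o glwho -lglut -lGL */
                #include <GL/glut.h>
                #include <stdio.h>

                int main(int argc, char **argv)
                {
                    glutInit(&argc, argv);
                    glutCreateWindow("glwho");  /* creates a GL context and makes it current */
                    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
                    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
                    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
                    /* Open drivers typically print something like "... Mesa 7.2";
                     * the nVidia/ATi blobs print their own ICD's strings instead. */
                    return 0;
                }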

                This is why comparing Mesa vs. a proper ICD usually results in Mesa being much slower, if not due to features or system optimization, then due to the large number of device architectures that make use of it. And this is another differentiator between Mesa and OpenGL ICDs: Mesa does not by itself require hardware acceleration (just as OpenGL doesn't either). For accelerating Mesa (and OpenGL in Linux) the DRI mechanism was devised. As its name implies, the Direct Rendering Infrastructure seeks to attain direct hardware access for accelerating OpenGL through Mesa (or a vendor's ICD), and this is where X is more involved in the acceleration process, since by itself X does much of its rendering in an indirect manner (which, by the way, is part of the core of the problems it now presents for more sophisticated rendering on the desktop). So there are two extra components that help accelerate rendering in hardware through direct access: the userland DRI library, and the kernel-side Direct Rendering Manager (DRM) kernel module, responsible for direct I/O to the hardware, parsing calls from the DRI library above (carrying Mesa GL commands along to the graphics hardware). This is an overly simplified explanation, and I'm sure that any DRI/DRM developers will have a heart attack reading my explanation of how this works (or how I think it works).
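
                (As a rough illustration of the kernel side of that stack -- an added sketch assuming libdrm and a /dev/dri/card0 node, not anything quoted from this thread -- the program below opens the device node the DRI library talks through and asks the DRM kernel module to identify itself.)

                /* drmwho.c -- name the kernel DRM driver behind /dev/dri/card0.
                 * Build: cc drmwho.c -o drmwho $(pkg-config --cflags --libs libdrm)
                 * Note: opening the node may require membership in the video group. */
                #include <xf86drm.h>
                #include <fcntl.h>
                #include <stdio.h>
                #include <unistd.h>

                int main(void)
                {
                    int fd = open("/dev/dri/card0", O_RDWR);
                    if (fd < 0) { perror("/dev/dri/card0"); return 1; }

                    drmVersionPtr v = drmGetVersion(fd);  /* DRM_IOCTL_VERSION under the hood */
                    if (v) {
                        printf("DRM driver: %s %d.%d.%d -- %s\n", v->name,
                               v->version_major, v->version_minor,
                               v->version_patchlevel, v->desc);
                        drmFreeVersion(v);
                    }
                    close(fd);
                    return 0;
                }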

                At any rate, a much "fairer" comparison would involve putting together a system that matches the Mac mini in hardware configuration and capabilities, but uses another ICD OpenGL backend instead (like nVidia with an nVidia 7050 IGP, or fglrx with an ATi HD3200/790G IGP); then the graphics comparison would be much more level (simply due to sheer OpenGL support).

                Second: The X Server vs. Quartz

                Again overly simplified: as noted, X11 was engineered to do indirect, over-the-network, through-sockets rendering, and it has excelled at that over the last 24 years. However, this robustness is its Achilles' heel for fast desktop rendering. In a nutshell, the client-server architecture of X (which makes it network transparent) has the server running on the local computer, and clients (other machines, programs) connect to it for rendering, using network packets. This is rather convoluted (though very convenient for a lot of tasks, such as mainframes and centralized rendering), and that's where DRI comes in: it is a system to bypass the X server and allow applications direct access to the graphics hardware instead. In this model the X server is still responsible for rendering the desktop, etc. From what I have been able to make out of Quartz (or rather Core Graphics), in its design the render path is much faster, as it indeed has direct access to the hardware through the Quartz Compositor (kind of like the X server) and Quartz Extreme (using OpenGL), and it has been very much optimized through the use of SIMD instructions (AltiVec, SSE) and hardware acceleration (OpenGL).
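
                (The client-server split is easy to see in code. In the added sketch below -- illustrative only -- every Xlib call becomes a request sent over a socket to the X server, and running the same binary with DISPLAY=otherhost:0 renders the window on another machine, which is exactly the indirection DRI exists to bypass for OpenGL.)

                /* xhello.c -- every call here is an X protocol request over a socket.
                 * Build: cc xhello.c -o xhello -lX11 */
                #include <X11/Xlib.h>
                #include <stdio.h>

                int main(void)
                {
                    Display *dpy = XOpenDisplay(NULL);   /* connect to the server socket */
                    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

                    int scr = DefaultScreen(dpy);
                    Window w = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0,
                                                   240, 80, 1, BlackPixel(dpy, scr),
                                                   WhitePixel(dpy, scr));
                    XSelectInput(dpy, w, ExposureMask | KeyPressMask);
                    XMapWindow(dpy, w);

                    XEvent ev;
                    for (;;) {
                        XNextEvent(dpy, &ev);            /* events come back the same way */
                        if (ev.type == Expose)
                            XDrawString(dpy, w, DefaultGC(dpy, scr), 20, 40,
                                        "hello via the X protocol", 24);
                        if (ev.type == KeyPress)
                            break;                       /* any key quits */
                    }
                    XCloseDisplay(dpy);
                    return 0;
                }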

                The design and architecture of both rendering systems, though not mutually exclusive, do let you see the difference in the applications they were designed for. Quartz was built from the ground up to be a desktop rendering system, while X allows for more distributed rendering, which seems to have been one of its original goals. It is not surprising that Apple decided to depart from X when they created OS X and built Quartz and its compositor from scratch (well, based on NeXTSTEP) to better suit desktop needs. They did implement X11, but instead of having X run its own server, it renders through Quartz as the server, with Apple providing the protocol and libraries for X client applications. I think that in the not so distant future that is going to become the natural evolution for X on Linux: from an inherently indirect rendering nature to a direct rendering one, keeping backwards compatibility through support libraries. This transition (IMHO) started with the addition of the Composite, Damage and other extensions, which will have a more central role in X... Still a LOT of work. What would be better: allowing X to transition naturally, or having another renderer like Wayland provide X11 compatibility through a mechanism similar to Mac OS X's Quartz? I don't know, and most likely both systems will coexist for some time before a decision is made on what direction Linux on the desktop takes.
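
                (Those extensions are discoverable at runtime. The added sketch below -- again just an illustration -- asks the running X server which of the compositing-related extensions it advertises, using the extensions' standard protocol names.)

                /* xext.c -- list which compositing-related extensions the server offers.
                 * Build: cc xext.c -o xext -lX11 */
                #include <X11/Xlib.h>
                #include <stdio.h>

                int main(void)
                {
                    Display *dpy = XOpenDisplay(NULL);
                    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

                    const char *exts[] = { "Composite", "DAMAGE", "XFIXES", "RENDER" };
                    int i, op, ev, err;
                    for (i = 0; i < 4; i++)
                        printf("%-10s %s\n", exts[i],
                               XQueryExtension(dpy, exts[i], &op, &ev, &err)
                                   ? "present" : "missing");
                    XCloseDisplay(dpy);
                    return 0;
                }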

                PS Sorry for the long post.
                Last edited by Thetargos; 14 November 2008, 10:18 AM.


                • #78
                  Originally posted by Thetargos View Post
                  PS Sorry for the long post.
                  No need to apologize; +10 on the post. You hit the key points right on the head.


                  • #79
                    Originally posted by deanjo View Post
                    That is the only area where I even mentioned OS X being better.
                    You didn't mention in that comment what area you were talking about, and that's why I replied (sic!).

                    Testing two non-Mesa drivers against each other on different OSes doesn't prove squat when it comes to finding out whether Mesa is the limiting factor or not.
                    So why did you ask:
                    Why do you think blobs bypass MUCH of X
                    ? If those drivers aren't using Mesa? Man, you amaze me, but it doesn't matter...


                    I asked for a non-Apple link, but thanks. In this case they're probably objective.

                    @Thetargos

                    Thanks for this exhaustive comment.


                    • #80
                      Originally posted by kraftman View Post
                      So why did you ask: ? If those drivers aren't using Mesa? Man, you amaze me, but it doesn't matter...
                      The blobs don't use Mesa, that's why.
