
The Cost Of Running Compiz


  • The Cost Of Running Compiz

    Phoronix: The Cost Of Running Compiz

    Earlier this week we published benchmarks comparing Arch Linux and Ubuntu. There were only a few areas where the two Linux distributions actually performed differently, since many of their core packages are similar, but one area where the results were vastly different was OpenGL performance: Ubuntu uses Compiz by default (when a supported GPU driver is detected), whereas Arch does not. This surprised many within our forums, so we carried out a number of tests with different hardware and drivers to show what the real performance cost is of running Compiz as a desktop compositing manager in different configurations.


  • #2
    I must say I disagree with the default Ubuntu/Compiz setting that keeps fullscreen windows indirectly rendered. It would be the correct setting in an ideal world where the driver stack is perfect and Compiz is much improved, but for now, as you can see from this article, it causes slowdowns. In my experience it is even worse than shown by all these benchmarks. In some situations, such as with the NVIDIA binary drivers, you can pretty much be sure that with Compiz enabled, anything running fullscreen or within the Compiz desktop will not be vsynced, or will be vsynced badly: frame-cadence issues in movies, for example, and tearing in Quake Live. These vsync issues happen even if you have vsync set in the NVIDIA control panel.

    The best solution for NVIDIA users right now, IMO, is to keep Compiz running, because it's pretty, but enable the "Unredirect Fullscreen Windows" option in the Compiz settings manager (under the General tab). The only side effect of this setting is a very fast flicker when switching from a fullscreen application (pretty much only a game or fullscreen movie) back to the desktop. It's really not that bad, and what you get from it is great: your videos will have perfect vsync and the correct frame cadence, and there will be no performance drop or other vsync issues with games.


    • #3
      Good test but...
      I don't understand why you don't turn off Compiz when you play games


      • #4
        Surprised to see ATI performing better than NVIDIA in this aspect. According to my earlier (and subjective) tests, running Compiz in your default X session while running your game in a separate X session will remove any performance penalty from Compiz, at the cost of more RAM being used, of course.


        • #5
          Originally posted by mum1989 View Post
          Good test but...
          I don't understand why you don't turn off Compiz when you play games
          Because you shouldn't need to.

          Enabling compositing in KDE 4.x apparently also causes a performance drop.

          With compositing off (using KDE) I get slightly better results:

          Heaven Benchmark v2.0
          FPS: 57.2
          Scores: 1441
          Min FPS: 22.7
          Max FPS: 146.9

          Compared to this with compositing on:

          Heaven Benchmark v2.0
          FPS: 41.7
          Scores: 1049
          Min FPS: 20.6
          Max FPS: 126.8
          But you shouldn't need to turn it off.
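For context, the drop in those Heaven numbers works out to roughly a quarter of the performance. A quick calculation (using the exact figures posted above):

```python
# Percentage performance drop from the Heaven benchmark results above
# (compositing off vs. compositing on, figures as posted).
fps_off, fps_on = 57.2, 41.7
score_off, score_on = 1441, 1049

fps_drop = (fps_off - fps_on) / fps_off * 100
score_drop = (score_off - score_on) / score_off * 100

print(f"FPS drop:   {fps_drop:.1f}%")    # about 27%
print(f"Score drop: {score_drop:.1f}%")  # about 27%
```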


          • #6
            I have always had the impression that turning compiz off didn't yield any speed boost for me with fglrx.


            • #7
              Running on nvidia without --loose-binding is nonsense anyway.


              • #8
                Originally posted by Hans View Post
                I have always had the impression that turning compiz off didn't yield any speed boost for me with fglrx.
                If your FPS dropped from 150 to 120, there's nothing amazing about it


                • #9
                  This article is a response to another article that shows Arch Linux having almost double the frame rate in a game compared to Ubuntu.

                  They brushed it off saying it was due to Compiz.

                  However, these results show quite the opposite. Worst-case scenario, you're running NVIDIA and get a 20% performance drop.
                  If you're using Intel, maybe a 15% performance drop.
                  Using ATI, you're either gaining from Compiz or seeing no change at all.

                  So where does this 40-50% performance drop come from in Ubuntu? I'm guessing you can't blame Ubuntu.

                  So can you run the Arch vs Ubuntu videogame test again with Compiz on/off?
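To make the gap in that post concrete, here is a rough sanity check. The Arch and Ubuntu FPS values below are hypothetical illustrations, not figures from either article; only the ~20% worst-case Compiz penalty comes from the results discussed above.

```python
# Sanity check: if Arch got "almost double" Ubuntu's frame rate,
# Ubuntu's drop is roughly 50%. The worst measured Compiz cost here
# is ~20%, so Compiz alone cannot account for the gap.
# The arch_fps / ubuntu_fps values are hypothetical.
arch_fps = 100.0
ubuntu_fps = 52.0          # "almost double" -> Ubuntu at roughly half

observed_drop = (arch_fps - ubuntu_fps) / arch_fps * 100
compiz_worst_case = 20.0   # worst-case Compiz penalty from this article

print(f"Observed drop: {observed_drop:.0f}%")
print(f"Unexplained by Compiz: {observed_drop - compiz_worst_case:.0f}%")
```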


                  • #10
                    What good does it do to push 400 FPS when your monitor only refreshes itself at 60 Hz?

                    Even the low FPS (in the 30s to 60s) reported by the open-source drivers should be sufficient.

                    The article is useless because the benchmarks ignore the monitor issue -- in practice, you'll never see the difference between 100 and 400 FPS. Your screen simply does not refresh itself that many times per second. Most screens run at 50 to 70 Hz at their maximum resolution.
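The refresh-rate point can be sketched with a quick example. The render rates below are illustrative values, not benchmark results:

```python
# A 60 Hz monitor displays at most 60 distinct frames per second,
# no matter how fast the GPU renders. Example render rates are hypothetical.
REFRESH_HZ = 60

def frames_actually_seen(render_fps, refresh_hz=REFRESH_HZ):
    """Frames per second the monitor can actually display."""
    return min(render_fps, refresh_hz)

for fps in (30, 100, 400):
    print(f"{fps:3d} FPS rendered -> {frames_actually_seen(fps)} FPS seen")
# 30 FPS is shown in full; 100 and 400 FPS both cap at 60.
```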