Why Software Defaults Are Important & Benchmarked


  • #11
    Something I've always wondered: if all these tweaks give extra performance and should be benchmarked instead of the defaults, then why are those tweaks not the defaults? Default values are (normally) set for a good reason, and while I do think that tweaks, multiple hardware configurations, etc, are required for a comprehensive analysis of a particular item, benchmarking alone requires some baseline, and that will be your default system.
    When various tweaks prove to be suitable and stable, I'm sure they'll turn into default values.


    • #12
      Hahah, I liked this article.


      • #13
        Clearly most people are content with the defaults.
        You're forgetting the possibility that they simply don't know about the alternatives.

        People who follow your benchmarking already know about optimizations; most of your readers are quite experienced with it. But the more casual user won't know any of these things.

        The last time I watched someone pick out a computer, he just looked at the keyboard. He didn't even look at the memory, the CPU, or the GPU. Can you believe that? When I told him those things were important, he said, "what's the difference?".

        Please don't lump everyone into a "they chose this, so they must want it" mindset. That's a big mistake.


        • #14
          Originally posted by mirv View Post
          Something I've always wondered: if all these tweaks give extra performance and should be benchmarked instead of the defaults, then why are those tweaks not the defaults? Default values are (normally) set for a good reason, and while I do think that tweaks, multiple hardware configurations, etc, are required for a comprehensive analysis of a particular item, benchmarking alone requires some baseline, and that will be your default system.
          When various tweaks prove to be suitable and stable, I'm sure they'll turn into default values.
          I haven't checked the blobs recently, but the free drivers sync the rendering to the refresh interval of the display by default.

          Which means that any and every benchmark will always give a maximum of 60fps on a modern LCD display, using default settings. That is a sane default for most things -- and it removes tearing when playing video, for example -- but makes for an utterly pointless benchmark -- and makes the results less reliable, since most frame rates are cut to 20, 30 or 60 fps.

          Which is why most benchmarks disable vsync in order to test the drivers' and the cards' ability to render complex scenes at their limit. And this is already not the default configuration, and introduces tearing. I don't see a fundamental difference between disabling vsync (causes tearing, improves performance) and disabling SwapBuffersWait (causes tearing, improves performance).
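For the record, both knobs are plain configuration switches. Here is a sketch of how SwapbuffersWait can be turned off for the open-source radeon driver, assuming an xorg.conf-based setup (the identifier is made up; adapt the section for your card):

```
# Hypothetical xorg.conf fragment: stop the radeon driver waiting on
# the vertical refresh when swapping buffers (uncaps FPS, adds tearing).
Section "Device"
    Identifier "Card0"
    Driver     "radeon"
    Option     "SwapbuffersWait" "off"
EndSection
```

Mesa's vsync itself can usually be toggled per run with the `vblank_mode=0` environment variable, which is how most benchmarks get past the refresh-rate cap.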


          • #15
            Originally posted by mtippett View Post
            Well, xorg.conf is supposed to be gone already (interesting when you think about the topic of this post).
            That works great in theory, but in many situations an xorg.conf is still required.
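A minimal sketch of the kind of file that is still needed in practice; the identifier and driver name here are made up for illustration:

```
# Hypothetical minimal xorg.conf: force a specific driver when
# autodetection gets it wrong.
Section "Device"
    Identifier "Card0"
    Driver     "nouveau"
EndSection
```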


            • #16
              Originally posted by pingufunkybeat View Post
              I haven't checked the blobs recently, but the free drivers sync the rendering to the refresh interval of the display by default.

              Which means that any and every benchmark will always give a maximum of 60fps on a modern LCD display, using default settings. That is a sane default for most things -- and it removes tearing when playing video, for example -- but makes for an utterly pointless benchmark -- and makes the results less reliable, since most frame rates are cut to 20, 30 or 60 fps.

              Which is why most benchmarks disable vsync in order to test the drivers' and the cards' ability to render complex scenes at their limit. And this is already not the default configuration, and introduces tearing. I don't see a fundamental difference between disabling vsync (causes tearing, improves performance) and disabling SwapBuffersWait (causes tearing, improves performance).
              Ah, I hadn't considered that one, thanks. Yes, things like that should probably be mentioned. I had a quick read of the latest Ubuntu article (normally I don't read them; what Ubuntu does is of little interest to me) and there was a mention of disabling SwapBuffersWait to improve performance. Perhaps a few more details like that in the articles?
              I still think testing defaults is OK, though; if nothing else, it provides a baseline. But mentioning what can skew the results (and why it is or isn't done) is definitely useful.
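The "cut to 20, 30 or 60 fps" effect described above is vsync quantization: with vsync on, a finished frame still waits for the next refresh boundary, so the effective rate is the refresh rate divided by a whole number. A small sketch, assuming a 60 Hz display (`vsync_fps` is an illustrative helper, not a real driver API):

```python
import math

def vsync_fps(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Effective frame rate with vsync enabled: each frame is held until
    the next refresh boundary, so it occupies a whole number of
    refresh intervals."""
    interval_ms = 1000.0 / refresh_hz                    # ~16.67 ms at 60 Hz
    intervals = max(1, math.ceil(render_ms / interval_ms))
    return refresh_hz / intervals

# A frame rendered in 20 ms misses one refresh and is shown every two
# intervals: the GPU could manage 50 fps, but vsync caps it at 30.
print(vsync_fps(20.0))   # 30.0
print(vsync_fps(10.0))   # 60.0
print(vsync_fps(40.0))   # 20.0
```

This is why vsynced results cluster at 60, 30, 20, 15 fps and so on rather than varying smoothly with GPU load.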


              • #17
                Originally posted by pingufunkybeat View Post
                I haven't checked the blobs recently, but the free drivers sync the rendering to the refresh interval of the display by default.

                Which means that any and every benchmark will always give a maximum of 60fps on a modern LCD display, using default settings. That is a sane default for most things -- and it removes tearing when playing video, for example -- but makes for an utterly pointless benchmark -- and makes the results less reliable, since most frame rates are cut to 20, 30 or 60 fps.

                Which is why most benchmarks disable vsync in order to test the drivers' and the cards' ability to render complex scenes at their limit. And this is already not the default configuration, and introduces tearing. I don't see a fundamental difference between disabling vsync (causes tearing, improves performance) and disabling SwapBuffersWait (causes tearing, improves performance).
                I don't know if Michael has a 120 Hz display or not, but he had a few benchmarks in this article where the free drivers got over 60 fps. One even had 90 fps.


                • #18
                  Originally posted by pvtcupcakes View Post
                  I don't know if Michael has a 120hz display or not, but he had a few benchmarks in this article where the free drivers got over 60fps. One even had 90fps.
                  If vsync is off, then this is already not a default configuration.

                  As for default config as a baseline, I'm all for it. I have nothing against testing the default. But I personally think that Ubuntu has made some bad decisions -- like using Compiz, which cannot suspend compositing on the fly AFAIK -- which harms performance on free drivers disproportionately.

                  KDE users do not have this issue.

                  It's unfair if poor defaults (in this case, unrelated to the driver) make the driver look bad.


                  • #19
                    What is wrong with benchmarks

                    A lot is wrong with this kind of article, not merely with the way the testing is done. I posted about this in considerable depth back in December.

                    One of my main gripes is that even though subject X might gain more from performance tuning than subject Z, subject Z might be favored by the test environment, and performance/benchmark articles are all too often used as a decision-maker.

                    Example:
                    • Candidates: XFCE and Gnome
                    • Test Environment: Old Computer
                    • (Add performance test results in here)
                    • Conclusion: XFCE is faster, thus better than Gnome.


                    The PROBLEM is that people who use performance numbers to make decisions don't always realize (or choose to ignore) the features and any other benefits/advantages of the other subjects in the test.

                    I explained my gripe with this kind of article in much more detail in the post. In particular I ask numerous questions about the test environment used in the article about the performance of various file systems, and point out specific flaws in the testing and reporting.


                    • #20
                      You have a tough job.

                      Maybe you just need to be clear about why you are doing the benchmarking; that would then influence the options. This site does lots of different sorts of benchmarking, for different reasons.

                      If you are benchmarking to see which distro/OS is fastest, then defaults might be best.
                      If you want to compare two drivers for the same hardware, then you need to make sure the drivers are doing the same task.
                      If you want to compare several GCC releases, then a range of options is good (e.g. does -flto make a difference?).
                      Comparing two different builds, I'd say find the fastest options that still produce a functioning program.

                      For graphics drivers, any FPS beyond the screen refresh rate is meaningless and useless for a user; it is already fast enough. Maybe you need to find more demanding settings for the benchmark.
