Trying Out The Ubuntu "-O3" Optimized Build For Greater Performance


  • #41
    Originally posted by sophisticles View Post

    Or it would serve as a gauge of how many poorly educated people there are.

    As an example, look at all the likes you received for your comment; that tells me there are dozens of people here who do not write code, or, if they do, they do not do so correctly.
    Some things are constant over time; programmer hubris is one of them.

    Expecting every programmer to catch nearly every optimization opportunity is simply utopian. It's never going to happen, and it's a massive waste of time. All that time would be much better spent doing these things automatically.

    The same goes for people claiming much the same about C and safety. The track record shows that repeating the "everyone just has to become a better programmer" mantra does not work out.



    • #42
      Originally posted by Avamander View Post

      Some things are constant over time; programmer hubris is one of them.

      Expecting every programmer to catch nearly every optimization opportunity is simply utopian. It's never going to happen, and it's a massive waste of time. All that time would be much better spent doing these things automatically.

      The same goes for people claiming much the same about C and safety. The track record shows that repeating the "everyone just has to become a better programmer" mantra does not work out.
      Ah, the ole "just git good" mantra that ignores that 14yo Tommy doesn't have 50,000 hours of experience in FPS shooters when compared to 25yo Steve.

      That's why we have better languages now. It can take 10-20 years to build up the kind of experience and skills needed to write optimized code, so a language and compiler that can add 10 years of free experience to any developer are worth considering. It also helps to use an IDE that knows where to put the yellow paint.

      Even DARPA is working on a way to convert C to Rust due to C safety concerns.



      • #43
        Originally posted by MarkG View Post

        I completely agree with this comment (and completely disagree with @tabicat's comment). Back in the day (almost 20 years ago), the human (at another division of the company) in charge of daily builds and regression tests used to change my -O2 to -O3. That introduced regressions (all of them proven to be compiler bugs), caused performance slowdowns, and bloated the code. It took a contentious meeting between my upper management and the other division's upper management to get that to stop.
        Consider that 20 years have passed and things are quite different.



        • #44
          Originally posted by darkonix View Post

          Consider that 20 years have passed and things are quite different.
          Actually, -O3 still does not beat my -O2 code. I recently proved this (to myself) on both gcc and llvm, though I never did a deep dive on why my code ran fastest on Rocky/Alma (and yeah, I wrote my own benchies).

          Perhaps optimizers have gotten better on average code, but not on code written back when it still mattered.
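
          As a rough illustration of that kind of comparison (a minimal sketch only, not the poster's actual benchmarks; the file name and workload are made up), one can build the same C file once with -O2 and once with -O3 and time the two binaries:

          /*
           * bench.c -- hypothetical micro-benchmark sketch. Build and run e.g.:
           *   gcc -O2 bench.c -o bench_o2
           *   gcc -O3 bench.c -o bench_o3
           *   ./bench_o2 && ./bench_o3
           */
          #include <stdio.h>
          #include <stdlib.h>
          #include <time.h>

          int main(void) {
              enum { N = 1 << 24 };
              double *a = malloc(N * sizeof *a);
              if (!a)
                  return 1;
              for (int i = 0; i < N; i++)
                  a[i] = (double)i * 0.5;

              struct timespec t0, t1;
              clock_gettime(CLOCK_MONOTONIC, &t0);
              double sum = 0.0;
              for (int i = 0; i < N; i++)
                  sum += a[i] * a[i];   /* the kind of loop -O3 is more likely to vectorize */
              clock_gettime(CLOCK_MONOTONIC, &t1);

              double ms = (t1.tv_sec - t0.tv_sec) * 1e3 +
                          (t1.tv_nsec - t0.tv_nsec) / 1e6;
              printf("sum=%g elapsed=%.3f ms\n", sum, ms);
              free(a);
              return 0;
          }

          Whether -O3 actually wins on a loop like this depends heavily on the CPU, the compiler version, and whether -O2 already auto-vectorizes it, which is exactly why measuring your own workload matters.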



          • #45
            While Ubuntu are experimenting with compiler options, I'd be much more interested in the impact of -Os. It's a shame that binary sizes weren't also compared in this article. Optimising for size in my experience often has surprising performance benefits, especially on mobile (but in fairness, my relevant experience is probably out of date at this point).
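
            For anyone who wants to try that comparison locally, a minimal sketch (hypothetical file names; gcc and GNU binutils assumed) is to build one small program at -O2, -O3 and -Os and compare both the .text size and the run time:

            /*
             * demo.c -- hypothetical size-vs-speed check. For example:
             *   gcc -O2 demo.c -o demo_o2
             *   gcc -O3 demo.c -o demo_o3
             *   gcc -Os demo.c -o demo_os
             *   size demo_o2 demo_o3 demo_os    # compare .text sections
             *   time ./demo_o2; time ./demo_o3; time ./demo_os
             */
            #include <stdio.h>

            /* A small inlinable helper: -Os restrains inlining and unrolling,
             * which tends to shrink .text, while -O3 does the opposite. */
            static unsigned mix(unsigned x) { return x * 2654435761u ^ (x >> 13); }

            int main(void) {
                unsigned h = 0;
                for (unsigned i = 0; i < 200000000u; i++)
                    h ^= mix(i);
                printf("%u\n", h);   /* print the result so the loop is not optimized away */
                return 0;
            }

            On cache-constrained machines the smaller -Os binary can come out ahead, which lines up with the "surprising performance benefits" point above, but it is very workload-dependent.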



            • #46
              Originally posted by imipak View Post

              I'm not sure I'd entirely agree. Code, properly written, should be clean, easy to read, easy to maintain, easy to validate, and as generalised as the context allows.

              It's actually a very bad programming technique to hand-optimise high-level code, and extremely bad if you try to code in a way that forces use of specific opcodes.

              Having said that, you're absolutely right that there should not be large improvements. Improvements should be moderate, but no greater.
              I would have assumed that if there were a large benefit to -O3, it would already be the default. That is already the case with many packages, where all or part of the code is built with -O3. So Ubuntu are essentially only changing the optimization level for packages whose developers have not already raised it. Hence it would only help the lazy programmers who never considered making -O3 the default.



              • #47
                Originally posted by tabicat View Post

                Canonical is the package maintainer. And the upstream developer rarely cares.
                This is one of the reasons Flathub exists. Third-party package maintainers shouldn't be mucking around and introducing potential regressions into software that users expect to be an upstream-developer-approved build. These compiler flags should only be touched by the upstream developers, IMO.

