Squeezing More Juice Out Of Gentoo With Graphite, LTO Optimizations

  • #41
    Originally posted by In_Between_Names View Post
    It is indeed true that LTO will increase compile times relative to a non-LTO build. However, it's really not as bad as you might think! For binary distros, the compilation times would be transparent anyway, since the users almost never see them. For source-based distros like Gentoo, most users have a setup that lets them compile in the background seamlessly while they work. Usually you compile once and run many times, so even a small speedup adds up over time. For the record, I'm on bleeding-edge Gentoo too, so I probably compile a lot more than most people.
    I suppose it depends on the use case. Personally I don't use Gentoo on desktops to begin with; I use it on servers (that need reliability) and low-power systems (that benefit from the optimisations a lot but I definitely spend more time compiling than using the packages).



    • #42
      Originally posted by duby229 View Post

      nitpick, it's the same difference.
      It's actually not the same at all. O3 does not *cause* undefined behaviour. O3 assumes your code *has* no undefined behaviour, even more so than O2, and performs transformations accordingly. That can be enough to make invalid code fail (a small example follows below). Look at how many security bugs have happened over the years because of this: buffer overflows, return-to-libc, you name it. We should not be encouraging people to develop with O2 as an excuse for this behaviour. We should be holding C and C++ code to a higher standard: one that contains no UB to begin with.

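      To illustrate (a minimal, hypothetical sketch, not tied to any particular project): the "overflow check" below relies on signed overflow, which is undefined behaviour, so an optimizer that assumes UB never happens is entitled to fold the condition to false and drop the branch entirely. Whether a given GCC or Clang release actually does that at -O2 or only at -O3 varies, but the code is broken either way.

          #include <limits.h>
          #include <stdio.h>

          /* Relies on signed overflow (UB); an optimizer may fold this to 0. */
          static int add_would_overflow(int x)
          {
              return x + 1 < x;                /* UB when x == INT_MAX */
          }

          /* A well-defined version of the same check. */
          static int add_would_overflow_ok(int x)
          {
              return x == INT_MAX;
          }

          int main(void)
          {
              /* Unoptimized builds often print "1 1"; optimized builds may print "0 1". */
              printf("%d %d\n", add_would_overflow(INT_MAX), add_would_overflow_ok(INT_MAX));
              return 0;
          }

      The fix is not to turn the optimizer down; it is to write the check so that it never overflows in the first place.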


      • #43
        Originally posted by GreatEmerald View Post

        I suppose it depends on the use case. Personally I don't use Gentoo on desktops to begin with; I use it on servers (that need reliability) and low-power systems (that benefit from the optimisations a lot but I definitely spend more time compiling than using the packages).
        Of course, if you need reliability I strongly advise against using this configuration--but you already knew that :P

        On low-power systems, believe it or not, I've been running LTO'd LEDE builds on my router for quite some time now. I haven't had any problems, but of course I do like pushing the limits, and YMMV. I use -Os and -ffast-math on there with the appropriate -mcpu and -mtune, though I have toyed with the idea of using -Ofast as well (a sketch of the kind of assumption -ffast-math makes follows below). Setting up LTO on LEDE was much easier than it was for a full desktop Gentoo system--only two exceptions were needed.

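        As a side note on -ffast-math (a generic, hypothetical sketch, nothing LEDE-specific): it implies -ffinite-math-only, which lets the compiler assume NaNs and infinities never occur, so guards written with isnan()/isinf() may simply be optimized away.

            #include <math.h>
            #include <stdio.h>

            /* A guard that -ffast-math may silently remove, because the compiler
             * is allowed to assume the NaN/inf case can never happen. */
            static double safe_mean(double sum, double count)
            {
                double m = sum / count;          /* count == 0 yields inf or NaN */
                if (isnan(m) || isinf(m))        /* may be folded to "false" */
                    return 0.0;
                return m;
            }

            int main(void)
            {
                /* With plain -O2 this prints 0; with -O2 -ffast-math it may print
                 * inf instead, depending on the compiler and version. */
                printf("%g\n", safe_mean(1.0, 0.0));
                return 0;
            }

        That doesn't make -ffast-math wrong for a router image; it just means the code being built has to be able to live with those assumptions.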


        • #44
          Originally posted by In_Between_Names View Post

          It's actually not the same at all. O3 does not *cause* undefined behaviour. O3 assumes your code *has* no undefined behaviour, even more so than O2, and performs transformations accordingly. That can be enough to make invalid code fail. Look at how many security bugs have happened over the years because of this: buffer overflows, return-to-libc, you name it. We should not be encouraging people to develop with O2 as an excuse for this behaviour. We should be holding C and C++ code to a higher standard: one that contains no UB to begin with.
          And to take things to an extreme: I know many Windows applications are built with optimization disabled (essentially -O0). Usually the MSVC compiler is blamed for breaking the code when optimizing, and while the MSVC compiler is bad, that is not the main reason so much code breaks; it breaks because it is wrong. It just goes to show how even something as simple as -O0 vs -O2 can break things when people get too used to developing at -O0 and then think the compiler is broken because -O2 doesn't work (the sketch below shows a typical case).

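          A typical (hypothetical) example of code that only appears to work at -O0: a flag polled in a loop without volatile or atomics. An unoptimized build re-reads the variable every iteration, so it seems fine; at -O2 the compiler may keep it in a register and the loop never sees the update.

              #include <signal.h>
              #include <stdio.h>

              /* BUG: this should be "volatile sig_atomic_t got_signal".  Without the
               * qualifier the optimizer may load the flag once and spin forever. */
              static int got_signal = 0;

              static void on_sigint(int sig)
              {
                  (void)sig;
                  got_signal = 1;
              }

              int main(void)
              {
                  signal(SIGINT, on_sigint);
                  while (!got_signal)        /* -O0 re-reads memory; -O2 may not */
                      ;                      /* busy-wait until Ctrl+C */
                  puts("caught SIGINT");
                  return 0;
              }

          The compiler didn't break anything here; the program was never correct to begin with.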


          • #45
            Originally posted by In_Between_Names View Post

            It's actually not the same at all. O3 does not *cause* undefined behaviour. O3 assumes your code *has* no undefined behaviour, even more so than O2, and performs transformations accordingly. That can be enough to make invalid code fail. Look at how many security bugs have happened over the years because of this: buffer overflows, return-to-libc, you name it. We should not be encouraging people to develop with O2 as an excuse for this behaviour. We should be holding C and C++ code to a higher standard: one that contains no UB to begin with.
            It is a nitpick, because no C or C++ is bug free. You can write the simplest code and compile it with different compilers and get different binaries. You have to assume that all C or C++ code exhibits undefined behavior. Most code does.



            • #46
              Originally posted by duby229 View Post

              It is a nitpick, because no C or C++ is bug free. You can write the simplest code and compile it with different compilers and get different binaries. You have to assume that all C or C++ code exhibits undefined behavior. Most code does.
              If it exhibits UB, some day it *will* break. And I'm always going to argue for making things break earlier rather than later.



              • #47
                Originally posted by GrayShade View Post

                If it exhibits UB, some day it *will* break. And I'm always going to argue for making things break earlier rather than later.
                That's fine, so you make it break now, and then you correct it for that version of that compiler. That's the best guarantee you can give. But you can also guarantee it will -still- exhibit undefined behavior in some other circumstance.



                • #48
                  Originally posted by duby229 View Post

                  It is a nitpick, because no C or C++ is bug free. You can write the simplest code and compile it with different compilers and get different binaries. You have to assume that all C or C++ code exhibits undefined behavior. Most code does.
                  Citation needed re: "no C or C++ is bug free". It is entirely possible to write correct C and C++ code. The standards documents that govern those languages document in *excruciating* detail where the language is defined and undefined. Clang and GCC even have a nice suite of UB sanitizers now that you can test your code with (a quick example follows below). They are growing more complete with each release. It may be difficult to write correct code, but it is far from impossible.

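                  For example (a hypothetical snippet with an assumed file name, ubsan_demo.c): building with -fsanitize=undefined on either GCC or Clang turns the signed overflow below into a runtime diagnostic instead of letting it silently wrap or get optimized into something surprising.

                      /* Build and run with, e.g.:
                       *   gcc   -O2 -g -fsanitize=undefined ubsan_demo.c -o ubsan_demo && ./ubsan_demo
                       *   clang -O2 -g -fsanitize=undefined ubsan_demo.c -o ubsan_demo && ./ubsan_demo */
                      #include <limits.h>
                      #include <stdio.h>

                      int main(int argc, char **argv)
                      {
                          (void)argv;
                          int x = INT_MAX;
                          int y = x + argc;   /* argc >= 1, so this signed addition overflows: UB */
                          printf("%d\n", y);  /* UBSan reports the overflow at runtime */
                          return 0;
                      }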


                  • #49
                    Originally posted by In_Between_Names View Post

                    Citation needed re: "no C or C++ is bug free". It is entirely possible to write correct C and C++ code. The standards documents that govern those languages document in *excruciating* detail where the language is defined and undefined. Clang and GCC even have a nice suite of UB sanitizers now that you can test your code with. They are growing more complete with each release. It may be difficult to write correct code, but it is far from impossible.
                    Of course. I'm not saying that undefined behavior shouldn't be fixed. I'm just saying that no matter what, it's still going to exhibit undefined behavior in some scenario or another. 100% of the time, on 100% of C or C++ code. It's just not possible to completely and totally avoid.



                    • #50
                      Originally posted by duby229 View Post

                      Of course. I'm not saying that undefined behavior shouldn't be fixed. I'm just saying that no matter what, it's still going to exhibit undefined behavior in some scenario or another. 100% of the time, on 100% of C or C++ code. It's just not possible to completely and totally avoid.
                      Are you talking about the difficulty of avoiding plain bugs such as out-of-range array accesses, or is there some area of the language definition that you find problematic to follow?

