Autodafe 0.2 Released For Freeing Your Project From Autotools


  • #11
    Originally posted by mathletic View Post
    Even LibreOffice has not understood that it should stop using Autotools, see https://bugs.documentfoundation.org/...i?id=113121#c4
    That link suggests that it is a question of funding rather than a lack of understanding. If anyone, most likely a vendor, wants to sponsor that work, I bet it can be done before the next release.

    • #12
      Another grandiose project from ESR, the greatest hacker alive. Looking it over, I highly doubt that it can keep any of its promises on anything but the simplest test cases.

      • #13
        This is a good idea. Usually, when I am brought in to maintain or bring up some old, crusty project on modern platforms, autotools is present somewhere in the muddy puddle.

        My current approach is just to build it on a known-working (usually ancient) platform, copy the compile log and the generated config.h, and start a new build setup from scratch, partially generated by a collection of homegrown scripts that scrape the compile log. I usually migrate to Makefiles for weird stuff (e.g. vendor compilers, lots of domain-specific scripts, stupid amounts of dependencies, etc.) but try to use homogeneous CMake when possible, since I am happy to help push it as the industry standard, even if it does have limitations. It's better than nothing.
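
        The end result is usually something like the minimal hand-rolled Makefile below (GNU make; recipe lines start with a tab). The target name, flags, and file list are made up for illustration; the real values are whatever the scraped compile log and the copied config.h dictate.

            # Flags recovered from the old build's compile log on the
            # known-working platform; config.h is the one copied over.
            CC      ?= cc
            CFLAGS  ?= -O2 -DHAVE_UNISTD_H=1
            LDLIBS  ?= -lz

            OBJS = main.o util.o

            frobnicate: $(OBJS)
            	$(CC) $(LDFLAGS) -o $@ $(OBJS) $(LDLIBS)

            %.o: %.c config.h
            	$(CC) $(CFLAGS) -c -o $@ $<

            clean:
            	rm -f frobnicate $(OBJS)

            .PHONY: clean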

        autodafe currently looks to work on only a fairly small subset of autotools (and if a project was that simple in the first place, its dependence on autotools looks pretty dumb anyway). However, I am interested to see how this grows.
        Last edited by kpedersen; 18 April 2024, 09:19 AM.

        • #14
          Originally posted by mathletic View Post
          On the other hand, ESR proposing to switch to plain Makefiles is ill-advised: no easy detection of the build environment or of available dependencies, no altering of configuration flags, no abstraction of C++-specific fiddling, and no high-level abstraction.
          It depends on the use case. It could be an alternative for package developers, where detection and portability are less of an issue and where things like the XZ situation can be made harder to pull off. Also, as suggested above, it could be useful for older source trees, and additionally as a stepping stone to meson or cmake, since a makefile is generally easier to read than autotools.
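
          To sketch what I mean in GNU make (the feature and library names are made up): everything ./configure would detect or accept as a --with flag becomes an explicit, reviewable variable that the packager sets, so there is no generated script for something like the XZ backdoor to hide in.

              PREFIX    ?= /usr/local
              # Pass WITH_LZMA=0 on the make command line to drop liblzma.
              WITH_LZMA ?= 1

              CFLAGS ?= -O2
              ifeq ($(WITH_LZMA),1)
              CFLAGS += -DWITH_LZMA=1
              LDLIBS += -llzma
              endif

              tool: tool.o
              	$(CC) $(LDFLAGS) -o $@ $^ $(LDLIBS)

              install: tool
              	install -D -m 755 tool $(DESTDIR)$(PREFIX)/bin/tool

              .PHONY: install

          The whole configuration surface is the handful of variables at the top, which is exactly what makes it auditable next to generated configure goo.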

          • #15
            Originally posted by jeisom View Post
            Also, as suggested above, it could be useful for older source trees, and additionally as a stepping stone to meson or cmake, since a makefile is generally easier to read than autotools.
            I think the opposite is true. Autoconf and Automake provide abstractions that are lost in a transition to plain Makefiles. A direct (manual) migration from Autotools will result in better CMake code.

            • #16
              Originally posted by spicfoo View Post
              That links suggests that it is a question of funding rather than lack of understanding.
              It does not:
              + understand that we don’t want to drop something that works already
              [..]
              + what problem does this solve?
              [..]
              + sitting on the fence (Thorsten)

              One opinion is about potential external developers that might be paid:
              + would not be great to pay some external developers to do the migration and then let us maintain it

              Originally posted by spicfoo View Post
              If anyone or likely a vendor wants to sponsor that work, I bet it can be done before the next release.
              I doubt that. The inventor of Meson provided a prototype. That should be picked up! Instead it is left to bit-rot. Migrating the build system of such a huge code base will take at least a year, and it will take at least two more years to finally remove the old one.

              • #17
                Originally posted by mathletic View Post

                I doubt that. The inventor of Meson provided a prototype. That should be picked up! Instead it is left to bit-rot. Migrating the build system of such a huge code base will take at least a year, and it will take at least two more years to finally remove the old one.
                You don't need to remove the old code. That prototype can be turned fully functional within a release interval, as has been done for many other large projects; there is a long public history of such migrations. The funding question, if addressed, will readily answer all of that, so the exact timeline isn't really a problem. If it takes two releases, so be it.

                • #18
                  Originally posted by mathletic View Post
                  No easy detection of the build environment or of available dependencies, no altering of configuration flags, no abstraction of C++-specific fiddling, and no high-level abstraction.
                  There should not be any detection of the build environment, which every package would do in a subtly different way. It is the job of the package manager to provide a build environment with all the dependencies and flags. If something required is missing, the build should immediately FAIL, not try to download missing dependencies, and not try workarounds such as linking against some other libraries.

                  The reason reproducible builds and cross-compilation are so painful is that these packaging concerns get pushed down into individual builds. It's not every developer's job to figure out what's present or absent on the target system. The autotools approach is from the stone age of dependency management, and we have better tools now.
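
                  As a fail-fast sketch in GNU make ("libfoo" is a placeholder dependency): the environment either provides the library via pkg-config or the build dies on the spot, with no fallback probing.

                      PKG_CONFIG ?= pkg-config

                      # Resolve the dependency once; abort immediately if absent.
                      FOO_CFLAGS := $(shell $(PKG_CONFIG) --cflags libfoo || echo MISSING)
                      ifeq ($(findstring MISSING,$(FOO_CFLAGS)),MISSING)
                      $(error libfoo not found; install it or set PKG_CONFIG_PATH)
                      endif
                      FOO_LIBS := $(shell $(PKG_CONFIG) --libs libfoo)

                      app: app.o
                      	$(CC) $(LDFLAGS) -o $@ $^ $(FOO_LIBS)

                      app.o: app.c
                      	$(CC) $(CFLAGS) $(FOO_CFLAGS) -c -o $@ $<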

                  • #19
                    Originally posted by ziguana View Post

                    There should not be any detection of the build environment, which every package would do in a subtly different way. It is the job of the package manager to provide a build environment with all the dependencies and flags.
                    You have never worked with C and C++, have you? Nobody is talking about package managers and automatic downloads. You have to work with C libraries that share some common parts while other parts differ. Linux uses glibc, macOS and the BSDs use something different, and Windows has its own ways. C++ has three major compilers (GCC, Clang, MS Visual C++), and you have to work with multiple different versions of each. Compiling Boost is non-trivial. On supercomputers you have to use the MPI implementation provided by the supercomputer vendor.
                    Now go play with JavaScript, Python, or whatever toy you prefer.

                    • #20
                      Originally posted by mathletic View Post
                      Now go play with JavaScript, Python, or whatever toy you prefer.
                      Calm down, grandpa, you're going to have another heart attack. It is because I work with boomer trash such as C++ and Autotools that I want something better. I don't know what the rest of your word salad has to do with my claim. If you have to support every combination of [compiler x libc x vendor lib], then the proper layer to deal with that complexity is above the build system. The build should generate binaries after being handed a configuration by the package manager or the meta build system. The build itself should not probe what's on the host, because that's fucking ass-backwards, and you may not even be targeting the host to begin with.
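
                      For illustration, a GNU make sketch assuming the kernel-style CROSS_COMPILE convention: the caller hands the build its toolchain, and the Makefile never looks at the machine it happens to run on.

                          # The package manager / meta build system supplies the
                          # toolchain prefix; nothing here inspects the host.
                          CROSS_COMPILE ?=
                          CC     := $(CROSS_COMPILE)gcc
                          STRIP  := $(CROSS_COMPILE)strip
                          CFLAGS ?= -O2

                          hello: hello.o
                          	$(CC) $(LDFLAGS) -o $@ $^
                          	$(STRIP) $@

                          hello.o: hello.c
                          	$(CC) $(CFLAGS) -c -o $@ $<

                      Run it as "make CROSS_COMPILE=aarch64-linux-gnu-" and the same file produces an AArch64 binary on any host, with zero detection.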
