Developers Explore Meson Build System For Wayland / Weston

  • emblemparade
    replied
    For those still interested in this thread -- I started a new project, Rōnin, which is yet another Python frontend to Ninja. In my eyes it fixes the problems I have with all the others.
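    The core idea behind any Python frontend to Ninja is simple: the Python layer only generates a build.ninja file, and Ninja itself does the fast dependency-checked execution. A minimal sketch of that idea (this is not Rōnin's actual API; the function and file names here are invented for illustration):

```python
# Toy generator: emit a build.ninja that compiles C sources and links them.
# Ninja then executes it with "ninja". Illustrative only, not Ronin's API.

def write_ninja(path, cc="gcc", sources=("main.c", "util.c")):
    lines = [
        f"cc = {cc}",
        "rule compile",
        "  command = $cc -c $in -o $out",
        "rule link",
        "  command = $cc $in -o $out",
    ]
    objects = []
    for src in sources:
        obj = src.replace(".c", ".o")
        objects.append(obj)
        lines.append(f"build {obj}: compile {src}")
    lines.append(f"build app: link {' '.join(objects)}")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

write_ninja("build.ninja")
```

    Everything incremental (mtime checks, parallelism, restat) is Ninja's job; the frontend's only responsibility is producing correct rules and edges.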



  • rleigh
    replied
    Originally posted by s_j_newbury:
    I certainly can't argue with new projects choosing the build system that is the best fit; currently that does often seem to be CMake. I suppose I just don't want to feel I've wasted all that effort learning autotools! ;-)
    To be honest, when I moved over to CMake after using the autotools for nearly 15 years, that was certainly a factor to consider. However, I think it's also fair to say that a huge amount of your hard-won autotools expertise will translate directly to CMake. It's merely a matter of learning the CMake equivalent for the various autotools features. The concepts are largely identical: project name and version, configurable options, feature detection, header and other file generation, specification of sources, headers, what to build and where to install.

    For many projects I've been able to rename the configure.ac and Makefile.am to CMakeLists.txt and run straight down them, converting each block to CMake as I go. For some projects I did consider leaving them using autotools, but once I'd gained the skill to convert a project in an hour or two, the simplicity and flexibility gained by doing so made it worth it.

    I even retain the same autoconf/automake split: project configuration at the top level, feature macros from m4/ are now feature macros in cmake/, and the Makefile.am logic in the source directory remains in exactly the same place, but in CMakeLists.txt instead. This makes conversion simpler, and easy to adopt for developers already familiar with the old autotools organisation.
    Last edited by rleigh; 04 January 2017, 07:29 PM.
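    The block-for-block correspondence described above can be sketched like this; the project name, option, and file names are invented for illustration, not taken from any real conversion:

```cmake
# configure.ac:  AC_INIT([myproj], [1.2.0])
project(myproj VERSION 1.2.0 LANGUAGES C)

# configure.ac:  AC_ARG_ENABLE([docs], ...)
option(BUILD_DOCS "Build the documentation" OFF)

# configure.ac:  AC_CHECK_HEADERS([unistd.h])
include(CheckIncludeFile)
check_include_file(unistd.h HAVE_UNISTD_H)

# configure.ac:  AC_CONFIG_HEADERS([config.h])
configure_file(config.h.in config.h)

# Makefile.am:   bin_PROGRAMS = myprog / myprog_SOURCES = main.c util.c
add_executable(myprog main.c util.c)
install(TARGETS myprog RUNTIME DESTINATION bin)
```

    Each autotools construct maps onto a single CMake command, which is why converting straight down the file works.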



  • s_j_newbury
    replied
    Originally posted by rleigh:

    There's not that much churn. Projects don't switch build systems often, because there is a huge investment in them which creates vast amounts of inertia. For the last two decades autotools have been predominant for several reasons. Being "good enough" was one big part, but people also picked it up because it's what everyone else was using. There was a huge amount of institutional knowledge and expertise invested in them, and network effects reinforced that. I have used it since 2000. It takes a lot of learning: several different languages and tools all need mastering, and they must all work together perfectly to get a working system. It's very complex.

    ...
    Nice reply. You make some good points. I'm still not entirely convinced the improvement is always worth the investment on an existing project, but I certainly can't argue with new projects choosing the build system that is the best fit; currently that does often seem to be CMake. I suppose I just don't want to feel I've wasted all that effort learning autotools! ;-)



  • Emmanuel Deloget
    replied
    Originally posted by TLE02:
    I must confess I get a little annoyed with the continued misconceptions about Python and performance, and whether it is even relevant. So let me just ask the Python haters: considering the amount of I/O and computation that goes into a build, how large a part of that would you imagine is carried out by the build environment itself, and how large a part is carried out by the tools the build environment calls?
    In my experience it's surprisingly large. Of course, compiling a huge program will always be a larger task than handling how that program is built, but the build system is always important.

    You have to consider two cases. The first is the middle-user (I won't call him the end user: as the name implies, the end user uses the software), who compiles the program once and installs the resulting binaries. For him, the build system is nothing but a bunch of scripts or files that help organize the build. He will only use it once.

    Then there is the developer, who tends to be a bit more picky about the build system, because A) he will have to build the project often, sometimes after a very small code change, and B) he may have to modify the build rules themselves. Both points are quite important. I don't want my build system to spend 30 seconds organizing a complex build only to spend one second building the changed file. I don't want to trigger a full rebuild each time I change a small rule in the build system. And I don't want to have a hard time changing a build rule; it should be far simpler than changing a line of code.
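    The incremental-build concern above comes down to the timestamp comparison at the heart of every make-like tool: rebuild a target only if it is missing or older than one of its dependencies. A toy Python sketch of just that check (not any real build tool's code; the file names are made up):

```python
import os
from pathlib import Path

def needs_rebuild(target, dependencies):
    """Rebuild if the target is missing or any dependency is newer than it."""
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(dep) > target_mtime for dep in dependencies)

# Simulate a stale object file: create both files, then backdate the "object".
Path("util.c").write_text("int f(void) { return 1; }\n")
Path("util.o").write_text("")
os.utime("util.o", (0, 0))            # pretend util.o was built long ago

print(needs_rebuild("util.o", ["util.c"]))     # True: util.c is newer
print(needs_rebuild("missing.o", ["util.c"]))  # True: target doesn't exist
```

    Real tools layer dependency graphs and parallel scheduling on top, but the per-target decision is this cheap, which is why a well-designed build system can organize a large build quickly.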

    So computation speed might be at stake, as well as ease of use and ease of change.

    (please note that I do not imply that Python is not fast enough to do the job; I was just replying to the message I replied to).



  • rleigh
    replied
    Originally posted by s_j_newbury:
    Why all the continuous build-system churn? I suppose this is an unpopular opinion, but IMHO autotools is good enough: well understood, scalable, and with well-defined dependencies.
    There's not that much churn. Projects don't switch build systems often, because there is a huge investment in them which creates vast amounts of inertia. For the last two decades autotools have been predominant for several reasons. Being "good enough" was one big part, but people also picked it up because it's what everyone else was using. There was a huge amount of institutional knowledge and expertise invested in them, and network effects reinforced that. I have used it since 2000. It takes a lot of learning: several different languages and tools all need mastering, and they must all work together perfectly to get a working system. It's very complex.

    Regarding dependencies: yes, they are well defined, but there are a whole bunch of them, and I need to be an expert with each. With CMake I have a single tool which is very well documented.

    Regarding scalability: we push CMake to do things which the autotools could not imagine doing. Autotools is limited to supporting make, which means it's unable to make use of alternatives such as Ninja, which is vastly faster. And with super-builds, we can build vast software collections in a single run, all parallelised. While the autotools could do some of these things, their capabilities are simply inferior to what more modern tools provide.

    Many projects suffer from the autotools' overcomplexity and the fact that they haven't evolved to solve today's portability needs. They were fine in the '90s and 2000s, but today they're lacking. They're a huge barrier for people who want to make changes, and that's why projects like KDE switched away. In a large project, the maintenance burden is huge. Tools like CMake are vastly simpler, while also being more powerful and featureful. That's why my work projects all use CMake today: all the other team members can understand it and make changes with confidence, requiring knowledge of a single tool only. They aren't required to understand shell, m4, make, autoconf macros and automake (and libtool), plus all the learned experience needed to glue that all together. It's also why I converted all my autotools-using projects to CMake. It's vastly more maintainable, works on more platforms, and comes with zero downsides.

    And regarding network effects, that's why autotools is being replaced by more modern tools like CMake: if you need a macro to do something, chances are it's already been contributed, and if not you can write it and contribute it. That continuous improvement from users all over the world means it has a huge amount of functionality by default (I've contributed a few macros myself, and maintain a few modules). That's something the autotools have always lacked (I did contribute some bits to ac-archive, but it was never particularly active, as well as a handful of changes to autoconf itself): you might have a handful of custom macros for a single project, but they wouldn't get submitted upstream, leading to a fragmented ecosystem with lots of projects doing odd special-case stuff. With CMake, when I write a module for any of my projects, I generally push it straight back upstream via a pull request, so my projects are not accumulating project-specific cruft.

    CMake has become the new home for all the institutional knowledge we previously had with the autotools, but the difference is that it's shared to a much greater extent due to the much more active community around it. (I was on the autotools mailing lists for many years.)



  • TLE02
    replied
    I must confess I get a little annoyed with the continued misconceptions about Python and performance, and whether it is even relevant. So let me just ask the Python haters: considering the amount of I/O and computation that goes into a build, how large a part of that would you imagine is carried out by the build environment itself, and how large a part is carried out by the tools the build environment calls?



  • s_j_newbury
    replied
    Originally posted by bkor:

    What build systems have the GNOME developers gone through? I only know about autotools. The bit about autotools being good enough is addressed by the article.
    I meant generally in the OSS community. There's a lot of effort spent on learning build systems, converting projects, and advocacy! It's a largely solved problem: some systems are more suitable than others depending on the goals of the project, but if it works already, why not just focus energy on coding? Of course people work on what they enjoy; I guess for some that's the search for the holy grail of build systems...



  • bkor
    replied
    Originally posted by s_j_newbury:
    Why all the continuous build-system churn? I suppose this is an unpopular opinion, but IMHO autotools is good enough: well understood, scalable, and with well-defined dependencies.
    What build systems have the GNOME developers gone through? I only know about autotools. The bit about autotools being good enough is addressed by the article.



  • s_j_newbury
    replied
    Why all the continuous build-system churn? I suppose this is an unpopular opinion, but IMHO autotools is good enough: well understood, scalable, and with well-defined dependencies.



  • TingPing
    replied
    Originally posted by AsuMagic:
    But... you don't need to depend on any particular build system if you have CMake, do you? The end user can generate the kind of build files he wants, from Makefiles to Visual Studio to Ninja, if he wants.
    Meson can also generate Xcode and Visual Studio projects.

    Leave a comment:
