I know this is beside the point, but how impressive is that rock2 platform? It's clocked much lower than the laptop, has 1/8 the RAM and a slow storage interface, yet still puts in surprisingly quick build times.
Developers Explore Meson Build System For Wayland / Weston
-
Originally posted by andreano View Post
I doubt Jussi Pakkanen would have written Meson if he was content with CMake – CMake is already 16 years old.
Meson feels very similar to CMake in what it does – comparing it with autotools is a joke.
In terms of what it looks like, I much prefer the pythonic typed Meson syntax over CMake's magical variables which I can't wrap my head around.
The Meson language is made to feel familiar to Python programmers, but *is not* Python – nothing prevents Meson from being rewritten in any other language.
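To illustrate the syntax being discussed, here is a minimal sketch of a meson.build for a hypothetical C project (the project name, source file, and dependency are made up for the example):

```meson
# Declarative, typed, Python-like, but not Python
project('hello', 'c', version : '1.0.0')

# Dependencies are first-class objects rather than magic variables
zdep = dependency('zlib', required : true)

executable('hello', 'main.c',
  dependencies : [zdep])
```

Every value has a type, and there is no string-splicing of variable names, which is much of what makes it read differently from CMake.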
Comment
-
Originally posted by AsuMagic View Post
But... you don't need to depend on any particular build tool if you have CMake, do you? The end-user can generate whatever kind of build files they want, from Makefiles to Visual Studio solutions to Ninja.
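For reference, CMake picks the generator at configure time. A few illustrative invocations (the directory names are arbitrary, and these assume a project with a CMakeLists.txt in the current directory):

```shell
# Generate a plain Makefile build tree
cmake -S . -B build-make -G "Unix Makefiles"

# Generate a Ninja build tree from the same CMakeLists.txt
cmake -S . -B build-ninja -G Ninja

# Generate a Visual Studio solution (on Windows)
cmake -S . -B build-vs -G "Visual Studio 17 2022"

# Build through whichever generator was chosen
cmake --build build-ninja
```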
- Likes 1
Comment
-
Originally posted by s_j_newbury View Post
Why all the continuous build-system churn? I suppose this is an unpopular opinion, but IMHO autotools is good enough: well understood, scalable, and with well-defined dependencies.
- Likes 1
Comment
-
Originally posted by bkor View Post
What build systems have the GNOME developers gone through? I only know about autotools. The bit about autotools being good enough is addressed by the article.
Comment
-
I must confess I get a little annoyed with the continued misconceptions about Python and performance and whether it is even relevant. So let me just ask the Python haters. Considering the amount of I/O and computation that goes into a build: how large a part of that would you imagine is carried out by the build environment itself, and how large a part by the tools the build environment calls?
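That point can be made concrete with a toy sketch (everything here is made up for illustration, with a trivial Python subprocess standing in for a compiler): even in an interpreted language, the time spent deciding what to run is dwarfed by the time spent inside the invoked tools.

```python
import subprocess
import sys
import time

# Toy model of a build: the "build environment" decides what to run
# (cheap bookkeeping), while the tools it invokes do the real work.
files = [f"src_{i}.c" for i in range(20)]  # hypothetical source list

t0 = time.perf_counter()
commands = [[sys.executable, "-c", "pass"] for _ in files]
t_orchestrate = time.perf_counter() - t0   # time spent "deciding"

t0 = time.perf_counter()
for cmd in commands:
    subprocess.run(cmd, check=True)        # time spent inside the tools
t_tools = time.perf_counter() - t0

print(f"orchestration: {t_orchestrate*1000:.2f} ms, tools: {t_tools*1000:.2f} ms")
```

Even with a do-nothing "compiler", process startup alone dominates the bookkeeping by orders of magnitude; with a real compiler the gap is far larger.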
Comment
-
Originally posted by s_j_newbury View Post
Why all the continuous build-system churn? I suppose this is an unpopular opinion, but IMHO autotools is good enough: well understood, scalable, and with well-defined dependencies.
Regarding dependencies: yes, they are well defined. But there are a whole bunch of them, and I need to be an expert in each. With CMake I have a single tool which is very well documented.
Regarding scalability: we push CMake to do things the autotools could not imagine doing. Autotools is limited to supporting make, which means it's unable to make use of alternatives such as Ninja, which are vastly faster. And with super-builds, we can build vast software collections in a single run, all parallelised. While the autotools could do some of these things, their capabilities are simply inferior to what more modern tools provide.
Many projects suffer from the autotools' overcomplexity and the fact that they have not evolved to meet today's portability needs. They were fine in the '90s and 2000s, but today they're lacking. They're a huge barrier for people who want to make changes, and that's why projects like KDE switched away. In a large project, the maintenance burden is huge. Tools like CMake are vastly simpler, while also being more powerful and featureful. That's why my work projects all use CMake today: all the other team members can understand it and make changes with confidence, needing knowledge of only a single tool. They aren't required to understand shell, m4, make, autoconf macros, automake (and libtool), plus all the learned experience needed to glue it all together. It's also why I converted all my autotools-using projects to CMake. It's vastly more maintainable, works on more platforms, and comes with zero downsides.

Regarding network effects, that's also why autotools is being replaced by more modern tools like CMake: if you need a macro to do something, chances are it's already been contributed, and if not, you can write it and contribute it. That continuous improvement from users all over the world means it has a huge amount of functionality by default (I've contributed a few myself, and maintain a few modules). That's something the autotools have always lacked (I did contribute some bits to ac-archive, but it was never particularly active, as well as a handful of changes to autoconf itself). You might have a handful of custom macros for a single project, but they wouldn't get submitted upstream, leading to a fragmented ecosystem with lots of projects doing odd special-case stuff. With CMake, when I write a module for any of my projects, I generally push it straight back upstream via a pull request. My projects don't accumulate project-specific cruft.
CMake has become the new home for all that institutional knowledge we previously had with the autotools, but the difference is that it's shared to a much greater extent due to the much more active community around it. (I was on the autotools mailing lists for many years.)
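The super-build workflow mentioned above can be sketched with CMake's ExternalProject module; the project names and repository URLs below are hypothetical:

```cmake
# Superbuild driver: each sub-project is downloaded, configured and
# built in dependency order, all from a single cmake --build run.
cmake_minimum_required(VERSION 3.16)
project(superbuild NONE)

include(ExternalProject)

ExternalProject_Add(libfoo
  GIT_REPOSITORY https://example.org/libfoo.git
  CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/stage)

ExternalProject_Add(app
  DEPENDS libfoo   # built only after libfoo is installed to the stage
  GIT_REPOSITORY https://example.org/app.git
  CMAKE_ARGS -DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/stage)
```

Independent sub-projects build in parallel; the DEPENDS edges are the only serialisation points.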
Comment
-
Originally posted by TLE02 View Post
I must confess I get a little annoyed with the continued misconceptions about Python and performance and whether it is even relevant. So let me just ask the Python haters. Considering the amount of I/O and computation that goes into a build: how large a part of that would you imagine is carried out by the build environment itself, and how large a part by the tools the build environment calls?
You have to consider two cases. First, the middle-user (I won't call him the end-user: as the name implies, the end user uses the software) compiles the program once and installs the resulting binaries. For him, the build system is nothing but a bunch of scripts or files that help organise the build. He will only use it once.
Then there is the developer case. The developer tends to be pickier about the build system because a) he will have to build the project often, sometimes after a very small code change, and b) he may have to modify the build rules themselves. Both points are quite important. I don't want my build system to take 30s to organise a complex build, only to spend 1s building the changed file. I don't want to trigger a full rebuild each time I change a small rule in the build system. And I don't want a hard time changing that build rule: it should be far simpler than changing a line of code.
So computation speed might be at stake, as well as ease of use and ease of changes.
(please note that I do not imply that Python is not fast enough to do the job; I was just replying to the message I replied to).
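The developer-workflow concern above is easy to measure directly. An illustrative session, assuming a Meson/Ninja project (the source path is made up):

```shell
time meson setup build       # one-time configuration of the build tree
time ninja -C build          # full build
touch src/one_file.c         # hypothetical: change a single source file
time ninja -C build          # should redo only one compile plus the link
```

If the third command's time is dominated by anything other than the compiler and linker, that overhead is the build system's fault.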
Comment
-
Originally posted by rleigh View Post
There's not that much churn. Projects don't switch build systems often because there is a huge investment in them which creates vast amounts of inertia. For the last two decades, autotools have been predominant for several reasons. Being "good enough" was one big part, but also people picked it up because it's what everyone else was using. There was a huge amount of institutional knowledge and expertise invested in them, and network effects reinforced that. I used it since 2000. It takes a lot of learning. Several different languages and tools all need mastering and must all work together perfectly to get a working system. It's very complex.
...
Comment