
X.Org Server Bids Farewell To Autotools


  • gornor
    replied
    An "automated" build system can be convenient when it works. All I want to know is how hard the damage control is when it inevitably fails.

  • insilications
    replied
    Almost anything is better than autotools.

  • sdack
    replied
    Originally posted by andreano:
    > Note that downloading is only supposed to be a fallback for pkg-config: The user can disable it with `--wrap-mode=nodownload`, which is the intended mode for distro packagers and others who don't want the static linking bloat. As for pkg-config, yes, it will find custom installations of dependencies in the standard location (/usr/local/lib), and you can add to that with --pkg-config-path.
    >
    > As a developer, if you want full control of the dependencies, you either bundle them, or if you prefer the multi-repo approach, clone the dependencies alongside your own repo. In this use case, the download mechanism is rather a replacement for git-submodule (without its unrealistic expectation that devs & users always remember to run `git submodule update --init --recursive` for every action that could have touched the prescribed dependency versions).

    What is the build option to have it order beer and an "All Meat" pizza off the Internet at the end of the build process?

  • andreano
    replied
    Originally posted by sdack:
    > Can it also provide the dependencies when the Internet is down?
    Note that downloading is only supposed to be a fallback for pkg-config: The user can disable it with `--wrap-mode=nodownload`, which is the intended mode for distro packagers and others who don't want the static linking bloat. As for pkg-config, yes, it will find custom installations of dependencies in the standard location (/usr/local/lib), and you can add to that with --pkg-config-path.

    As a developer, if you want full control of the dependencies, you either bundle them, or if you prefer the multi-repo approach, clone the dependencies alongside your own repo. In this use case, the download mechanism is rather a replacement for git-submodule (without its unrealistic expectation that devs & users always remember to run `git submodule update --init --recursive` for every action that could have touched the prescribed dependency versions).
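As a minimal sketch of the two modes described above (`--wrap-mode` and `--pkg-config-path` are real Meson setup options; the build directory name and the dependency prefix are made up for illustration):

```shell
# Distro packager: forbid wrap downloads entirely, so a missing
# dependency fails the configure step instead of being silently vendored.
PACKAGER_FLAGS="--wrap-mode=nodownload"

# Developer with dependencies installed under a custom prefix:
DEV_FLAGS="--pkg-config-path=/opt/mydeps/lib/pkgconfig"

# The actual invocations would then look like:
echo "meson setup build $PACKAGER_FLAGS"
echo "meson setup build $DEV_FLAGS"
```

With `nodownload`, Meson reports which dependency it could not find rather than fetching a wrap for it, which is exactly what a packager wants.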
    Last edited by andreano; 30 October 2021, 04:28 AM.

  • Keats
    replied
    Originally posted by sdack:
    > Of course. You copy it from another device.
    >
    > Can your method provide the dependencies when one has no admin privileges?
    >
    > Can it install the dependencies somewhere else, i.e., a non-standard location, when there is not enough space left?
    >
    > What happens when the IP address or URL has changed after a year?
    >
    > And how does a user find out exactly which dependencies are missing when the details are embedded in the build system?
    >
    > I sure get how this is super user-friendly when it works, but the reality is that the Internet is notoriously unreliable and a build system is only as robust as its weakest link. It does not matter whether this is a shell, Perl, or Python script. It is not user-friendly when it is only friendly when it works. It is user-friendly when it is friendly to the user whether it works or fails.

    If one is not a privileged user, one should probably not be installing software that requires additional dependencies. In fact, I'm pretty sure most organizations try to make sure that an ordinary end user can install only authorized software, without having to contact support.

  • sdack
    replied
    Originally posted by X_m7:
    > Can you get any code/dependencies/libraries/etc that you don't already have when the Internet is down?
    Of course. You copy it from another device.

    Can your method provide the dependencies when one has no admin privileges?

    Can it install the dependencies somewhere else, i.e., a non-standard location, when there is not enough space left?

    What happens when the IP address or URL has changed after a year?

    And how does a user find out exactly which dependencies are missing when the details are embedded in the build system?

    I sure get how this is super user-friendly when it works, but the reality is that the Internet is notoriously unreliable and a build system is only as robust as its weakest link. It does not matter whether this is a shell, Perl, or Python script. It is not user-friendly when it is only friendly when it works. It is user-friendly when it is friendly to the user whether it works or fails.
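For the no-admin-privileges and non-standard-location questions above, the usual answer with any of the common build systems is a user-writable install prefix (a sketch; the prefix path is illustrative, and the configure commands are shown as comments since they assume the respective tools are installed):

```shell
# Install into a prefix the unprivileged user owns; no root required.
PREFIX="$HOME/.local"

# autotools:  ./configure --prefix="$PREFIX" && make && make install
# meson:      meson setup build --prefix="$PREFIX" && ninja -C build install

# Tools then need the prefix on their search paths:
export PKG_CONFIG_PATH="$PREFIX/lib/pkgconfig${PKG_CONFIG_PATH:+:$PKG_CONFIG_PATH}"
export LD_LIBRARY_PATH="$PREFIX/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$PKG_CONFIG_PATH"
```

This sidesteps the admin-privileges problem, though it does nothing for the flaky-network problem the rest of the post raises.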

  • X_m7
    replied
    Originally posted by sdack:
    > Can it also provide the dependencies when the Internet is down?
    Can you get any code/dependencies/libraries/etc that you don't already have when the Internet is down?

    I mean, I'm not sure where else they're supposed to come from, unless you're saying that every program/library should bundle all of its dependencies. Either you download them via the build system, or via your system's package manager, or get them directly from the author and compile them yourself, no?

  • ihatemichael
    replied
    I still remember the times when I used to compile XFree86 with "make World". Feeling old.

  • sdack
    replied
    Originally posted by andreano:
    > ... Note that you can pin dependency versions as exactly as you want in all 3 cases ...
    Can it also provide the dependencies when the Internet is down?

  • andreano
    replied
    Originally posted by sdack:
    > For you perhaps this is true, but why even build the software and not just download it all when you already download everything else? Save yourself the trouble.
    >
    > For actual development, meaning not just a one-time compile-and-install, I need my build processes to be reproducible. It is otherwise a nightmare to debug.

    Use case, not person. Please. Yes, distributing binaries and bundling dependencies instead of having a user-friendly build system is fine for closed-source work, I can imagine. Note that you can pin dependency versions as exactly as you want in any case, whether they be installed, downloaded, bundled, or a redundant combination of these.
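In the downloaded case, the pinning mentioned above goes through Meson's wrap files; a hypothetical subprojects/mylib.wrap (the project name, URL, and revision are made up for illustration) can pin an exact git commit:

```ini
# subprojects/mylib.wrap -- hypothetical example, not a real project
[wrap-git]
url = https://example.com/mylib.git
revision = 1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b
depth = 1

[provide]
dependency_names = mylib
```

Because the wrap file lives in the consuming repo, the pinned revision travels with it, which is the sense in which it replaces git-submodule.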
    Last edited by andreano; 29 October 2021, 09:00 PM.
