Rust 1.36 Brings Offline Support In Cargo, Stable Alloc Crate


  • #91
    Originally posted by rene View Post
    I'm not doing it manually, I want to do it automated as part of our Linux distribution build process (#t2sde). But as I said already, rust & cargo are not very cooperative when packaged as a Linux distribution package. Which should already say everything about the state of their amazing software deployment methods.
    Alright, regarding that: I only have a little experience with building Rust applications on OpenEmbedded, and I agree with you.
    I hope this will improve in the future and that the Rust devs don't ignore this issue.

    I was mostly annoyed by people saying "But building everything from source for no good reason is slow!". This case is different.



    • #92
      Originally posted by jjkk View Post
      Well I am one of those few users. What is the logic here? It seems you and moltonel are trying to shift the focus from the initial question to "we can not do it, and that is why it is a stupid idea".
      I am not trying to shift the focus, you are. My first comment was a reply to some stupid YouTuber being surprised that it takes a long time to compile a compiler, even though he's not even gaining anything from compiling it from source in his case. He only cares about compiling Firefox from source and has decided to also compile Rust from source without any modifications, which is pointless.

      You CAN do it; it's just slow to compile a large compiler, which should not be a surprise.

      Originally posted by jjkk View Post
      Why even advocate Rust when such a sharp contradiction exists? It strives to be a general-purpose language, while in reality only a certain group of people will be able to use it without pain and suffering. Yes, the majority of users do not care at all, and they indeed should not. But still there are distro maintainers who say that supporting Rust-dependent packages, and even Rust itself, is absolute hell. And they do not say that about any of dozens of other languages. Why this special treatment? Why should it be so?
      I agree with you that support for distros is bad, and it's a shame. However, it's not really getting any special treatment IMO: supporting languages which have their own package manager is very hard, as every package uses different versions of its dependencies, which makes everything a hassle to manage. cargo itself also has issues with packaging rust and its dependencies properly for distros, but as far as I know the cargo devs simply have more important things to do currently, and that's just proof that the toolchain needs some more time to mature.



      • #93
        Originally posted by jjkk View Post
        Well I am one of those few users. What is the logic here? It seems you and moltonel are trying to shift the focus from the initial question to "we can not do it, and that is why it is a stupid idea".
        Rust can somewhat do it. It's low-priority, not stupid. It's hard not to counter your arguments because, by aggressively shooting down Rust on the grounds of its bootstrap process or dependency handling, you're IMHO missing out and actually shooting yourself in the foot.

        Originally posted by jjkk View Post
        Why even advocate Rust when such a sharp contradiction exists? It strives to be a general-purpose language, while in reality only a certain group of people will be able to use it without pain and suffering. Yes, the majority of users do not care at all, and they indeed should not. But still there are distro maintainers who say that supporting Rust-dependent packages, and even Rust itself, is absolute hell. And they do not say that about any of dozens of other languages. Why this special treatment? Why should it be so?
        What's the contradiction? All technologies have pros and cons, and it's perfectly natural to advocate for one when you think the pros beat the cons, at least in some settings. The pros of rust are great, and you're IMHO overestimating the pain and suffering of bootstrapping rust. You're asking for too high a standard, like an alternative-medicine proponent asking an evidence-based-medicine proponent to prove them wrong.

        Concerning the "myriad of small packages" argument, other languages pose the exact same issue to distributions, for example perl, python, and ruby. The question is: how do we maintain packages for all the deps? Should we? Usually distribs employ a hybrid approach, packaging some and including others in the package "source".

        For that purpose, Rust is actually easier on distribs thanks to cargo, as all rust binaries do their dependency handling themselves with the same config file, which can't be said of perl/python/ruby programs. The easiest way to package a rust program is to list its deps as source files for your program, let your package manager download them, and compile normally. There's a cargo subcommand that'll create such gentoo packages for you.

        There's less incentive to package individual deps for rust because of static linking, LTO, and the ease of patching a different version in Cargo.toml at build time. This kind of program-side dependency selection is much less harmful than the heavy vendoring that is done by chromium, for example.
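        To illustrate the build-time version patching mentioned above, here's a minimal Cargo.toml sketch using cargo's `[patch]` mechanism; the crate name `foo` and the local path are purely hypothetical:

```toml
# Hypothetical example: force every occurrence of the crate `foo` in the
# dependency graph to build from a locally patched copy instead of the
# version published on crates.io.
[dependencies]
foo = "1.2"

[patch.crates-io]
foo = { path = "vendor/foo-1.2.4-patched" }
```

        Because the override applies to the whole dependency graph, a distro maintainer can swap in a fixed version without touching each sub-dependency's own manifest.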



        • #94
          > and we'd much rather use an uncertified language/compiler that prevents memory issues and provides seamless compiling/packaging than a certified one that has been shown to be a security minefield and has build systems that eat up productivity

          Great point. Someone could have written a few KLOCs of worthy code in the meantime instead of nagging about a fictitious "broken chain of trust" and doing a pointless compiler bootstrap all the way back to manually flipping bits via analog triggers.
          Last edited by dremon_nl; 07-08-2019, 06:38 AM.



          • #95
            Originally posted by rene View Post
            GCC has for decades been bootstrapped using the pre-existing system compiler, and then compiled again by itself.
            Apart from not having decades of existence, the chain you describe applies equally to rust. As long as you're not bringing an independently-reproduced compiler build into the mix, the chain of trust has the same strength.

            How does a reproducible Rust compiler help me when the whole 150-cargo-micro-package stuff pulled from the internet works against distribution packages and clutters either the home or a system-wide directory with 1.x GB of build artefacts?
            • Reproducible bootstrap has nothing to do with dependency management, it's a separate problem, see above.
            • Just to reiterate, cargo downloads crate sources, not binary artefacts.
            • If you're using a source distro, then these fit naturally alongside the sources of gcc, kde, and whatever else you install. Nothing specific about rust, nothing in your $HOME, and much less clutter in your installed files because there are no header files or .so files to include.
            • If you're maintaining a binary distro, then yes, cargo will put those files in a user-specific location of its choosing unless you configure otherwise. It's 800 MB on my machine, and I compile a lot of rust. How is that different from having the gcc sources hanging around? At least you can thank cargo for taking care of the download and harmonizing the build system over the whole rust ecosystem.
            • This is the recommended way to use rust, and it's actually working *for* the distribs, by streamlining the creation/maintenance of packages.
            • But I understand you want to create individual packages for each lib crate. That's the right thing to do in a C/C++ world, and it is indeed a lot more work in a rust world. But it's also mostly pointless in the rust world, because binaries link against a specific version of each lib, so you can't update a system-wide lib like you would with C. The only things you gain are some compile time (offset by the extra package-maintenance burden, and reducible using a build cache) and some disk space (only significant on embedded systems so small that you're probably not using a classic package manager to begin with).
            Your frustration against rust is palpable, and I can see it's not baseless. But you're choosing a complicated path for IMHO little benefit. Maybe you should take a step back, check your priorities, mark the lower ones as TODO, and get on with it.
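            For what it's worth, that user-specific cache location is not fixed: cargo reads the standard `CARGO_HOME` environment variable. A minimal shell sketch (the path below is just an example of my choosing):

```shell
# CARGO_HOME is the environment variable cargo consults to decide where to
# keep its registry cache and downloaded crate sources; pointing it at a
# system-managed directory keeps $HOME clean. The path here is an example.
export CARGO_HOME=/tmp/cargo-cache
mkdir -p "$CARGO_HOME"
echo "cargo cache lives in $CARGO_HOME"
```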



            • #96
              Originally posted by moltonel View Post

              ...
              As I want to compile Firefox, I can not really move Rust & Cargo around on my TODO too much. This also does not integrate well into source distributions, because we would want to have control over the downloads, which appears to be difficult with Cargo, as well as over the installed files, just like a binary distribution. Unless you delete all build artefacts, each build of any Rust-based package will build and install those cached build artefacts.

              I would also rather have some headers and libraries than each package re-building every dependency again and again. The benefit of shared libraries was not only saved space (and compile time) but also fixing bugs once and for all, without having to rebuild and re-link each and every application package based on them.

              I see many drawbacks and disadvantages here. While re-downloading all kinds of dependencies might not matter for many people in an office with good internet, I also travel and am used to being able to work on stuff without an internet connection in trains, planes, and remote places. Other people have raised concerns about rural broadband and driving somewhere to sync sources before, too. But that is only the last drop in an already sizable collection of other issues. And in today's insecure and easily hacked cloud infrastructure, I would rather have a cache of downloaded and signature-checked tarballs than source packages connecting to some cloud repository behind my back, with unknown outcomes of what they may or may not install and hack into my system.



              • #97
                Originally posted by rene View Post
                As I want to compile Firefox I can not really move Rust & Cargo around on my TODO too much.
                Yes, but you do not have to get a bootstrapped-from-C rust from the get-go: you can use the standard rust bootstrap, which uses a rust-lang.org binary.

                This also does not integrate well in source distributions because we would want to have control about the downloads, which appears to be difficult with Cargo, as well as about the installed files just like a binary distribution.
                Have a look at gentoo's ripgrep package for example: https://gitweb.gentoo.org/repo/gento...-11.0.1.ebuild There's the list of crate dependencies, some standard metadata, and a couple of specific instructions to say where shell completions and docs must be installed. Portage (gentoo's packaging system) takes care of downloading the crates and verifying their checksums, then cargo picks those up and starts the build (during which it has no network access). With cargo's new `--offline` switch, the package maintainer could even tweak the crate versions there without touching Cargo.toml. The result is a very simple package, and the distrib has full control.
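                On cargo's side, a no-network build like that is wired up through source replacement; a minimal sketch of the config (these are real `.cargo/config.toml` keys, while the `vendor` directory name is just a common convention):

```toml
# .cargo/config.toml — make cargo read crate sources from a local directory
# (pre-downloaded and checksum-verified by the distro tooling) instead of
# contacting crates.io at build time.
[source.crates-io]
replace-with = "vendored-sources"

[source.vendored-sources]
directory = "vendor"
```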

                Unless you delete all build artefacts each build of any Rust based package will build and install those cached build artefacts.
                Hum... no? Unless you're rebuilding the same project or using a compile cache, the only thing that gets reused is the downloaded crate sources, not build artifacts.

                I also would have rather some headers and libraries than each package re-building any dependency again and again. The benefit of shared libraries was also not only saved space (and compile time) but also fixing bugs once and for all, and not having to rebuild and re-link each and every application package based on them.
                Rust doesn't use header files, hurray!

                I mentioned the "fix a library bug system-wide" usecase and indeed it's not possible in rust due to strict library versioning in the ABI. But there's a very good reason (and some minor reasons) for this versioning: enabling the use of multiple versions in a single binary. This is not just a convenience for the programmer, it enables faster uptake of new library versions because you're never stuck on an old vulnerable lib version because some sub-sub-dep you don't control only supports the older version.

                It does mean that you have to rebuild your binaries to use an updated lib, which is a bummer. But at least you've got better control over that than over the stubborn sub-sub-dep which would otherwise prevent the system-wide lib update. Time will tell and it depends on the usecase, but I think it's an overall win.
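                For the curious, two versions of one library in a single binary look like this in Cargo.toml (using cargo's dependency-renaming `package` key; the crate `rand` is just an illustrative pick):

```toml
# A binary can depend on two major versions of the same crate by renaming
# one of them; cargo resolves and compiles each version independently.
[dependencies]
rand = "0.7"
rand_old = { package = "rand", version = "0.6" }
```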

                I see many drawbacks and disadvantages here, and while re-downloading all kinds of dependencies might not matter for many people in an office with good internet, I also travel and am used to be able to work on stuff without an internet connection in trains, planes, and remote places. Other people raised concerns about rural and such broadband and driving somewhere to sync sources before, too. But that is only the last drop on already quite a collection of other issues. And in todays insecure and easily hacked cloud infrastructure I rather have a cache of downloaded and signature checked tarballs, than source packages connecting to some cloud repository behind my back with unknown outcome of what it may or may not install and hack into my system.
                There's nothing specific to rust here: you face the same cache-or-download dilemma whatever the language. Rust source crates aren't any bigger than their counterparts. Complaining about artifact reuse and source re-download in the same post seems a bit contradictory.



                • #98
                  Originally posted by moltonel View Post
                  ...

                  There's nothing specific to rust here, you face the same cache-or-download dilema whatever the language. Rust source crates aren't any bigger than their counterparts. Complaining about artifact reuse and source redownload in the same post seems a bit contradictory.
                  Not having and using shared libraries sounds like a huge step backwards to me. As does coupling a module/library system with some automatic cloud download. The 150+ micro-dependencies for just the compiler bootstrap also look to me more like ad-hoc chaos than a well-sorted standard library foundation. For a distribution, I would need to extract over 150 micro-dependency downloads just for the Rust & Cargo packages, and I just tested some RGB-lighting Rust test code glue, and even that pulled a good 20 dependencies down from the interwebs :-/



                  • #99
                    Originally posted by rene View Post

                    I'm not doing it manually, I want to do it automated as part of our Linux distribution build process (#t2sde). But as I said already, rust & cargo are not very cooperative when packaged as a Linux distribution package. Which should already say everything about the state of their amazing software deployment methods.
                    I've packaged a number of Rust projects for distribution on Pop!_OS. There's a tool for Rust called `cargo-vendor` that can be used to create a tarball of vendored dependencies. Integrating that into a Makefile to enable offline builds in a schroot is pretty trivial.
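                    A rough sketch of what that can look like (the `cargo vendor` subcommand and `--offline` flag are real; the target names and file names are made up for illustration):

```make
# Makefile sketch: vendor once with network access, then build inside the
# schroot with no network at all.
vendor.tar.gz:
	# `cargo vendor` downloads all deps into ./vendor and prints the
	# [source] replacement config to use on stdout.
	cargo vendor vendor/ > vendor-config.toml
	tar czf vendor.tar.gz vendor/ vendor-config.toml

build-offline: vendor.tar.gz
	tar xzf vendor.tar.gz
	cargo build --release --offline
```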

