Canonical Working On Zstd-Compressed Debian Packages For Ubuntu


  • #11
    Originally posted by sdack View Post
    True, but replacing one compression library with another isn't much of a change, really. It still requires full testing, but it will likely succeed quickly and without hiccups, which is why they aim to add it quickly.

    Larger or more fundamental changes require more effort and more testing. The change you're suggesting really has to come from Debian themselves, or, as you've pointed out, it will increase the differences between Debian and Ubuntu.
    Decompressing packages in the background, in parallel, isn't that difficult of a change to make. I'd give it a day of work.
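The pipelining being claimed here can be sketched in a few lines of Python. This is a toy model, not APT code: `fetch` is a hypothetical stand-in for the download step, and lzma stands in for whatever codec the packages use. The point is only that decompression jobs can start as each download finishes instead of after the whole batch.

```python
import lzma
from concurrent.futures import ThreadPoolExecutor

def fetch(name, payload):
    # Hypothetical stand-in for a network download: returns compressed bytes.
    return payload

def decompress(blob):
    return lzma.decompress(blob)

# Three "packages", pre-compressed for the demo.
originals = {f"pkg{i}": (f"contents of package {i}\n" * 100).encode()
             for i in range(3)}
compressed = {name: lzma.compress(data) for name, data in originals.items()}

results = {}
with ThreadPoolExecutor(max_workers=4) as pool:
    # Submit a decompression job as soon as each "download" completes,
    # rather than waiting for all transfers to finish first.
    futures = {name: pool.submit(decompress, fetch(name, blob))
               for name, blob in compressed.items()}
    for name, fut in futures.items():
        results[name] = fut.result()

assert results == originals
```

With real downloads the executor overlaps network wait time with CPU-bound decompression, which is why the decompression-speed argument for zstd weakens once the steps are pipelined.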

    Comment


    • #12
      Originally posted by mmstick View Post
      Decompressing packages in the background, in parallel, isn't that difficult of a change to make. I'd give it a day of work.
      Thank you. That's very generous of you. When are you going to send in the patches?

      Comment


      • #13
        zstd shines where decompression speed actually matters. It is silly to use it for packages if you have more space-efficient compression available, imho. Space and bandwidth savings trump some improvement in decompression time.
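The ratio-versus-speed trade-off being argued here can be seen with Python's standard library alone. zstd isn't in the stdlib, so gzip-level zlib and xz-level lzma stand in for the two ends of the spectrum (zstd sits between them: near-gzip ratios at much higher decompression speed); the payload is synthetic.

```python
import lzma
import zlib

# Semi-repetitive payload standing in for a package archive (~640 KB).
data = b"Package: example\nDepends: libc6\n" * 20000

xz_blob = lzma.compress(data, preset=9)   # stronger ratio, slower codec
gz_blob = zlib.compress(data, level=9)    # weaker ratio, faster codec

# xz produces the smaller archive; the question in this thread is whether
# that size advantage is worth giving up for faster decompression.
assert len(xz_blob) < len(gz_blob) < len(data)
```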

        Comment


        • #14
          Originally posted by sdack View Post
          Thank you. That's very generous of you. When are you going to send in the patches?
          When they move to GitHub / GitLab and start accepting patches written in Rust.

          Comment


          • #15
            Originally posted by log0 View Post
                    zstd shines where decompression speed actually matters. It is silly to use it for packages if you have more space-efficient compression available, imho. Space and bandwidth savings trump some improvement in decompression time.
            And the issue with APT package installations & upgrades is that the Debian installation process is seriously flawed. Rather than using package hooks to defer script events to the end of the installation process, it executes them after each package is upgraded or installed. That means the same hooks often run repeatedly, once per package, rather than only once at the end.

            Arch Linux's pacman and Solus's eopkg package managers are better examples. They install and update significantly faster than apt / apt-get, precisely because they register package hooks and run them once per transaction.
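The deferral being described can be modelled in a few lines. This is a hypothetical `Transaction` class, not pacman's or eopkg's actual implementation: packages register hooks during the transaction, and each distinct hook runs exactly once at commit, no matter how many packages triggered it.

```python
class Transaction:
    """Toy package transaction that defers hooks to the end of the run,
    so e.g. ldconfig is not re-run after every single package."""

    def __init__(self):
        self.pending_hooks = []  # ordered and deduplicated
        self.log = []

    def install(self, package, hooks):
        self.log.append(f"install {package}")
        for hook in hooks:
            if hook not in self.pending_hooks:
                self.pending_hooks.append(hook)

    def commit(self):
        # Each hook fires once per transaction, not once per package.
        for hook in self.pending_hooks:
            self.log.append(f"hook {hook}")
        self.pending_hooks.clear()

txn = Transaction()
txn.install("libfoo", ["ldconfig"])
txn.install("libbar", ["ldconfig", "update-desktop-database"])
txn.install("app", ["update-desktop-database"])
txn.commit()

assert txn.log.count("hook ldconfig") == 1
```

Three installs trigger `ldconfig` twice and `update-desktop-database` twice, yet each runs once, which is where the speedup over a run-after-every-package scheme comes from.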

            Comment


            • #16
              I'm quite sure this compression algorithm is now supported in RPM 4.14.

              Comment


              • #17
                Originally posted by discordian View Post

                You misunderstand what Ubuntu wants to do, then. They want to use the binary format of zstd, not just some command-line tool. That means anyone who packages a tool on the new Ubuntu will by default create a .deb package that won't install on other Debian-based distros or the previous Ubuntu version.
                There are tons of different compression algorithms with different (dis-)advantages compared to xz and gzip; zstd is no notable "knight in shining armor" here. Such modest gains are more than debatable when you throw out compatibility at the same time.
                Particularly since you could keep the 6% smaller file size and just decompress the archives ahead of time in background threads (i.e. as they are transferred over the wire), making those speed gains a non-issue.
                How often do you create debs for other distros and versions? Since you will end up linking to the wrong versions of libraries anyway, I always do "native" builds by scripting the build over several different chroots.

                Comment


                • #18
                  btw, is there a reason why deb-based distributions don't use delta updates by default?
                  Basically, instead of downloading the full packages for an update, one downloads only the difference between the packages in use and the new ones. While some packages (like the kernel) are still downloaded in full, the others' download size can be reduced by ~75% on average (up to 96% in some cases). But hey, SUSE and other RPM-based distros have used this for over a decade...
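A minimal sketch of the delta idea in Python: a naive copy/data delta built with the stdlib's `difflib`, not the bsdiff/deltarpm formats real tools use. Only the changed bytes are shipped; the client reconstructs the new package from the old one plus the delta.

```python
import difflib

def make_delta(old: bytes, new: bytes):
    """Build a delta: copy ranges from `old`, literal bytes otherwise."""
    ops = []
    sm = difflib.SequenceMatcher(a=old, b=new, autojunk=False)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            ops.append(("copy", i1, i2))      # reuse bytes the client has
        else:
            ops.append(("data", new[j1:j2]))  # ship only the changed bytes
    return ops

def apply_delta(old: bytes, ops) -> bytes:
    out = bytearray()
    for op in ops:
        if op[0] == "copy":
            out += old[op[1]:op[2]]
        else:
            out += op[1]
    return bytes(out)

# Two versions of a "package" that differ only in a few bytes.
old = b"header v1.0\n" + b"shared payload\n" * 50 + b"trailer\n"
new = b"header v1.1\n" + b"shared payload\n" * 50 + b"trailer!\n"

delta = make_delta(old, new)
shipped = sum(len(op[1]) for op in delta if op[0] == "data")
assert apply_delta(old, delta) == new
assert shipped < len(new) // 4  # far less than a full download
```

Real delta schemes add checksums and compress the literal data as well, but the principle is the same: the wire cost scales with the size of the change, not the size of the package.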

                  Comment


                  • #19
                    Originally posted by F.Ultra View Post

                    How often do you create debs for other distros and versions? Since you will end up linking to the wrong versions of libraries anyway, I always do "native" builds by scripting the build over several different chroots.
                    If you distribute an app as a binary, it's common to depend on libc and not much else. This is less about the apps I bundle myself (I can just use xz compression for those) than about the few closed apps I depend on. Often there are just two variants, 32-bit and 64-bit, ignoring distros.

                    Grabbing and installing a package for a similar distro should be possible, and not artificially hindered.

                    Comment


                    • #20
                      Beware of zstd, at least on Linux. It looks like zstd is a stolen project: the original author is protesting and has filed a lawsuit. The text is not in English; use a translator: https://translate.google.pl/translat...gle&edit-text=

                      Comment
