Canonical Working On Zstd-Compressed Debian Packages For Ubuntu

  • #21
    Originally posted by discordian View Post
    If you distribute an app as a binary, it's common to depend on libc and not much else. This is less about the apps I bundle myself (I can just use xz compression myself) and more about the few closed apps I depend on. Often there are two variants, 32-bit and 64-bit, ignoring distros.

    Grabbing and installing a package for a similar distro should be possible, and not artificially hindered.
    Ah, I see. But aren't such apps usually distributed as a single large binary bash installer (at least some closed-source disk-array utilities that I have used over the years have been)?

    Comment


    • #22
      Originally posted by xpris View Post
      Be wary of zstd, at least in Linux. It looks like zstd is a stolen project and the original author is protesting and filing a lawsuit. The text is not in English, use translate: https://translate.google.pl/translat...gle&edit-text=
      No, according to your linked article it's the other way around, i.e. Google has filed a patent on some algorithms that Duda developed and published for free on the Internet. So he is not claiming that zstd is a stolen project (and his main claim concerns the video compression algorithms, not zstd); he is simply claiming that Google should not be allowed to patent something that he invented.

      Comment


      • #23
        Originally posted by mmstick View Post

        Decompressing packages in the background, in parallel, isn't that difficult of a change to make. I'd give it a day of work.
        The problem is that tar isn't a very good format to extract in parallel, since it's made for tape devices / streaming, which means you don't know ahead of time where the individual files start and end.
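
        For illustration, a minimal sketch using Python's standard tarfile module (the archive name is hypothetical): member offsets only become known by walking the headers one after another, so you cannot seek straight to an arbitrary file.

        import tarfile

        # Minimal sketch, assuming a hypothetical "data.tar" in the current directory.
        with tarfile.open("data.tar") as tar:
            for member in tar:
                # offset_data is only known after every preceding header has been
                # read, which is what prevents extracting members in parallel.
                print(member.name, member.offset_data, member.size)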

        Comment


        • #24
          Originally posted by nils_ View Post

          The problem is that tar isn't a very good format to extract in parallel, since it's made for tape devices / streaming, which means you don't know ahead of time where the individual files start and end.
          Tar has no compression at all; compressors like xz could already use blocks that can be decompressed in parallel (using those independent blocks would reduce the compression ratio, however).

          But tbh, the talk is about decompressing the next deb (or several) while installing the current one. Taken to the extreme, this could already start during the download. As long as the memory is there, this should not be a big problem.
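
          A minimal sketch of that pipelining idea in Python (the unpack_deb and install_unpacked helpers are hypothetical placeholders, not dpkg internals): the next package is decompressed in a worker process while the current one is being installed.

          from concurrent.futures import ProcessPoolExecutor

          def unpack_deb(path):
              # Placeholder: extract control.tar.* / data.tar.* to a staging directory.
              return path + ".staged"

          def install_unpacked(staged):
              # Placeholder: move files into place, run maintainer scripts.
              print("installing", staged)

          def install_all(debs):
              if not debs:
                  return
              with ProcessPoolExecutor(max_workers=1) as pool:
                  pending = pool.submit(unpack_deb, debs[0])
                  for nxt in debs[1:]:
                      staged = pending.result()               # wait for the current package
                      pending = pool.submit(unpack_deb, nxt)  # start decompressing the next one
                      install_unpacked(staged)                # install while it decompresses
                  install_unpacked(pending.result())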

          Comment


          • #25
            Originally posted by F.Ultra View Post

            Ah, I see. But aren't such apps usually distributed as a single large binary bash installer (at least some closed-source disk-array utilities that I have used over the years have been)?
            That's a horrible mess if you have dependencies, particularly if it's a 32-bit app. A .deb fixes that for you and can easily be uninstalled (along with its now-vacant dependencies) or upgraded.

            In fact, if I end up with an app (like the games from GOG), the first thing I do is make a .deb out of it.

            Comment


            • #26
              Originally posted by discordian View Post
              Tar has no compression at all; compressors like xz could already use blocks that can be decompressed in parallel (using those independent blocks would reduce the compression ratio, however).
              Yes, but the problem is that you can't determine which compressed blocks form a file, so you need to merge the results back in RAM and then extract from the tar.

              Comment


              • #27
                Originally posted by nils_ View Post
                Yes, but the problem is that you can't determine which compressed blocks form a file, so you need to merge the results back in RAM and then extract from the tar.
                Any high-compression format merges files and compresses them as a whole stream, typically ignoring file boundaries for blocks as well. And as long as you have to uncompress all the files anyway, I don't know what you're getting at with this comment.
                The only issue is that you need the additional space in a RAM disk, which won't be a problem for most packages.

                Comment


                • #28
                  Originally posted by discordian View Post
                  Any high-compression format merges files and compresses them as a whole stream, typically ignoring file boundaries for blocks as well. And as long as you have to uncompress all the files anyway, I don't know what you're getting at with this comment.
                  The only issue is that you need the additional space in a RAM disk, which won't be a problem for most packages.
                  What I'm getting at is that to speed up the whole process further, you would probably need to be able to also write the output files in parallel, which isn't possible with tar.

                  Comment


                  • #29
                    Originally posted by nils_ View Post

                    What I'm getting at is that to speed up the whole process further, you would probably need to be able to also write the output files in parallel, which isn't possible with tar.
                    It's not about decompressing a single tar.gz / tar.xz in parallel, but about decompressing many tar.xz / tar.gz archives in parallel. We actually do this in our debrep tool when parsing data from Debian archives. Every core is pinned at 100%.
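
                    A minimal sketch of that approach in Python (standard library only; the archive paths and target directory are hypothetical, and this is not how debrep itself is implemented): one worker process per archive, so the xz decompression runs in parallel across archives.

                    import glob
                    import tarfile
                    from concurrent.futures import ProcessPoolExecutor

                    def extract(path):
                        # Each worker process decompresses and unpacks one archive.
                        with tarfile.open(path, "r:xz") as tar:
                            tar.extractall(path="/tmp/unpacked")   # hypothetical target directory
                        return path

                    if __name__ == "__main__":
                        archives = glob.glob("pool/*.tar.xz")      # hypothetical archive location
                        with ProcessPoolExecutor() as pool:        # defaults to one worker per core
                            for done in pool.map(extract, archives):
                                print("extracted", done)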

                    Comment
