Mozilla Firefox Switches To .tar.xz For Linux Packaging

  • Weasel
    Senior Member
    • Feb 2017
    • 4500

    #51
    Originally posted by Anux View Post
    Exactly, that's why I quoted the part that says 0.8% size increase. You seem to believe that downloading and decompression happen at the same time, but all package managers I know do it serially, so decompression time is added to download time.

    It obviously depends on your hardware: if you have a RasPi with gigabit internet, then zstd or lz4 is better; if you have a Threadripper on a 56k modem, then xz would be better.

    But since there is practically no difference in size with the change to zstd, people with slower hardware benefit while others hardly notice the difference.
    I don't think I got such close compression ratios. They probably aren't using max compression for xz, so it's their problem.

    It also sounds to me like they should have made it parallel, or at least install the required packages while downloading the next ones, rather than change every single package's compression method. Work smarter.

    Comment

    • Anux
      Senior Member
      • Nov 2021
      • 1941

      #52
      Originally posted by Weasel View Post
      I don't think I got such close compression ratios. They probably aren't using max compression for xz, so it's their problem.
      Of course; they used standard xz settings, I think. That's the reason, and it's why Mozilla also uses standard xz: they want to get their updates out immediately, not wait hours for compression to finish, and it would also increase resource usage/time on the client side.
      It also sounds to me like they should have made it parallel, or at least install the required packages while downloading the next ones, rather than change every single package's compression method. Work smarter.
      One doesn't exclude the other; it should be possible to order downloads by dependencies and start decompression/install while the rest is still downloading. Not sure why no one has done that yet, but maybe this will be a project for me in the future when I have more free time.
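      The pipelining idea above can be sketched with a two-stage producer/consumer pipeline; the package names and sizes below are made up for illustration, and real downloads/installs are stubbed out with sleeps:

      ```python
      # Sketch: overlap downloading and installing with a two-stage pipeline.
      # Packages arrive in dependency order; each is handed to the install
      # stage as soon as its download finishes, while later downloads continue.
      import queue
      import threading
      import time

      def downloader(packages, q):
          # Stage 1: fetch packages sequentially (simulated).
          for pkg, size_mb in packages:
              time.sleep(size_mb / 10000)  # stand-in for network transfer
              q.put(pkg)
          q.put(None)  # sentinel: no more packages

      def installer(q, installed):
          # Stage 2: decompress/install concurrently with stage 1.
          while (pkg := q.get()) is not None:
              installed.append(pkg)

      packages = [("libfoo", 50), ("libbar", 65), ("app", 200)]
      q = queue.Queue()
      installed = []
      t = threading.Thread(target=downloader, args=(packages, q))
      t.start()
      installer(q, installed)
      t.join()
      print(installed)  # packages install in the order their downloads finish
      ```

      A real package manager would additionally have to verify signatures and hold back a package until all of its dependencies are installed, but the overlap principle is the same.
      
      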

      Comment

      • Weasel
        Senior Member
        • Feb 2017
        • 4500

        #53
        Originally posted by Anux View Post
        Of course; they used standard xz settings, I think. That's the reason, and it's why Mozilla also uses standard xz: they want to get their updates out immediately, not wait hours for compression to finish, and it would also increase resource usage/time on the client side.
        Decompression is faster if you spend more time compressing since there's less data to pull through. Increasing dictionary size does increase memory usage when decompressing though.

        Them using default options is either laziness or they have no idea what they are doing. If they wanted the least strain while building (but more strain when people download it), they would have used gz or something cheap and crap. But when you build something once and it gets downloaded a million times, spend some fucking time compressing it well.
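        The point about spending more compression time can be checked quickly with Python's lzma module (which wraps liblzma, the same library behind xz); the payload here is just a repetitive stand-in, not a real package:

        ```python
        # Compare a fast xz preset against the maximum one on the same input.
        # Higher presets cost more compression time, but the receiver gets
        # fewer bytes to download and decompress.
        import lzma

        data = b"example payload " * 100_000  # stand-in for a package tarball

        fast = lzma.compress(data, preset=1)
        best = lzma.compress(data, preset=9 | lzma.PRESET_EXTREME)

        print(len(fast), len(best))  # preset 9e should not be larger
        ```

        One caveat the thread touches on later: higher presets use a larger dictionary, which raises memory requirements for decompression even though decompression speed stays roughly the same.
        
        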

        Originally posted by Anux View Post
        One doesn't exclude the other; it should be possible to order downloads by dependencies and start decompression/install while the rest is still downloading.
        It kinda does exclude it because it's completely pointless if the install is much faster than the download. You'll be completely bottlenecked by the download and the installer will sit around idling waiting for the next download.

        This is why xz > zstd for this kind of job.

        Comment

        • Anux
          Senior Member
          • Nov 2021
          • 1941

          #54
          Originally posted by Weasel View Post
          Decompression is faster if you spend more time compressing
          Hm, strange: it actually gets faster up to level 5, then it stagnates.

          It kinda does exclude it because it's completely pointless if the install is much faster than the download. You'll be completely bottlenecked by the download and the installer will sit around idling waiting for the next download.
          Just a theoretical calculation, real numbers depend on your system of course:
          download speed 10 MB/s
          xz decomp speed 10 MB/s
          zstd decomp speed 100 MB/s
          install speed 100 MB/s
          pkg size 200 MB
          pkg size xz 50 MB
          pkg size zstd 65 MB

          Time for xz: 50 / 10 + 50 / 10 + 200 / 100 = 12 s
          Time for zstd: 65 / 10 + 65 / 100 + 200 / 100 ≈ 9.2 s

          That is a big difference, and you can't expect everyone to have a high-end CPU; that's not what sells the most. Obviously, the faster the CPU, the less influence decompression time has in this case.
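          The back-of-the-envelope numbers above can be reproduced with a tiny model of a serial download → decompress → install pipeline (speeds in MB/s, sizes in MB, all taken from the example figures above):

          ```python
          # Serial pipeline: each stage must finish before the next starts,
          # so the total is the sum of the three stage times.
          def total_time(compressed_mb, download_mbps, decomp_mbps,
                         installed_mb=200, install_mbps=100):
              return (compressed_mb / download_mbps   # download
                      + compressed_mb / decomp_mbps   # decompress
                      + installed_mb / install_mbps)  # install to disk

          xz_time = total_time(50, download_mbps=10, decomp_mbps=10)
          zstd_time = total_time(65, download_mbps=10, decomp_mbps=100)
          print(xz_time, zstd_time)  # 12.0 s vs ~9.15 s
          ```

          The model makes the trade-off explicit: xz wins 1.5 s on download but loses 4.35 s on decompression at these example speeds.
          
          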

          Comment

          • energyman
            Senior Member
            • Jul 2008
            • 1754

            #55
            But with xz you can start downloading the next pkg while the zstd one would still be downloading, so across several pkgs xz will win.

            Comment
