Ubuntu 13.04 To Look At XZ-Compressed Packages


  • phoronix
    started a topic Ubuntu 13.04 To Look At XZ-Compressed Packages

    Ubuntu 13.04 To Look At XZ-Compressed Packages

    Phoronix: Ubuntu 13.04 To Look At XZ-Compressed Packages

    Another interesting topic for the Ubuntu 13.04 Developer Summit later this month is about using XZ compression by default for its packages, which would lead to a reduction in file-size...

    http://www.phoronix.com/vr.php?view=MTIwNTk

  • chithanh
    replied
    Be aware that I only referred to decompression speed, not compression ratio. For meaningful results on compression ratio, a more diverse benchmark would indeed be necessary.

    One can however say with some confidence that the relative decompression speed will not change on different types of data.



  • JanC
    replied
    Originally posted by chithanh View Post
    xz (lzma) is an ok choice, but it is by far not the best compressor. The only advantage is that it is already installed by default almost everywhere. If decompression speed is important, then e.g. lzham would be more suitable.

    http://mattmahoney.net/dc/text.html
    That benchmark is not very useful on its own, as it only tests compression of text files (and it's a text file with very specific characteristics too), and they only test with the highest available compression ratio. Most .deb packages don't consist of XML dumps of Wikipedia and it's unlikely that they use the currently used compressors at their highest compression ratio (because often that affects the speed or memory use too much).

    Edit: note that I'm not saying that lzham does badly on binaries, just that that page is not useful as a test for .deb compression (unless it's a .deb that contains mostly text files maybe).
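The point about preset levels and data type can be illustrated with a small experiment (a synthetic sketch using Python's stdlib `lzma` binding; the text and binary-like inputs are made up, and real .deb contents will behave differently):

```python
# Illustrates the point above: lzma preset choice trades compression
# ratio against time/memory, and text vs. binary-like input compress
# very differently. Synthetic data only.
import lzma

# Highly repetitive text compresses extremely well.
text = b"the quick brown fox jumps over the lazy dog\n" * 20000
# Pseudorandom bytes stand in for hard-to-compress binary content.
binary_like = bytes((i * 2654435761) % 256 for i in range(800000))

for label, blob in [("text", text), ("binary-like", binary_like)]:
    for preset in (1, 6, 9):
        out = lzma.compress(blob, preset=preset)
        print(f"{label} preset {preset}: {len(blob)} -> {len(out)} bytes")
```

Running this shows the ratio gap between presets is large for compressible data and nearly irrelevant for the pseudorandom input, which is why a single text-corpus benchmark at maximum preset says little about .deb packages.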
    Last edited by JanC; 10-18-2012, 02:40 PM.



  • marrusl
    replied
    Originally posted by Ibidem View Post
    4. Delta-debs were proposed and rejected on the grounds that a lot of people skip at least one package release, and also you can't rely on the debs being present locally if they clean the package archive. If you want them, explain to the Ubuntu developers why those aren't a problem once you've read the relevant Brainstorm pages.
    Hmm. That's not the way I remember the discussion at UDS-O. I think I found the right Brainstorm link. But I don't see it as being rejected. What I found was marked "Being Implemented"... although obviously it stalled after that.

    What I do remember is that it was a blueprint for Oneiric but ended up getting blocked/postponed during that development cycle. I'm not sure how intrusive it really would be, but I suspect it was then a little too radical for an LTS. After that the interest seemed to die out. But I'm not positive what the whole story was. In any case, there wasn't a discussion at UDS-Q.

    I remember the UDS debdelta session a few cycles ago. It sounded a little ambitious (not nearly as easy as many seem to think) but doable. It is a bummer that it didn't end up landing. So far it's not on the schedule for UDS-R, but perhaps it will be brought up in the dpkg-xz session in a couple of weeks.



  • bug77
    replied
    Originally posted by chithanh View Post
    Yes, I was referring to the lzma algorithm, not the software package with the same name.
    There's your problem. When choosing a solution for something with such widespread use, it has to be quasi-ubiquitous and rock solid in the first place. Who knows, a few years down the road maybe that algorithm will prove itself, see widespread adoption and replace xz. But it has to happen in that order.



  • chithanh
    replied
    Originally posted by bug77 View Post
    Is lzham proven? Is it installed on nearly every Linux box?
    It is not installed everywhere; the distribution's installer and package manager would have to ensure that it is available before extracting lzham-compressed packages.



  • chithanh
    replied
    Yes, I was referring to the lzma algorithm, not the software package with the same name.

    The lzham algorithm itself is correct. The implementation could contain bugs: as lzham is not in widespread use, one might be less confident in the code. But even if you insert an extra checksumming and verification step, you would still be ahead of xz (lzma).
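    The extra verification step described above could look something like this (a minimal sketch using Python's stdlib `lzma` and `hashlib`, standing in for whatever decompressor the package manager actually uses; the function name and sample payload are hypothetical):

```python
# Sketch of an extra verification step: decompress an xz payload and
# check it against a known SHA-256 digest before trusting the result.
import hashlib
import lzma

def decompress_and_verify(compressed: bytes, expected_sha256: str) -> bytes:
    """Decompress xz data; raise if the payload digest doesn't match."""
    payload = lzma.decompress(compressed)
    digest = hashlib.sha256(payload).hexdigest()
    if digest != expected_sha256:
        raise ValueError(f"checksum mismatch: {digest} != {expected_sha256}")
    return payload

# Usage: round-trip some data with verification (hypothetical payload).
data = b"example package payload"
packed = lzma.compress(data)
restored = decompress_and_verify(packed, hashlib.sha256(data).hexdigest())
```

    Since fast decompressors leave time to spare, the cost of the digest check is easily absorbed.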



  • bug77
    replied
    Originally posted by chithanh View Post
    lzham offers much faster decompression than lzma (9 vs. 36 seconds in enwik9), at a very modest increase in archive size on average. It is even faster than gzip in decompression.
    Is lzham proven? Is it installed on nearly every Linux box?
    Because if you have even the slightest error compressing/decompressing, it literally means thousands of broken installations after one update. And you'd wish you'd spent a little more time decompressing.



  • XorEaxEax
    replied
    Originally posted by chithanh View Post
    lzham offers much faster decompression than lzma (9 vs. 36 seconds in enwik9), at a very modest increase in archive size on average. It is even faster than gzip in decompression.
    Well, lzma (as in the utils, not the overall compression algorithm) is no longer being developed in favour of xz, so any worthwhile comparison should be made against the xz utils rather than the lzma utils, as I assume there have been improvements made since the development switch to xz.
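    For a quick sanity check of relative decompression speed on your own machine, the stdlib codecs can be timed directly (a harness sketch on synthetic data; absolute numbers are machine- and data-dependent, and lzham has no stdlib binding so only gzip and xz are shown):

```python
# Rough harness for comparing decompression speed of stdlib codecs.
# Synthetic, repetitive input; real package data will differ.
import gzip
import lzma
import timeit

data = bytes(range(256)) * 4096          # ~1 MiB of repetitive input
gz = gzip.compress(data)
xz = lzma.compress(data)

for name, blob, decompress in [("gzip", gz, gzip.decompress),
                               ("xz", xz, lzma.decompress)]:
    elapsed = timeit.timeit(lambda: decompress(blob), number=50)
    print(f"{name}: {elapsed:.3f}s for 50 rounds, "
          f"{len(blob)} bytes compressed")
```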



  • chithanh
    replied
    lzham offers much faster decompression than lzma (9 vs. 36 seconds in enwik9), at a very modest increase in archive size on average. It is even faster than gzip in decompression.

