Ubuntu 13.04 To Look At XZ-Compressed Packages

Be aware that I was referring only to decompression speed, not compression ratio. For meaningful results on compression ratio, a more diverse benchmark would indeed be necessary.
One can, however, say with some confidence that the relative decompression speeds will not change across different types of data.
Originally posted by chithanh:
xz (lzma) is an ok choice, but it is by far not the best compressor. The only advantage is that it is already installed by default almost everywhere. If decompression speed is important, then e.g. lzham would be more suitable.
http://mattmahoney.net/dc/text.html
Edit: note that I'm not saying that lzham does badly on binaries, just that that page is not useful as a test for .deb compression (unless it's a .deb that contains mostly text files, maybe).
Last edited by JanC; 18 October 2012, 02:40 PM.
-
Originally posted by Ibidem:
4. Delta-debs were proposed and rejected on the grounds that a lot of people skip at least one package release, and also you can't rely on the debs being present locally if they clean the package archive. If you want them, explain to the Ubuntu developers why those aren't a problem once you've read the relevant Brainstorm pages.
What I do remember is that it was a blueprint for Oneiric but ended up getting blocked/postponed during that development cycle. I'm not sure how intrusive it really would be, but I suspect it was then a little too radical for an LTS. After that the interest seemed to die out. But I'm not positive what the whole story was. In any case, there wasn't a discussion at UDS-Q.
I remember the UDS debdelta session a few cycles ago. It sounded a little ambitious (not nearly as easy as many seem to think) but doable. It is a bummer that it didn't end up landing. So far it's not on the schedule for UDS-R, but perhaps it will be brought up in the dpkg-xz session in a couple of weeks.
-
Originally posted by bug77:
Is lzham proven? Is it installed on nearly every Linux box?
-
Yes, I was referring to the lzma algorithm, not the software package with the same name.
The lzham algorithm itself is correct. The implementation could contain bugs: as lzham is not in widespread use, one might be less confident in the code. But even if you insert an extra checksumming and verification step, you would still be ahead of xz (lzma).
-
Originally posted by chithanh:
lzham offers much faster decompression than lzma (9 vs. 36 seconds in enwik9), at a very modest increase in archive size on average. It is even faster than gzip in decompression.
Because if you have even the slightest error compressing/decompressing, it literally means thousands of broken installations after one update. And you'd wish you had spent a little more time decompressing.
-
lzham offers much faster decompression than lzma (9 vs. 36 seconds in enwik9), at a very modest increase in archive size on average. It is even faster than gzip in decompression.
-
Originally posted by chithanh:
xz (lzma) is an ok choice, but it is by far not the best compressor. The only advantage is that it is already installed by default almost everywhere. If decompression speed is important, then e.g. lzham would be more suitable.
http://mattmahoney.net/dc/text.html
So a solution which provides great compression and great decompression speed is likely a prime candidate. On my machines the packages I get from the Arch repos (xz-compressed) unpack and install very quickly, but then again I have a Core i5 and a Core i7, so it's hard for me to judge how effective it is overall.
Still, lzma should have proved itself as striking a good balance between compression/decompression speed and compression ratio, given that it is used in so many compression tools.