Google Unveils "Zopfli" Compression Algorithm
Phoronix: Google Unveils "Zopfli" Compression Algorithm
Google has announced Zopfli, a new general purpose data compression library that's open-source. Zopfli implements the Deflate compression algorithm that yields a smaller output size than previous techniques...
Didn't 7zip already deliver gains of 2-3% with less CPU time required?
Is this improvement really even worth a Google engineer's time?
I'm curious: it's the same algorithm, so can zlib decompressors handle it?
Originally Posted by http://googledevelopers.blogspot.com/2013/02/compress-data-more-densely-with-zopfli.html
Last edited by Ibidem; 03-01-2013 at 05:10 PM.
There's a mistake in the article. It's not 2-3x slower, it's 10-100x slower.
All the same, it looks cool. The great thing about it is that it's an implementation of Deflate, so existing clients can decompress it just the same; it simply shaves an additional 3-8% off serving any static content where you can devote the server resources to it.
For sure, more modern algorithms such as those used by 7zip or bzip2 (or a myriad of others) are better choices where you can dictate that the client can decode them, but for mobile especially, a low-complexity decoder that does better than stock zlib is great.
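The compatibility point above can be sketched in a few lines. This uses Python's standard zlib as a stand-in for the encoder (the Zopfli bindings may not be installed everywhere); since Zopfli emits a standards-conforming Deflate stream, a stock zlib inflater decodes it exactly the same way it decodes this:

```python
import zlib

data = b"hello, deflate world! " * 100

# Any conforming Deflate encoder (zlib -9, 7-zip's deflate, Zopfli)
# produces a stream that a stock zlib inflater can decode. Here zlib
# itself stands in for the encoder.
compressed = zlib.compress(data, 9)

# A vanilla client-side decompressor recovers the original bytes.
assert zlib.decompress(compressed) == data
print(f"{len(data)} bytes -> {len(compressed)} bytes")
```

The only difference with Zopfli is on the encoding side: it spends far more CPU searching for a smaller valid Deflate stream, while the decode path stays unchanged.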
Since I'm already using advdef for these purposes (zlib-compatible, but better compression), I had to run a quick bench.
gzip and pigz with -9
zopfli defaults (15 rounds)
Time not included in this chart as only zopfli and advdef were timed.
It compresses about 2% better than advdef, while using 30% more time. Advdef uses the 7-zip deflate algo.
bytes  ratio    file
24278  1.04809  pigz.gz
24277  1.04805  gzip.gz
23591  1.01843  ad/gzip.gz
23591  1.01843  ad/pigz.gz
23164  1.00000  ad/zop.gz
23164  1.00000  zop.gz
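For what it's worth, the ratio column is just each file size divided by the smallest (zopfli) output; a quick sketch to reproduce it from the sizes above:

```python
# Reproduce the benchmark's ratio column: size relative to the best
# (zopfli) result of 23164 bytes.
sizes = {
    "pigz.gz": 24278,
    "gzip.gz": 24277,
    "ad/gzip.gz": 23591,
    "ad/pigz.gz": 23591,
    "ad/zop.gz": 23164,
    "zop.gz": 23164,
}
best = min(sizes.values())
for name, size in sizes.items():
    print(f"{size}  {size / best:.5f}  {name}")
```

So gzip/pigz output is about 4.8% larger than zopfli's, and advdef's about 1.8% larger.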
Can I use it for compressing initrd?
Well, from what I read, this was done in the 20% of work time a Google developer is allowed to spend on whatever they want. And yes, I can see that if you serve a lot of compressed static content (deflate being the compression algorithm all browsers support), you'd likely be happy to shave 3-8% off your bandwidth by compressing said content with zopfli.
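The usual way to cash that in is to precompress static files offline (where Zopfli's slow encode doesn't matter) and let the web server hand out the .gz file directly. A minimal sketch for nginx, assuming the `ngx_http_gzip_static_module` is compiled in and the path is illustrative:

```nginx
# After running e.g. `zopfli app.css` offline, app.css.gz sits next to
# app.css. gzip_static makes nginx serve the precompressed file to any
# client that sends Accept-Encoding: gzip, with no per-request CPU cost.
location /static/ {
    gzip_static on;
}
```

Since the expensive compression happens once at deploy time, the 10-100x encode slowdown is irrelevant to request latency.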
Originally Posted by MartinN
No, it's not going to cause a revolution on the web; it will simply let certain content use less bandwidth. As such, it is a nice tool.
Parallel Zopfli compressor
Zopfli has had some interesting developments recently:
First, its inclusion as compression level 11 in pigz.
Second, Charles Bloom's take on Zopfli here.