New Compression Codecs Risk Making Zlib Obsolete


  • New Compression Codecs Risk Making Zlib Obsolete

    Phoronix: New Compression Codecs Risk Making Zlib Obsolete

    Zlib is likely the most widely-used data compression library on open-source systems, but it's now at great risk of becoming obsoleted by more modern codecs for data compression...


  • #2
    I use zlib in a lot of memory-constrained places. Brotli, with its huge dictionary, is completely out of the question; I'll reserve judgement on BitKnit until I can see the source. But so far it looks like zlib won't be obsolete any time soon.

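The memory-constrained use case above comes down to zlib's tunable window and memory levels. Below is a minimal sketch of those knobs using Python's zlib bindings (the same parameters exist as deflateInit2 arguments in the C API); the 2 KiB window and memLevel of 2 are illustrative choices, not recommendations.

```python
import zlib

# Deflate's working memory is dominated by the LZ77 window (2**wbits bytes)
# and the internal hash tables (controlled by memLevel). Shrinking both
# trades some compression ratio for a much smaller footprint.
data = b"example payload " * 1024

small = zlib.compressobj(level=6, method=zlib.DEFLATED,
                         wbits=11,    # 2 KiB window instead of the default 32 KiB
                         memLevel=2)  # smaller hash tables than the default of 8
compressed = small.compress(data) + small.flush()

# Decompression only needs a window at least as large as the one used above.
assert zlib.decompress(compressed, wbits=11) == data
print(len(data), "->", len(compressed), "bytes with a 2 KiB window")
```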


  • #3
    Why bother with small steps? Just write a library that is capable of analysing the content and picking the best algorithm for each file. Or treat the whole data as a stream and switch algorithms based on regions of the stream. Dynamically and on-the-fly. I'm (semi)joking.

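The "pick the best algorithm per region" idea can at least be prototyped by brute force: trial-compress each chunk with a few candidate codecs and keep the smallest result. Here is a toy sketch using only standard-library codecs; the chunk size and the one-byte codec id are framing invented for this example, not any real container format.

```python
import bz2, lzma, zlib

CODECS = {0: zlib.compress, 1: bz2.compress, 2: lzma.compress}
DECODERS = {0: zlib.decompress, 1: bz2.decompress, 2: lzma.decompress}
CHUNK = 64 * 1024  # arbitrary region size

def compress_adaptive(data: bytes) -> bytes:
    """Per chunk: try every codec, keep the smallest output.

    Framing (invented for this sketch): 1 codec-id byte, 4 length bytes,
    then the compressed chunk."""
    out = bytearray()
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        cid, best = min(((c, fn(chunk)) for c, fn in CODECS.items()),
                        key=lambda pair: len(pair[1]))
        out += bytes([cid]) + len(best).to_bytes(4, "big") + best
    return bytes(out)

def decompress_adaptive(blob: bytes) -> bytes:
    out, pos = bytearray(), 0
    while pos < len(blob):
        cid, size = blob[pos], int.from_bytes(blob[pos + 1:pos + 5], "big")
        out += DECODERS[cid](blob[pos + 5:pos + 5 + size])
        pos += 5 + size
    return bytes(out)

sample = b"A" * 100_000 + bytes(range(256)) * 400
assert decompress_adaptive(compress_adaptive(sample)) == sample
```

The obvious catch is that trial compression multiplies the CPU cost by the number of candidate codecs, which is part of why the comment is only half serious.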


  • #4
    I don't think zlib will be obsolete anytime soon either.
    It is well-established, widely implemented, and vetted code. The gains are still minor and not without trade-offs.
    I think the author forgets why something becomes a standard.
    CDs, MP3s, etc. are not the pinnacle of technology by any means, yet they managed to remain the standard for their respective technologies for a very long time.
    Every time a competitor came along, it was shot down because the standard had become ubiquitous and the competing ones did not offer any noticeable advantages to the average user on the receiving end.
    In this case it is the developers who are mostly on the receiving end. Do most developers really care about the compression algorithm, as long as it is well-rounded and has decent performance? It is just another library with another function. Of course, some will. But calling zlib soon-to-be obsolete...

    It is hard to appreciate just how ubiquitous zlib really is.
    A lot of protocols, data formats, boot-image compression, stream compression, and so on. Just about everything, everywhere relies on zlib. It is not going away anytime soon.



  • #5
    A more likely scenario is that LZMA and bzip2 are going to be made obsolete. Deflate (https://en.wikipedia.org/wiki/DEFLATE) is a very popular compression algorithm.



  • #6
    Well, decompression throughput might be important for many situations like package distribution, i.e. compress once, decompress many times. However, compression speed is not irrelevant to me either. E.g. I regularly compress data from computer experiments; in that situation I compress more often than I decompress, as I generate a lot of data that is not always needed in the end.

    -> Talking about zlib being obsolete without giving compression speeds (which vary a lot more than decompression speeds) seems a bit over the top.

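The compress-often point is easy to check: deflate's compression time can swing by an order of magnitude across levels, while decompression time barely moves. A rough timing sketch (the numbers depend entirely on the data and the machine):

```python
import time, zlib

# Repetitive but non-trivial test data; real experiment output will differ.
data = (b"measurement,run,value\n" + b"1,2,3.14159\n" * 50_000) * 4

for level in (1, 6, 9):
    t0 = time.perf_counter()
    packed = zlib.compress(data, level)
    t_compress = time.perf_counter() - t0

    t0 = time.perf_counter()
    zlib.decompress(packed)
    t_decompress = time.perf_counter() - t0

    print(f"level {level}: ratio {len(data) / len(packed):5.1f}, "
          f"compress {t_compress * 1000:6.1f} ms, "
          f"decompress {t_decompress * 1000:6.1f} ms")
```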


  • #7
    "Risk of making zlib obsolete"? I don't see a risk; I welcome older implementations and techniques becoming obsolete when new and superior technologies replace them. As long as the new algorithms are Free and open source, I don't mind at all.



  • #8
    Originally posted by tillschaefer
    Well, decompression throughput might be important for many situations like package distribution, i.e. compress once, decompress many times. However, compression speed is not irrelevant to me either. E.g. I regularly compress data from computer experiments; in that situation I compress more often than I decompress, as I generate a lot of data that is not always needed in the end.

    -> Talking about zlib being obsolete without giving compression speeds (which vary a lot more than decompression speeds) seems a bit over the top.
    Just look at the assault of data caps. Devices will pretty much have to compress before sending, if ISPs have their way. Thus the ability to compress fast will not go away.



  • #9
    Does the same issue from this blog post apply to this case?

    In 2007 I wrote about using PNGout to produce amazingly small PNG images [https://blog.codinghorror.com/getting-the-most-out-of-png/]. I still refer to this topic frequently, as seven years later, the average PNG I encounter on the Internet is very unlikely to be optimized. For example, consider this recent Perry Bible...

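PNGout itself is a standalone optimizer. As a rough stand-in for the blog's point, and assuming the third-party Pillow library is installed, re-saving a PNG with Pillow's optimize flag asks its zlib-based encoder to try harder; this is not PNGout and will generally not match its results.

```python
# Assumes Pillow is installed (pip install Pillow); a stand-in for PNGout,
# not the tool the quoted blog post is actually about.
import os
from PIL import Image

src, dst = "input.png", "optimized.png"   # hypothetical file names
Image.open(src).save(dst, optimize=True)  # re-encode with extra zlib effort
print(os.path.getsize(src), "->", os.path.getsize(dst), "bytes")
```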


  • #10
    Originally posted by bug77
    Why bother with small steps? Just write a library that is capable of analysing the content and picking the best algorithm for each file. Or treat the whole data as a stream and switch algorithms based on regions of the stream. Dynamically and on-the-fly. I'm (semi)joking.
    They compress concatenated files without restarting, so compressing files one by one will decrease the compression ratio. What you suggested amounts to analyzing all partial permutations of the source files, which is simple, but very time consuming.

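The concatenation point is easy to demonstrate: compressed as one stream, later files can back-reference data from earlier ones, while per-file compression starts from an empty window every time. A small sketch with zlib, using synthetic files that share a 4 KiB block (standing in for common boilerplate):

```python
import os, zlib

# Three synthetic "files": a 4 KiB block shared by all of them plus a
# small unique tail each.
shared = os.urandom(4096)
files = [shared + os.urandom(512) for _ in range(3)]

one_by_one = sum(len(zlib.compress(f, 9)) for f in files)
as_one_stream = len(zlib.compress(b"".join(files), 9))

print("compressed one by one:   ", one_by_one, "bytes")
print("compressed as one stream:", as_one_stream, "bytes")
# The single stream wins because the later files can back-reference the
# shared block that is still inside deflate's 32 KiB window.
```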
