New Compression Codecs Risk Making Zlib Obsolete


  • bug77
    replied
    Originally posted by pal666 View Post
    Do you seed torrents from your device? Most traffic in the world is inbound.
    Not when every kid can upload a video of themselves crossing the street. Well, technically the video only has to be uploaded once and can be viewed many times, but you get my drift.

    Leave a comment:


  • pal666
    replied
    Originally posted by bug77 View Post
    Just look at the assault of data caps. Devices will pretty much have to compress before sending, if ISPs have their way. Thus the ability to compress fast will not go away.
    Do you seed torrents from your device? Most traffic in the world is inbound.

    Leave a comment:


  • pal666
    replied
    Originally posted by bug77 View Post
    Why bother with small steps? Just write a library that is capable of analysing the content and picking the best algorithm for each file. Or treat the whole data set as a stream and switch algorithms by region, dynamically and on the fly. I'm (semi-)joking.
    They compress concatenated files without restarting; compressing files one by one would decrease the compression ratio. So you are suggesting analysing all partial permutations of the source files, which is simple, but very time consuming.
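    A quick sketch of that concatenation effect, using Python's zlib module on made-up CSV-style sample data (the file contents and sizes here are purely illustrative):

        import zlib

        # Five small, similar "files" -- e.g. CSV exports sharing a header.
        header = b"timestamp,sensor_id,temperature,humidity,pressure\n"
        files = [header + b"2016-01-0%d,42,21.5,0.44,1013\n" % i for i in range(1, 6)]

        one_by_one = sum(len(zlib.compress(f, 9)) for f in files)
        concatenated = len(zlib.compress(b"".join(files), 9))

        print("compressed one by one:  ", one_by_one, "bytes")
        print("compressed concatenated:", concatenated, "bytes")
        # The concatenated stream normally comes out smaller, because the
        # dictionary built from the earlier files keeps helping with the later ones.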

    Leave a comment:


  • andrei_me
    replied
    Does the same issue described in this blog post apply to this case?

    In 2007 I wrote about using PNGout to produce amazingly small PNG images [https://blog.codinghorror.com/getting-the-most-out-of-png/]. I still refer to this topic frequently, as seven years later, the average PNG I encounter on the Internet is very unlikely to be optimized. For example, consider this recent Perry Bible …

    Leave a comment:


  • bug77
    replied
    Originally posted by tillschaefer View Post
    Well, decompression throughput might be important in many situations, like package distribution, i.e. compress once, decompress many times. However, compression speed does not seem irrelevant to me. E.g. I regularly compress data from computer experiments. In this situation I compress more often than I decompress, as I generate a lot of data that is not always needed in the end.

    -> Talking about an obsolete zlib without giving the compression speeds (which vary a lot more than decompression speeds) seems a bit over the top.
    Just look at the assault of data caps. Devices will pretty much have to compress before sending, if ISPs have their way. Thus the ability to compress fast will not go away.

    Leave a comment:


  • Azpegath
    replied
    "Risk of making zlib obsolete"? I don't see a risk, I welcome older implementations and techniques becoming obsolete when new and superior technologies replace them. As long as the new algorithms are Free and open source I don't mind at all.

    Leave a comment:


  • tillschaefer
    replied
    Well, decompression throughput might be important in many situations, like package distribution, i.e. compress once, decompress many times. However, compression speed does not seem irrelevant to me. E.g. I regularly compress data from computer experiments. In this situation I compress more often than I decompress, as I generate a lot of data that is not always needed in the end.

    -> Talking about an obsolete zlib without giving the compression speeds (which vary a lot more than decompression speeds) seems a bit over the top.
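    A rough, hedged illustration of that last point, using Python's zlib module on made-up, highly repetitive data (absolute numbers depend entirely on the machine and the input): compression time moves a lot with the level, while decompression time barely changes.

        import time
        import zlib

        # Invented experiment-style log data, repeated to get a sizeable input.
        data = b"experiment run 0042: value=3.14159 status=ok\n" * 50000

        for level in (1, 6, 9):
            t0 = time.perf_counter()
            packed = zlib.compress(data, level)
            t1 = time.perf_counter()
            zlib.decompress(packed)
            t2 = time.perf_counter()
            print("level %d: %8d bytes, compress %.4f s, decompress %.4f s"
                  % (level, len(packed), t1 - t0, t2 - t1))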

    Leave a comment:


  • eugene2k
    replied
    A more likely scenario is that LZMA and bzip2 are going to be made obsolete. Deflate (https://en.wikipedia.org/wiki/DEFLATE) is a very popular compression algorithm.

    Leave a comment:


  • milkylainen
    replied
    I don't think zlib will be obsolete anytime soon either.
    It is well-established, widely implemented and vetted code. The gains are still minor and not without trade-offs.
    I think the author forgets why something becomes a standard.
    CDs, MP3s, etc. are not the pinnacle of technology by any means, yet they managed to remain the standard in their respective fields for a very long time.
    Every time a competitor came along, it was shot down because the standard had become ubiquitous and the challengers did not offer any noticeable advantage to the average user on the receiving end.
    In this case it is mostly developers who are on the receiving end. Do most developers really care about the compression algorithm, as long as it is well rounded and has decent performance? It is just another library with another function. Of course, some will. But calling zlib soon to be obsolete...

    It is hard to appreciate just how ubiquitous zlib really is.
    A lot of protocols, compressed data formats, boot image compression, stream compression and so on. Just about everything, everywhere relies on zlib. It is not going away anytime soon.

    Leave a comment:


  • bug77
    replied
    Why bother with small steps? Just write a library that is capable of analysing the content and picking the best algorithm for each file. Or treat the whole data set as a stream and switch algorithms by region, dynamically and on the fly. I'm (semi-)joking.
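    Half-joking or not, a toy version of the idea fits in a few lines. The chunk framing, the one-byte tags and the 0.9 threshold below are invented purely for illustration; the only "analysis" is a trial compression of a small sample of each region:

        import zlib

        def pack_chunk(chunk: bytes) -> bytes:
            # Trial-compress a small sample to decide whether deflating the
            # whole region is worth the effort.
            sample = chunk[:4096]
            if len(zlib.compress(sample, 1)) < 0.9 * len(sample):
                return b"Z" + zlib.compress(chunk, 6)  # compressible: deflate it
            return b"R" + chunk                        # incompressible: store raw

        def unpack_chunk(packed: bytes) -> bytes:
            return zlib.decompress(packed[1:]) if packed[:1] == b"Z" else packed[1:]

    Already-compressed regions (JPEGs, video, archives) fail the trial and get stored raw, which is more or less what the joke was asking for, minus the hard part of doing it well.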

    Leave a comment:
