Hi everyone,
I thought the internet would already have this kind of information ('cause that's the internet, right?), but apparently the latest article published on this topic is at least four years old, and every article is somewhat narrow: none of them take ALL the possible alternatives into account. From multi-GB, multi-file .tar.gz archives that take several minutes just to list (because tar has no index) to archives of random files, from overall compression time/efficiency comparisons to multi-core extensions of the original methods, what rules of thumb can you give me, about anything?
I hope this actually becomes the Ultimate Compression Thread on the Internet and I can learn some good things!
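For context, the multi-core angle I'm asking about is things like piping tar through a parallel compressor. A minimal sketch of what I mean (the `demo` directory and filenames are just placeholders):

```shell
# Create a tiny sample directory to compress (hypothetical paths)
mkdir -p demo && printf 'hello\n' > demo/a.txt

# Multi-core compression: zstd's -T0 uses all available cores;
# tar's -I flag routes the archive through an external compressor.
tar -I 'zstd -T0' -cf demo.tar.zst demo

# Listing still scans the whole archive sequentially, because the
# tar format stores no central index -- hence the minutes-long waits
# on multi-GB archives I mentioned above.
tar -I zstd -tf demo.tar.zst
```

The same pattern works with pigz (parallel gzip) or pbzip2 in place of zstd, but I have no feel for how they trade off against each other.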