It has never even struck me that there is a need for this. File I/O is probably going to be a lot slower than decompression, so threading it will add nothing. And if the file is small enough to be completely cached, it's only going to take a second or two to decompress on a modern machine anyway, which once again makes threading uninteresting.
Should Tarballs Be On Their Way Out The Door In 2017?
If someone wants to ditch tar and replace it with something modern, then by all means look back some years to the XPK libraries that existed for the Amiga computers. A nice container format that allows the content to be packed with any weird compression format. All your application has to do is support the XPK format, and voila, it can compress and decompress anything supported by the XPK libs. Hence, all you need to do to get all your programs to support a new compression format is install an extra compression plugin for XPK, and magically all programs support it. They really did some things better in the old days!
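XPK itself is an Amiga-era library, but the plugin-dispatch idea it describes can be sketched in a few lines. This is a loose illustration, not XPK's actual format: the container tags the payload with a codec ID, and a single registry maps IDs to compressors, so adding one entry makes every caller support the new format. All names here are hypothetical.

```python
import gzip, lzma, zlib

# Hypothetical registry: a 4-byte codec tag -> (compress, decompress).
# Installing a "plugin" is just adding an entry here.
CODECS = {
    b"GZIP": (gzip.compress, gzip.decompress),
    b"LZMA": (lzma.compress, lzma.decompress),
    b"ZLIB": (zlib.compress, zlib.decompress),
}

def pack(tag: bytes, data: bytes) -> bytes:
    """Write a tiny container: 4-byte codec tag followed by the payload."""
    compress, _ = CODECS[tag]
    return tag + compress(data)

def unpack(blob: bytes) -> bytes:
    """Applications only need this one entry point; the container
    itself says which codec to use, so callers never change."""
    tag, payload = blob[:4], blob[4:]
    _, decompress = CODECS[tag]
    return decompress(payload)
```

The point of the design is that the dispatch lives in one place: programs call `unpack` and never name a codec, which is the property the comment is praising.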
http://www.dirtcellar.net
XZ is great and we shouldn't be changing it; more considerations than the final result are involved. Tar itself does have a few known limitations, but in no way should a kid get near such a low-level concern; XAR is the only thing I'd seen that comes close. Maintaining systems-implementation quality is brutally difficult, and clearly not something anyone who creates a C++ implementation intends to overcome.
Originally posted by Beherit:
Sorry, alpha_one_x86 and speculatrix, could you rephrase your posts? I can't understand what you're trying to say.
Tar doesn't have a "bad" compression ratio; it has no compression ratio at all, as it's not a compressor.
For what it's worth, I don't think it'll be replaced anytime soon. In the Windows world almost everything is .zip, even though superior competitors such as WinRAR and 7-Zip have been available for decades.
In the Linux/Unix/FOSS community, tar.gz was the de facto standard for everything until quite recently, when some distros started using the xz compressor. Which is quite a feat, as bzip, bzip2, and lzma have all been around much longer without getting much love.
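The split the comment describes can be shown directly: tar only bundles files, and the compressor is a separate, swappable layer bolted on afterwards (or piped through via tar's convenience flags). A minimal session, with throwaway file names:

```shell
set -e
mkdir -p demo && echo "hello" > demo/file.txt

# tar alone produces an uncompressed archive: no compression ratio involved
tar -cf demo.tar demo

# compression is a separate step, and any compressor will do
gzip -k demo.tar          # produces demo.tar.gz
xz   -k demo.tar          # produces demo.tar.xz

# the familiar -z and -J flags just run gzip/xz for you internally
tar -czf demo2.tar.gz demo
```

This is why swapping gzip for xz across distributions required no change to tar itself, only to the compressor invoked on its output.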
Originally posted by Staffan:
It has never even struck me that there is a need for this. The file I/O is probably going to be a lot slower than decompressing so threading it will add nothing, unless the file is small enough to be completely cached in which case it's only going to take a second or two to decompress it anyway on a modern machine, once again making threading uninteresting.