Should Tarballs Be On Their Way Out The Door In 2017?


  • #21
    It has never even struck me that there is a need for this. The file I/O is probably going to be a lot slower than the decompression, so threading will add nothing. The exception is a file small enough to be completely cached, but that will only take a second or two to decompress on a modern machine anyway, which again makes threading uninteresting.
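
    A quick way to sanity-check which side is the bottleneck on a given machine is to time the raw read and the decompression separately (big.tar.xz is just a placeholder name; ideally run the first command with a cold cache):

        time cat big.tar.xz > /dev/null      # raw read speed
        time xz -dc big.tar.xz > /dev/null   # single-threaded decompression speed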



    • #22
      Originally posted by kalrish View Post

      It happens to everyone.
      Good one, I didn't know that.



      • #23
        Originally posted by starshipeleven View Post
        The only thing I'd like to have is a SINGLE tool that compresses. As it is now, many tools on Windows (and also on Linux) first decompress the xz/gz/whatever into a tar, and then I need to open/decompress the tar too.
        tar x{z/j/J}
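
        Spelled out, those single-step invocations look like this (the archive names are just examples):

            tar xzf archive.tar.gz    # gzip
            tar xjf archive.tar.bz2   # bzip2
            tar xJf archive.tar.xz    # xz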



        • #24
          IMO XZ should be the one used; why Mozilla doesn't use it beats me.



          • #25
            BTW, if you want random access, you could use a tar of compressed files instead of a compressed tar. It will have a worse compression ratio, obviously.
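
            A minimal sketch of that approach, with placeholder file names:

                gzip -k logs/*.log                 # compress each file on its own, keeping the originals
                tar cf archive.tar logs/*.log.gz   # then archive the already-compressed files
                # later, extract and decompress a single member without unpacking everything else
                tar xf archive.tar logs/app.log.gz && gunzip logs/app.log.gz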



            • #26
              If someone wants to ditch tar and replace it with something modern, then by all means look back some years to the XPK libraries that existed for the Amiga computers. A nice container format that allows the content to be packed with any weird compression format. All your application has to do is support the XPK format and, voila, it will be able to compress and decompress any format supported by the XPK libs. Hence, all you need to do to get all your programs to support a new compression format is install an extra compression plugin for XPK, and magically all programs support it. They really did some things better in the old days!

              http://www.dirtcellar.net



              • #27
                XZ is great and we shouldn't be changing it; more considerations than the final result are involved. Tar itself does have a few known limitations, but in no way should a kid get near such a low-level concern; XAR is the only thing I've seen that gets close. Maintaining systems implementation quality is brutally difficult, and clearly not what anyone who creates a C++ implementation intends to overcome.



                • #28
                  Originally posted by Beherit View Post
                  Sorry, alpha_one_x86 and speculatrix, could you rephrase your posts? I can't understand what you're trying to say.

                  Tar doesn't have a "bad" compression ratio; it has no compression ratio at all, as it's not a compressor.

                  For what it's worth, I don't think it'll be replaced anytime soon. In the Windows world almost everything is .zip, even though superior competitors such as WinRAR and 7-Zip have been available for decades.

                  In the Linux/Unix/FOSS community, tar.gz was the de-facto standard for everything up until recently, when some distros started using the xz compressor. Which is quite a feat, as bzip, bzip2 and lzma have been around much longer without getting much love.
                  I think x86 was saying that if each file has its own context/dictionary, then the overall ratios will likely suffer for archives full of small files. I'm not sure, though.
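
                  One rough way to see that effect is to compare solid compression of a whole tar against compressing each small file on its own (smallfiles/ is a placeholder directory of plain files):

                      tar cf - smallfiles | xz -9 | wc -c                    # one shared context for everything
                      for f in smallfiles/*; do xz -9c "$f"; done | wc -c    # the context restarts for every file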



                  • #29
                    Originally posted by pal666 View Post
                    tar x{z/j/J}
                    Also these days tar will detect the format and use the right decompressor. xvf are my usual flags.
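
                    With a reasonably recent GNU tar the compressor doesn't even need to be named on extraction (the archive names are just examples):

                        tar xvf archive.tar.gz   # gzip detected automatically
                        tar xvf archive.tar.xz   # same command, xz detected automatically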



                    • #30
                      Originally posted by Staffan View Post
                      It has never even struck me that there is a need for this. The file I/O is probably going to be a lot slower than the decompression, so threading will add nothing. The exception is a file small enough to be completely cached, but that will only take a second or two to decompress on a modern machine anyway, which again makes threading uninteresting.
                      This will not be true for much longer. I already have volumes on M.2 drives that read faster than I can decompress high-ratio gzip or mid-ratio LZMA/LZMA2. LZ4 and a couple of others still decompress faster than the flash can read, but I'm thinking perhaps they won't forever.
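
                      For what it's worth, a rough way to compare decoder throughput against the drive (the file names are placeholders, and pv just reports pipe throughput):

                          xz -dc big.tar.xz | pv > /dev/null     # LZMA2 decode rate
                          lz4 -dc big.tar.lz4 | pv > /dev/null   # LZ4 decode rate, usually far higher
                          dd if=big.tar of=/dev/null bs=1M       # raw sequential read (beware the page cache)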

