Should Tarballs Be On Their Way Out The Door In 2017?


  • #1

    Phoronix: Should Tarballs Be On Their Way Out The Door In 2017?

    The .tar archive file format has been around for decades, but GNOME / free software developer Jussi Pakkanen suggests that it's time for a modern solution...


  • #2
    Hello,
    As with squashfs, supporting random file access hurts the compression ratio, so any replacement needs to take that trade-off into account.
    Tar also compresses small files poorly, because the padding between entries adds noise to the stream.
    I have experimented with this and confirmed it on the CatchChallenger project while researching compression.
    My preference is to keep the container format and the compression format separate (tar or another container + gzip/xz), so they can be split apart and the compression type changed easily.
    Cheers,
    Developer of Ultracopier/CatchChallenger and CEO of Confiared
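
    A minimal sketch of the padding effect (illustrative only, not from the post): tar writes a 512-byte header per entry and pads each file's data up to a 512-byte boundary, so an archive of tiny files is mostly padding before the compressor ever sees it. Using only Python's standard library:

      # Sketch: measure how tar's per-entry header and 512-byte padding
      # inflate an archive of small files before compression.
      import gzip
      import io
      import tarfile

      payloads = [f"tiny file number {i}\n".encode() for i in range(1000)]

      # Build an uncompressed tar in memory.
      buf = io.BytesIO()
      with tarfile.open(fileobj=buf, mode="w") as tar:
          for i, data in enumerate(payloads):
              info = tarfile.TarInfo(name=f"file{i}.txt")
              info.size = len(data)
              tar.addfile(info, io.BytesIO(data))

      raw = b"".join(payloads)
      tarred = buf.getvalue()

      print("raw payload bytes:  ", len(raw))     # ~21 KB
      print("tar container bytes:", len(tarred))  # ~1 KB per tiny file
      print("gzip(raw):          ", len(gzip.compress(raw)))
      print("gzip(tar):          ", len(gzip.compress(tarred)))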



    • #3
      I find squashfs to be a very useful file format, although it's immutable. I've used it successfully for archiving web sites where you want a snapshot of the entire site: use httrack to grab the site and fix all the links. For a bonus, use the namazu free-text database to make a searchable index.

      E.g. the CIA World Factbook comes as a zip, which isn't so useful, so unpack it, turn it into a squashfs image and add an nmz index.
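
      A minimal sketch of that workflow (illustrative; the file names are hypothetical, and it assumes mksquashfs from squashfs-tools is installed):

        # Unpack a zip snapshot, then repack it as a mountable squashfs image.
        import subprocess
        import zipfile

        with zipfile.ZipFile("factbook.zip") as z:
            z.extractall("factbook")

        # -comp xz is optional; mksquashfs's default compressor also works.
        subprocess.run(["mksquashfs", "factbook", "factbook.sqsh", "-comp", "xz"],
                       check=True)

        # The result can then be mounted read-only, e.g.:
        #   mount -t squashfs -o loop factbook.sqsh /mnt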



      • #4
        Sorry, alpha_one_x86 and speculatrix, could you rephrase your posts? I can't understand what you're trying to say.

        Tar doesn't have a "bad" compression ratio; it has no compression ratio at all, as it's not a compressor.

        For what it's worth, I don't think it'll be replaced anytime soon. In the Windows world almost everything is .zip, even though superior competitors such as WinRAR and 7-Zip have been available for decades.

        In the Linux/Unix/FOSS community, tar.gz was the de-facto standard for everything until recently, when some distros started using the xz compressor. Which is quite a feat, as bzip, bzip2, and lzma have been around much longer without getting much love.



        • #5
          The only thing I'd like to have is a SINGLE tool that handles both steps. As it is now, many tools on Windows (and also on Linux) first decompress the xz/gz/whatever into a tar, and then I need to open/decompress the tar too.

          Can this next-gen compressor do both tasks at once?
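
          A minimal sketch showing both layers handled in one step with Python's tarfile (the file name is hypothetical):

            # One call opens the xz layer and the tar layer together;
            # mode "r:*" would auto-detect gzip/bzip2/xz instead.
            import tarfile

            with tarfile.open("example-1.0.tar.xz", mode="r:xz") as tar:
                tar.extractall(path="example-1.0")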



          • #6
            Beherit, squashfs is a useful way of taking a bunch of files and storing them in compressed form in a single file that is also a mountable file system. Whilst tar with compression achieves the first, it fails at the second, as does zip.



            • #7
              Methinks that whatever someone intends to replace it with, it shouldn't be born on the whim of a moment without further consideration.



              • #8
                Originally posted by Beherit:
                In the Linux/Unix/FOSS community, tar.gz was the de-facto standard for everything until recently, when some distros started using the xz compressor. Which is quite a feat, as bzip, bzip2, and lzma have been around much longer without getting much love.
                xz is the successor to lzma, and the "XZ Utils" package which provides the "xz" command was previously called "LZMA Utils".

                It's not really fair to say that xz burst onto the scene while lzma had been around much longer, when the former is just a revised version of the latter, reworked to better match the feature set of gzip's on-disk format.

                As for bzip2, I suspect it may just not have been a significant enough improvement over gzip. Not only does the LZMA algorithm compress better, in my experience it also gives you more compression per unit of CPU time spent.
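
                That claim is easy to test against your own data; a minimal sketch with Python's standard library (the input file name is hypothetical):

                  # Compare compressed size and CPU time for gzip, bzip2, and LZMA.
                  import bz2
                  import gzip
                  import lzma
                  import time

                  with open("sample.bin", "rb") as f:
                      data = f.read()

                  for name, compress in (("gzip", gzip.compress),
                                         ("bzip2", bz2.compress),
                                         ("lzma/xz", lzma.compress)):
                      start = time.process_time()  # CPU time, not wall-clock
                      out = compress(data)
                      cpu = time.process_time() - start
                      print(f"{name:8} ratio={len(out) / len(data):.3f} cpu={cpu:.2f}s")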



                • #9
                  Originally posted by starshipeleven:
                  The only thing I'd like to have is a SINGLE tool that handles both steps. As it is now, many tools on Windows (and also on Linux) first decompress the xz/gz/whatever into a tar, and then I need to open/decompress the tar too.

                  Can this next-gen compressor do both tasks at once?
                  I can't imagine that being a problem, but it goes against the Unix philosophy of "less is more". I think it's cleaner to have one utility that takes a stream and compresses it, and another utility that takes whatever and turns it into a stream. And before you ask: it took me quite some time to learn tar's CLI (not because it's particularly difficult, but because I used it rarely and kept forgetting).
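
                  The split is easy to see in code; a minimal sketch (the output name and directory are hypothetical) where the archiver only produces a byte stream and the compressor only consumes one:

                    # Compose "turn files into a stream" with "compress a stream"
                    # to get a .tar.xz without a monolithic tool.
                    import lzma
                    import tarfile

                    with lzma.LZMAFile("project.tar.xz", mode="wb") as stream:
                        # "w|" writes the tar as a pure byte stream into `stream`.
                        with tarfile.open(fileobj=stream, mode="w|") as tar:
                            tar.add("project/")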



                  • #10
                    TAR has been around for so long because it is useful and widely supported, not because nobody could think of anything better.

                    He's just another idiot who uses the wrong tools, then thinks old equals bad and wants to implement yet another archiving tool. As if we needed yet another archiver.

                    And calling it JPAK when his name is Jussi Pakkanen sure ticks another box.

