Should Tarballs Be On Their Way Out The Door In 2017?


  • #31
    Originally posted by Beherit:
    In the Windows world almost everything is .zip, even though superior competitors such as Winrar and 7zip have been available for decades.
    I wish ...

    I keep encountering that proprietary .rar crap even though .zip or .7z would be perfectly good enough. .cbr files have been a frequent source of trouble for me (incompatibilities between Okular and unrar), whereas .cbz works perfectly fine.



    • #32
      My main gripe with tar is that it doesn't preserve extended attributes (xattrs) and ACLs by default. That IMHO should be the #1 requirement for a modern replacement.

      Otherwise I agree that archiving and compression should be handled by the same program so that random-access reads can be efficient.
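
      For what it's worth, GNU tar can carry that metadata if you ask it to; a minimal sketch, assuming GNU tar on Linux and a placeholder archive and directory name:

      tar --create --xattrs --acls --file archive.tar somedir/
      tar --extract --xattrs --acls --file archive.tar

      It is opt-in rather than the default, though, which is rather the point.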



      • #33
        NO, there's a place for them as self-contained, independent entities; generate them on the spot from your git frontend if need be.
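
        Generating one on the spot is a one-liner with git archive; a quick sketch, where the tag name and output file are made up:

        git archive --format=tar.gz --prefix=myproject-1.0/ -o myproject-1.0.tar.gz v1.0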



        • #34
          Originally posted by techzilla:
          Maintaining systems implementation quality is brutally difficult, and clearly not what anyone who creates a C++ implementation intends to overcome.
          Don't like brutal difficulties? Learn C++.



          • #35
            Originally posted by microcode:
            This will soon not be true. I already have volumes on M.2 drives which read faster than I can decompress high ratio gzip or mid-ratio LZMA/LZMA2.
            Read faster in terms of compressed or decompressed size? Because without compression you have to read the decompressed data, and that can be dozens of times larger.
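
            To put rough, made-up numbers on it: an NVMe volume reading 3 GB/s gets through 30 GB of uncompressed data in about 10 seconds. The same data at 10:1 compression is only 3 GB on disk and is read in about 1 second, but if the decompressor can only emit around 0.5 GB/s of decompressed output (roughly single-threaded gzip territory), decompression takes about 60 seconds and the uncompressed read wins. With a codec that decompresses at several GB/s the compressed path wins instead, so both posts can be right depending on the codec.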



            • #36
              Originally posted by kalrish:

              It happens to everyone.
              No it doesn’t. That’s what man pages are for.



              • #37
                Originally posted by kalrish:

                It happens to everyone.
                Surely

                tar --help

                is a valid command, since you still get a successful exit status?
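
                For what it's worth, with GNU tar that does seem to hold:

                tar --help > /dev/null; echo $?
                0

                --help prints the usage text and exits with status 0, so by the "exits successfully" yardstick it counts.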



                • #38
                  But why would one even want random file access? The only scenario I can think of where that would be useful is distributing updates so that only the changed files get unpacked, but we already have delta updates for that.



                  • #39
                    For file-based distribution of archives, this is sufficiently covered by both tar and zip. They are both in widespread use, and they both fill our needs pretty well. It's difficult to see the advantage of a new format here given how entrenched the existing formats are, particularly if the improvements are marginal at best.

                    What would be a little more forward-looking is investigating different mechanisms for the distribution of file archives, which represent a completely different way of thinking about the problem and solve problems that a new file format does not. Examples include:

                    - use of git shallow clones: pull the source directly from the upstream repository. This completely eliminates the need for an intermediary "transportable" file format, though one could continue to be offered for archival purposes.
                    - use of zfs recv: pull down a whole filesystem directly to your system (curl xxx | zfs recv /destination/path). Rather than downloading a tar or zip, you download and directly import the filesystem image. This could be used for source releases, git repos, binary builds, and large data files like game assets. The best part is that it can be incremental, which allows updates; for big multi-gigabyte datasets, incremental updates are much better than rsync in both speed and load on the client and server. The system serving the data doesn't even need to use ZFS: it can host the output of zfs send and the incremental streams as plain files, or generate them on the fly if it does use ZFS. This also has the nice effect of preserving all of the extended filesystem metadata, which not all zip and tar implementations preserve. (Concrete commands are sketched below.)
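
                    To make those concrete, a rough sketch (the URLs, pool, and snapshot names are all made up):

                    # one-time shallow fetch instead of downloading a tarball
                    git clone --depth 1 https://example.com/project.git

                    # publisher: full stream for the first release, incremental stream for the next
                    zfs send pool/data@release-1 > release-1.zfs
                    zfs send -i pool/data@release-1 pool/data@release-2 > release-1-to-2.zfs

                    # consumer: import the image once, then apply only the delta next time
                    curl https://example.com/release-1.zfs | zfs receive pool/data
                    curl https://example.com/release-1-to-2.zfs | zfs receive pool/data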



                    • #40
                      Maybe I'm looking at this the wrong way, but isn't every distro free to use its own format?
                      And isn't at least one distro already using xz?
                      I fail to see an issue here...
