Canonical Working On Zstd-Compressed Debian Packages For Ubuntu
-
Originally posted by xpris View Post
Be aware of zstd in the latest Linux. It looks like zstd is a stolen project and the original author is protesting and filing a lawsuit. The text is not in English, use translate: https://translate.google.pl/translat...gle&edit-text=
-
Originally posted by mmstick View Post
Decompressing packages in the background, in parallel, isn't that difficult of a change to make. I'd give it a day of work.
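To illustrate what that change could look like, here is a minimal sketch (not Canonical's or apt's actual code) that decompresses a batch of already-downloaded zstd payloads with a small worker pool by shelling out to the zstd CLI. The downloads/ directory and the plain .tar.zst naming are assumptions for the example; in a real .deb the compressed data.tar sits inside an ar container.

```python
# Minimal sketch (assumed layout, not apt's real pipeline): decompress several
# downloaded .tar.zst payloads in parallel so installation of one package can
# proceed while the next ones are already being unpacked in the background.
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def decompress(archive: Path) -> Path:
    """Turn foo.tar.zst into foo.tar next to it and return the new path."""
    out = archive.with_suffix("")                     # strip the trailing .zst
    subprocess.run(["zstd", "-d", "-q", "-f", "-o", str(out), str(archive)],
                   check=True)
    return out

archives = sorted(Path("downloads").glob("*.tar.zst"))   # hypothetical directory
with ThreadPoolExecutor(max_workers=4) as pool:
    # zstd does the heavy lifting in child processes, so threads are enough here.
    for tarball in pool.map(decompress, archives):
        print("ready to install:", tarball)
```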
-
Originally posted by nils_ View Post
The problem is that tar isn't a very good format to extract in parallel, since it's made for tape devices / streaming, which means you don't know ahead of time where the individual files start and end.
But tbh, the talk is about decompressing the next deb (or several) while installing the current one. Taken to the extreme, this could already start during downloading. As long as the memory is there, this shouldn't be a big problem.
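Taking that pipelining idea literally, here is a toy sketch with one queue per hand-off: downloads feed a decompression stage, which feeds an installation stage, so the steps overlap instead of running strictly one after another. The three worker functions are placeholders, not real apt/dpkg calls.

```python
# Toy pipeline sketch: package N-1 installs while N is decompressed and N+1
# downloads. download_pkg / decompress_pkg / install_pkg are stand-ins only.
import queue
import threading

def download_pkg(name):
    print("downloaded", name)
    return name

def decompress_pkg(pkg):
    print("decompressed", pkg)
    return pkg

def install_pkg(pkg):
    print("installed", pkg)

def stage(work, inbox, outbox):
    """Pull items from inbox, process them, pass results on; stop on None."""
    while True:
        item = inbox.get()
        if item is None:                      # poison pill: forward it and quit
            if outbox is not None:
                outbox.put(None)
            return
        result = work(item)
        if outbox is not None:
            outbox.put(result)

to_decompress = queue.Queue(maxsize=2)        # bounded queues cap memory use
to_install = queue.Queue(maxsize=2)
workers = [
    threading.Thread(target=stage, args=(decompress_pkg, to_decompress, to_install)),
    threading.Thread(target=stage, args=(install_pkg, to_install, None)),
]
for w in workers:
    w.start()
for name in ["pkg-a", "pkg-b", "pkg-c"]:      # producer: downloads feed the pipeline
    to_decompress.put(download_pkg(name))
to_decompress.put(None)
for w in workers:
    w.join()
```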
-
Originally posted by F.Ultra View Post
Ah I see. But aren't such apps usually distributed as a large binary bash file (at least some closed-source disk array utilities I've used over the years have been)?
In fact, if I end up with an app like that (e.g. the games from GOG), the first thing I do is make a deb out of it.
-
Originally posted by discordian View Post
Tar has no compression at all; compressors like xz could already use blocks that can be decompressed in parallel (using those independent blocks would reduce compression, however).
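For reference, this is roughly how that independent-blocks trade-off looks when driving the xz CLI from a short script. Compressing with threads (or an explicit --block-size) cuts the stream into independently compressed blocks at a small cost in ratio; decompressing those blocks in parallel additionally needs a fairly recent xz release (5.4 or newer, as far as I know). File names here are made up.

```python
# Sketch of the independent-blocks idea using the xz CLI. xz only splits the
# stream into independent blocks when compressing with threads or an explicit
# --block-size; threaded decompression of such files needs a recent xz.
import subprocess

src = "data.tar"                     # hypothetical tar archive to compress

# Compress with 4 threads; the stream is cut into independent 16 MiB blocks,
# which costs a little compression ratio but enables parallel decompression.
subprocess.run(["xz", "-T4", "--block-size=16MiB", "-k", "-f", src], check=True)

# Decompress using all available threads; this falls back to a single thread
# if the file has only one block or the installed xz cannot parallelize it.
subprocess.run(["xz", "-d", "-T0", "-k", "-f", src + ".xz"], check=True)
```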
-
Originally posted by nils_ View Post
Yes, but the problem is that you can't determine which compressed blocks form a file, so you need to merge the results back in RAM and then extract from the tar.
The only issue is that you need the additional space in a RAM disk, which won't be a problem with most packages.
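Spelled out, the "extra space in RAM" variant being described could look like this sketch: decompress the payload to a tmpfs path (here /dev/shm, as an assumption) so the intermediate tar lives in memory, then extract it normally and drop the copy. Paths and names are hypothetical.

```python
# Sketch of "decompress into RAM, then untar": write the decompressed tar to a
# tmpfs path so the extra copy lives in memory, then let tarfile walk it.
import subprocess
import tarfile
from pathlib import Path

archive = Path("package-data.tar.zst")        # hypothetical payload
ram_copy = Path("/dev/shm") / archive.stem    # "package-data.tar" in tmpfs

subprocess.run(["zstd", "-d", "-q", "-f", "-o", str(ram_copy), str(archive)],
               check=True)
with tarfile.open(ram_copy) as tar:
    tar.extractall("extracted")               # a real dpkg would unpack to /
ram_copy.unlink()                             # free the RAM-disk space again
```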
-
Originally posted by discordian View Post
Any high compression format merges files and compresses a whole stream of them, typically ignoring file boundaries for blocks as well. And as long as you gotta uncompress all files anyway, I don't know what you're getting at with this comment.
The only issue is that you need the additional space in a RAM disk, which won't be a problem with most packages.
-
Originally posted by nils_ View Post
What I'm getting at is that to speed up the whole process further, you would probably need to be able to also write the output files in parallel, which isn't possible with tar.
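To make that concrete: writing members out in parallel only becomes feasible once the tar is fully decompressed and seekable, because it needs an up-front indexing pass over the whole archive to learn where every file sits, which is exactly what a streamed tar doesn't give you. A rough sketch under that assumption (hypothetical paths, one handle per worker since TarFile objects aren't safe to share across threads):

```python
# Rough sketch: parallel writes require an index pass over a fully decompressed,
# seekable tar first; a streamed tar can't provide this. Paths are hypothetical.
import tarfile
from concurrent.futures import ThreadPoolExecutor

TAR_PATH = "package-data.tar"            # assumed: payload already decompressed

with tarfile.open(TAR_PATH) as index:
    members = index.getnames()           # the up-front index pass

def extract_member(name):
    # Each worker opens its own handle: a TarFile keeps a single file position,
    # so sharing one object across threads would not be safe.
    with tarfile.open(TAR_PATH) as tar:
        tar.extract(name, path="extracted")

with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(extract_member, members))
```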