Zlib "Next Generation" Preparing Massive Decompression Speed-Up

  • #21
    Originally posted by avis View Post
    Nice improvements, but the core algorithm is outdated and ZSTD nowadays runs circles around zlib. It has basically made zlib obsolete.

    I wonder if the PNG standard could add ZSTD compression. That would be fantastic.

    Actually someone did that over five years ago, but it hasn't gained any traction: https://github.com/catid/Zpng
    I might be wrong, but isn't this the Next Gen algorithm? It is still based on zlib, but updated to take advantage of modern CPU instructions like SIMD. It is not backwards compatible with the original format, but is meant to be a drop-in code replacement for older projects that already use zlib.

    The obvious advantage is that a developer can drop this in, run it on all their files, and speed up their existing programs without having to learn anything new.

    The disadvantage is that they may miss out on the better compression offered by newer algorithms, but the time saved by not integrating and troubleshooting a new library can be attractive to companies.

    So who might use this? Studios releasing updates to games, companies with custom zlib-compressed file formats, and so on.
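
    As a rough illustration of the "drop-in" idea (just a minimal sketch, not taken from the zlib-ng docs): the code below uses only the standard zlib.h one-shot API, so it builds and runs unchanged whether it is linked against stock zlib or against zlib-ng built in zlib-compat mode.

    Code:
    /* Minimal sketch: plain zlib API usage; works with any libz-compatible library. */
    #include <stdio.h>
    #include <string.h>
    #include <zlib.h>

    int main(void)
    {
        const Bytef src[] = "hello hello hello hello hello";
        uLong srcLen = (uLong)sizeof(src);

        Bytef comp[128];
        uLongf compLen = sizeof(comp);
        if (compress2(comp, &compLen, src, srcLen, Z_BEST_SPEED) != Z_OK)
            return 1;

        Bytef out[128];
        uLongf outLen = sizeof(out);
        if (uncompress(out, &outLen, comp, compLen) != Z_OK)
            return 1;

        /* zlibVersion() reports which implementation is actually loaded. */
        printf("zlib version: %s, round-trip ok: %d\n",
               zlibVersion(), memcmp(src, out, srcLen) == 0);
        return 0;
    }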



    • #22
      Too little too late. Literally everything is moving to ZSTD now.



      • #23
        Originally posted by dragorth View Post

        I might be wrong, but isn't this the Next Gen algorithm? It is still based on zlib, but updated to take advantage of modern CPU instructions like SIMD. It is not backwards compatible with the original format, but is meant to be a drop-in code replacement for older projects that already use zlib.

        The obvious advantage is that a developer can drop this in, run it on all their files, and speed up their existing programs without having to learn anything new.

        The disadvantage is that they may miss out on the better compression offered by newer algorithms, but the time saved by not integrating and troubleshooting a new library can be attractive to companies.

        So who might use this? Studios releasing updates to games, companies with custom zlib-compressed file formats, and so on.
        AFAIK the format is exactly the same as zlib's, so it is 100% compatible. It is "just" a fork of zlib with all the performance patches added in and a lot of old cruft for long-dead systems removed.

        edit: yes, found this on their GitHub:

        The compressed output stream that zlib-ng produces is compatible with the DEFLATE, GZIP and ZLIB formats. Zlib-ng uses different deflate algorithms, hashing methods, matching logic, and configuration settings than zlib, so the output will likely not be the same. So the difference in binary output between zlib-ng and zlib is expected.
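
        To illustrate that format point with a minimal sketch (gzip_buffer() is a hypothetical helper, not something from the zlib-ng repo): the standard zlib streaming API emits a GZIP-wrapped stream when you pass windowBits = 15 + 16 to deflateInit2(), and any DEFLATE/GZIP reader can consume the result no matter whether zlib or zlib-ng produced the bytes.

        Code:
        /* Sketch: produce a gzip-format stream via the standard zlib API.
           gzip_buffer() is a hypothetical helper name, not part of zlib. */
        #include <string.h>
        #include <zlib.h>

        static long gzip_buffer(const unsigned char *in, uLong inLen,
                                unsigned char *out, uLong outLen)
        {
            z_stream s;
            memset(&s, 0, sizeof(s));
            if (deflateInit2(&s, Z_DEFAULT_COMPRESSION, Z_DEFLATED,
                             15 + 16 /* gzip wrapper */, 8, Z_DEFAULT_STRATEGY) != Z_OK)
                return -1;

            s.next_in  = (Bytef *)in;  s.avail_in  = (uInt)inLen;
            s.next_out = out;          s.avail_out = (uInt)outLen;

            int rc = deflate(&s, Z_FINISH);  /* one shot: assumes out is large enough */
            long produced = (rc == Z_STREAM_END) ? (long)s.total_out : -1;
            deflateEnd(&s);
            return produced;
        }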


        Originally posted by rmfx View Post
        Great project, however the code is split into way too many small files imho.
        Why would that be a problem? It is objectively the best way to manage a project.
        Last edited by F.Ultra; 28 April 2023, 05:15 PM.



        • #24
          Nice to know about this; I may actually have a use case for it at work: an archive with millions of S3 storage objects, each of which is zlib-compressed. While there is no business case to read, re-compress, and re-write those objects, lowering CPU cost and decompression time whenever those long-term archived objects are read is a welcome improvement!
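
          For that kind of read path, a rough sketch (assuming the objects are plain zlib-wrapped streams already downloaded into memory; inflate_object() is a hypothetical name) would just be ordinary inflate() calls, which get faster for free once zlib-ng is the libz actually being loaded.

          Code:
          /* Sketch: inflate a zlib-wrapped object body held in memory. */
          #include <string.h>
          #include <zlib.h>

          static long inflate_object(const unsigned char *body, uLong bodyLen,
                                     unsigned char *out, uLong outLen)
          {
              z_stream s;
              memset(&s, 0, sizeof(s));
              if (inflateInit(&s) != Z_OK)
                  return -1;

              s.next_in  = (Bytef *)body;  s.avail_in  = (uInt)bodyLen;
              s.next_out = out;            s.avail_out = (uInt)outLen;

              int rc = inflate(&s, Z_FINISH);  /* whole object in one pass */
              long produced = (rc == Z_STREAM_END) ? (long)s.total_out : -1;
              inflateEnd(&s);
              return produced;
          }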



          • #25
            Originally posted by archkde View Post

            How does it compare to lossless JPEG XL?
            As JPEG XL was initially backed by Google, it uses Brotli for compression, so the comparison there would be between Brotli and gzip.



            • #26
              Originally posted by anarki2 View Post
              Too little too late. Literally everything is moving to ZSTD now.
              As people noted here before, that does not change the fact that zillions of archives are still gzip/zlib compressed and will benefit from faster decompression, even if only to convert them to zstd. That conversion will of course probably not happen; not everybody has that much time to kill for so little gain.



              • #27
                Originally posted by stormcrow View Post

                 It won't even gain a lot of user-facing use, even in the open source world, until the tar programs natively support piping data streams to zstd the same as they do for bzip2, xz, & gzip.
                 tar does natively support piping data streams to zstd; this is what appears in the help output for tar 1.34:


                Code:
                 Compression options:

                   -a, --auto-compress    use archive suffix to determine the compression program
                   -I, --use-compress-program=PROG   filter through PROG (must accept -d)
                   -j, --bzip2            filter the archive through bzip2
                   -J, --xz               filter the archive through xz
                       --lzip             filter the archive through lzip
                       --lzma             filter the archive through lzma
                       --lzop             filter the archive through lzop
                       --no-auto-compress do not use archive suffix to determine the compression program
                       --zstd             filter the archive through zstd
                   -z, --gzip, --gunzip, --ungzip    filter the archive through gzip
                   -Z, --compress, --uncompress      filter the archive through compress
                 EDIT: just checked, and bsdtar/libarchive also supports piping to zstd natively if libzstd is installed when it is compiled.
                Last edited by silverhikari; 29 April 2023, 03:32 AM.



                • #28
                  Originally posted by silverhikari View Post

                  tar does natively support piping data streams to zstd; this is what appears in the help output for tar 1.34:


                  Code:
                  Compression options:

                    -a, --auto-compress    use archive suffix to determine the compression program
                    -I, --use-compress-program=PROG   filter through PROG (must accept -d)
                    -j, --bzip2            filter the archive through bzip2
                    -J, --xz               filter the archive through xz
                        --lzip             filter the archive through lzip
                        --lzma             filter the archive through lzma
                        --lzop             filter the archive through lzop
                        --no-auto-compress do not use archive suffix to determine the compression program
                        --zstd             filter the archive through zstd
                    -z, --gzip, --gunzip, --ungzip    filter the archive through gzip
                    -Z, --compress, --uncompress      filter the archive through compress
                  Interesting that there's LZOP and ZSTD but no LZ4.



                  • #29
                    Higher zlib- and gzip-compatible compression ratios are offered by libdeflate. OpenWrt is using it for some devices; patches to add it to more devices are welcome, after testing of course.
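
                    For reference, a minimal sketch of what using libdeflate looks like (assuming its documented C API; gzip_with_libdeflate() is a hypothetical helper name). Level 12 typically trades extra CPU time for a somewhat smaller gzip-compatible output than zlib's level 9.

                    Code:
                    /* Sketch: gzip a buffer with libdeflate at its maximum level. */
                    #include <stdlib.h>
                    #include <libdeflate.h>

                    static void *gzip_with_libdeflate(const void *in, size_t inLen, size_t *outLen)
                    {
                        struct libdeflate_compressor *c = libdeflate_alloc_compressor(12);
                        if (!c)
                            return NULL;

                        size_t bound = libdeflate_gzip_compress_bound(c, inLen);
                        void *out = malloc(bound);
                        if (out)
                            *outLen = libdeflate_gzip_compress(c, in, inLen, out, bound);

                        libdeflate_free_compressor(c);
                        return out;  /* *outLen == 0 means the output did not fit */
                    }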



                    • #30
                      Originally posted by dwagner View Post
                      Nice to know about this; I may actually have a use case for it at work: an archive with millions of S3 storage objects, each of which is zlib-compressed. While there is no business case to read, re-compress, and re-write those objects, lowering CPU cost and decompression time whenever those long-term archived objects are read is a welcome improvement!
                      I can happily report that it works wonders as a drop-in replacement: just compile with --zlib-compat, then point the libz.so and libz.so.1 symlinks at libz.so.1.2.13.zlib-ng, and every old application will just use this new library instead, with no recompile or anything needed.

