Zlib "Next Generation" Preparing Massive Decompression Speed-Up
Originally posted by Anux:
Hu, that doesn't sound logical at all. I just compressed a random PNG (15.7 MB) to 7z (14.9 MB) and a random JPEG from 103 KB to 76 KB. You could do such things yourself in a few seconds before you post wild claims on the internet.

Well, especially for PNG, that's expected. PNG is simply not an optimal way to compress images, and almost no PNG-producing tool actually compresses PNG efficiently or uses the PNG features that would make it efficient. Try it yourself: pick a collection of PNGs and run oxipng on them; you'll save between 20% and 50% of the size per PNG, and both before and after it is still a PNG.
Even common software may break if you start using PNG features to compress better, because no one ever uses them, so no one really tests them.
So, yes, you can compress something that is already compressed, but that just means it was not really compressed in the first place (like 99.999% of the PNGs in the world, which are not really compressed).
And yes, it makes sense to compress at the file system level even when storing compressed files; I do it myself anyway. But the expectation should be that what is already compressed cannot be compressed much further. The fact that compressed files can be recompressed and sometimes save a lot of space is not logical at all, because it means those compressed files weren't really compressed to begin with.
Originally posted by illwieckz:
Well, especially for PNG, that's expected. PNG is simply not an optimal way to compress images […]
Applications that create PNG files choose a compression level that gives an acceptable response time and a decent compression ratio.
Not the best compression possible, since that would be too slow.
And it is not true that PNG files will break if you use advanced optimization options. Almost all software reads and writes PNG via libpng; nobody writes their own PNG implementation. You can optimize a PNG with optipng, using its advanced optimization features, and the resulting file is still a valid PNG that will be read without problems by all software that uses libpng, and that is practically all software that supports PNG.
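To make that time/size compromise concrete, here is a minimal sketch (an editor's illustration, not code from the thread) that compresses the same synthetic buffer with zlib's compress2() at levels 1 and 9 and reports size and wall time; the payload and numbers are illustrative only:

Code:
/* Hedged sketch: the same buffer compressed with zlib's compress2() at
 * level 1 (fast) and level 9 (small), to show the time/size compromise.
 * Build with: cc demo.c -lz
 */
#include <stdio.h>
#include <time.h>
#include <zlib.h>

static unsigned char data[1 << 20];   /* 1 MiB of text-like input */
static unsigned char out[1 << 21];    /* comfortably above compressBound() */

int main(void) {
    for (size_t i = 0; i < sizeof data; i++)
        data[i] = "the quick brown fox jumps over the lazy dog "[i % 44];

    for (int level = 1; level <= 9; level += 8) {   /* level 1, then level 9 */
        uLongf out_len = sizeof out;
        clock_t t0 = clock();
        compress2(out, &out_len, data, sizeof data, level);  /* error check omitted */
        double ms = 1000.0 * (double)(clock() - t0) / CLOCKS_PER_SEC;
        printf("level %d: %lu bytes in %.1f ms\n",
               level, (unsigned long)out_len, ms);
    }
    return 0;
}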
Originally posted by dpeterc:
You fail to understand that compression is a compromise between time and size.
The major part of PNG compression efficiency doesn't come from zlib and its compression level.
It comes from the fact that PNG provides something like 16 different profiles that work better or worse depending on the input. Basically, for the colors you can choose between 1-bit, 2-bit, 4-bit and 8-bit palettes, 1-bit and 8-bit greyscale, and 24-bit RGB, and then for the alpha channel between 1-bit alpha, 8-bit alpha, or none. Just for images with an alpha channel there are 8 combinations, and I haven't counted the ones without. Picking the right profile for the data is not like running zlib with -9, but it is what will give you a 25% or 50% size reduction.
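As a counting aid (an editor's sketch based on the PNG specification, not code from the thread), here are the valid color-type and bit-depth combinations, written with libpng constants:

Code:
/* Hedged counting aid: the valid color-type / bit-depth combinations
 * allowed by the PNG specification, expressed with libpng constants. */
#include <png.h>

struct png_profile { int color_type; int bit_depth; };

static const struct png_profile png_profiles[] = {
    /* greyscale: 5 bit depths */
    { PNG_COLOR_TYPE_GRAY, 1 }, { PNG_COLOR_TYPE_GRAY, 2 },
    { PNG_COLOR_TYPE_GRAY, 4 }, { PNG_COLOR_TYPE_GRAY, 8 },
    { PNG_COLOR_TYPE_GRAY, 16 },
    /* palette (indexed): 4 bit depths */
    { PNG_COLOR_TYPE_PALETTE, 1 }, { PNG_COLOR_TYPE_PALETTE, 2 },
    { PNG_COLOR_TYPE_PALETTE, 4 }, { PNG_COLOR_TYPE_PALETTE, 8 },
    /* truecolor: 2 bit depths */
    { PNG_COLOR_TYPE_RGB, 8 }, { PNG_COLOR_TYPE_RGB, 16 },
    /* greyscale with alpha: 2 bit depths */
    { PNG_COLOR_TYPE_GRAY_ALPHA, 8 }, { PNG_COLOR_TYPE_GRAY_ALPHA, 16 },
    /* truecolor with alpha: 2 bit depths */
    { PNG_COLOR_TYPE_RGB_ALPHA, 8 }, { PNG_COLOR_TYPE_RGB_ALPHA, 16 },
};  /* 15 base combinations, before transparency tricks such as tRNS */

That is 15 base combinations; counting transparency variants such as the tRNS chunk arguably gets you past 16.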
I'm not saying libpng will choose; I'm saying you will choose, as a developer writing software with libpng.
It happens that all those variants are poorly tested. I myself had to fix a program whose PNG support was broken on some of the variants no one uses. 99% of the software out there produces RGBA PNGs even when there is no alpha data to store. I recently identified a bug in the Python Pillow module (the standard Python imaging library) that reads some greyscale PNG variants as 1-bit (black-and-white). That is not some random program no one uses: all Python applications doing PNG with the standard imaging library are affected. But no one cares, because no one produces anything other than RGBA PNGs.
Originally posted by dpeterc:
Applications that create PNG files choose a compression level that gives an acceptable response time and a decent compression ratio.
And it is not true that PNG files will break if you use advanced optimization options.
The resulting file is still a valid PNG, and will be read without problems by all software that uses libpng.
Yes, it's a valid PNG. No, it will not be read without problems by all software that uses libpng. The NetRadiant software I mentioned was using libpng and still had a bug. Why? Because libpng is a micromanaged library. I'm not talking about project management, but about how the developer has to deal with the libpng library in the code they write: one has to write code to configure libpng for this or that PNG profile.
One doesn't do:
Code:
pixmap = libpng.read(filepath);
One does:
Code:
png_read_info( png_ptr, info_ptr );

int bit_depth = png_get_bit_depth( png_ptr, info_ptr );
int color_type = png_get_color_type( png_ptr, info_ptr );

/* Expand greyscale and palette images to RGB. */
if ( color_type == PNG_COLOR_TYPE_GRAY || color_type == PNG_COLOR_TYPE_GRAY_ALPHA ) {
	png_set_gray_to_rgb( png_ptr );
}
else if ( color_type == PNG_COLOR_TYPE_PALETTE ) {
	png_set_palette_to_rgb( png_ptr );
}

/* Expand 1-, 2- and 4-bit greyscale to 8-bit. */
if ( color_type == PNG_COLOR_TYPE_GRAY && bit_depth < 8 ) {
	png_set_expand_gray_1_2_4_to_8( png_ptr );
}

/* Turn tRNS transparency into a real alpha channel, or synthesize one. */
if ( png_get_valid( png_ptr, info_ptr, PNG_INFO_tRNS ) ) {
	png_set_tRNS_to_alpha( png_ptr );
}
else if ( !( color_type & PNG_COLOR_MASK_ALPHA ) ) {
	png_color_16 my_background, *image_background;

	if ( png_get_bKGD( png_ptr, info_ptr, &image_background ) ) {
		png_set_background( png_ptr, image_background, PNG_BACKGROUND_GAMMA_FILE, 1, 1.0 );
	}
	else {
		png_set_background( png_ptr, &my_background, PNG_BACKGROUND_GAMMA_SCREEN, 0, 1.0 );
	}

	png_set_filler( png_ptr, 0xff, PNG_FILLER_AFTER );
}

/* Re-read the image parameters after the transformations above. */
png_read_update_info( png_ptr, info_ptr );

color_type = png_get_color_type( png_ptr, info_ptr );
bit_depth = png_get_bit_depth( png_ptr, info_ptr );

int width = png_get_image_width( png_ptr, info_ptr );
int height = png_get_image_height( png_ptr, info_ptr );

RGBAImage* image = new RGBAImage( width, height );
Choosing the compression level of a PNG is not about turning one knob, it's about turning between 10 and 20 knobs. Most applications don't even turn the first one.
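To illustrate the write side (an editor's hedged sketch, not the thread's code): a complete minimal libpng writer that turns some of those knobs explicitly, setting the zlib level, strategy and row filters, and picking an 8-bit greyscale profile instead of the usual RGBA. The file name and pixel data are invented for the example, and error handling is reduced to the standard setjmp idiom:

Code:
/* Hedged sketch of a libpng writer that actually turns the knobs.
 * Build with: cc writer.c -lpng -lz
 */
#include <stdio.h>
#include <png.h>
#include <zlib.h>   /* for Z_FILTERED */

int main(void) {
    const int width = 64, height = 64;

    FILE* fp = fopen("knobs.png", "wb");
    if (!fp)
        return 1;

    png_structp png_ptr = png_create_write_struct(PNG_LIBPNG_VER_STRING, NULL, NULL, NULL);
    png_infop info_ptr = png_create_info_struct(png_ptr);

    if (setjmp(png_jmpbuf(png_ptr))) {  /* libpng reports errors via longjmp */
        fclose(fp);
        return 1;
    }

    png_init_io(png_ptr, fp);

    /* zlib-side knobs */
    png_set_compression_level(png_ptr, 9);              /* 1 = fast .. 9 = small */
    png_set_compression_strategy(png_ptr, Z_FILTERED);  /* strategy suited to filtered rows */
    png_set_filter(png_ptr, 0, PNG_ALL_FILTERS);        /* try every row filter, keep the best */

    /* the profile knob most applications leave at RGBA: here, 8-bit greyscale */
    png_set_IHDR(png_ptr, info_ptr, width, height, 8, PNG_COLOR_TYPE_GRAY,
                 PNG_INTERLACE_NONE, PNG_COMPRESSION_TYPE_DEFAULT, PNG_FILTER_TYPE_DEFAULT);
    png_write_info(png_ptr, info_ptr);

    png_byte row[64];
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++)
            row[x] = (png_byte)(x ^ y);  /* synthetic greyscale pattern */
        png_write_row(png_ptr, row);
    }

    png_write_end(png_ptr, info_ptr);
    png_destroy_write_struct(&png_ptr, &info_ptr);
    fclose(fp);
    return 0;
}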
Here I produced a list of reference PNG files for the various PNG formats, the ones named `test-*.png`:
UnvanquishedAssets/UnvanquishedTestAssets (a test asset repository for the Unvanquished game project)
There are only 8 of them because I haven't produced the non-alpha variants, so it covers only half of the PNG formats; a complete collection would have 16 of them, if not more in case I missed some. There is not one PNG format, there are at least 16 PNG formats.
Originally posted by ll1025:
… and if it helps, you're probably doing something sub-optimal.
For instance, if you use the fastest zip compression against a text file, you will be leaving some compression headroom on the table.
So in your examples, you probably could have used a higher compression level in PNG and ended up with a file smaller than 14.9 MB, or you could have used a better image format.

Also, if we already have a big list of algorithms that "don't count", how does that still fall under "Data compressed using most compression algos"?
Originally posted by blackshard:
I did a quick benchmark myself: I stripped all metadata out of a JPEG file with GIMP (ICC color profile, thumbnail, EXIF data, etc.), obtaining a 168.6 KB file. After compressing it to .7z, it becomes 168.5 KB.

Originally posted by illwieckz:
So, yes, you can compress something that is already compressed, but that just means it was not really compressed in the first place (like 99.999% of the PNGs in the world, which are not really compressed).

In my language "most" means "all with a few special exceptions", but I'm not a native English speaker.
Originally posted by Anux:
But avis said "Data compressed using most compression algos", not "Data compressed using best possible compression algos with maximum settings".

Anyway, there are usually remaining ways to compress things that are already compressed. For example, a zip archive compresses files separately, so if you put the same file in a zip three times, the file is stored three times, compressed but not deduplicated. If you zip the zip, you can actually deduplicate those compressed files. Other tricks get crazier: setting all files in a zip to the same date and time lets you save more space when compressing the zip with zip, and the best result is obtained if you set the file datetimes within the zip to 0 (the zip epoch) and then zip the zip.
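Here is a minimal sketch of why that deduplication works (an editor's illustration, not from the thread): deflate's 32 KiB window sees the second copy of a compressed member as one long back-reference, so a fake "archive" that stores the same compressed blob twice shrinks dramatically when compressed again. The payload and sizes are synthetic:

Code:
/* Hedged sketch: why zipping a zip with duplicate members saves space.
 * Build with: cc dedup.c -lz
 */
#include <stdio.h>
#include <string.h>
#include <zlib.h>

int main(void) {
    /* A mildly compressible synthetic payload. */
    static unsigned char data[16384];
    for (size_t i = 0; i < sizeof data; i++)
        data[i] = (unsigned char)((i * i) % 251);

    /* First pass: compress the payload once, like a zip member. */
    static unsigned char member[32768];
    uLongf member_len = sizeof member;
    compress2(member, &member_len, data, sizeof data, 6);  /* error checks omitted */

    /* The "archive": the same compressed member stored twice, as a zip
     * would store two identical files without deduplication. */
    static unsigned char archive[65536];
    memcpy(archive, member, member_len);
    memcpy(archive + member_len, member, member_len);
    uLong archive_len = member_len * 2;

    /* Second pass: compress the archive itself; the duplicate member
     * nearly vanishes inside the 32 KiB deflate window. */
    static unsigned char outer[65536];
    uLongf outer_len = sizeof outer;
    compress2(outer, &outer_len, archive, archive_len, 6);

    printf("member: %lu, archive: %lu, recompressed archive: %lu bytes\n",
           (unsigned long)member_len, (unsigned long)archive_len,
           (unsigned long)outer_len);
    return 0;
}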
Originally posted by illwieckz:
… just because 7z compresses better than zlib, not because 7z is better for images …

Only the current best algorithms in their fields can compress without leaving room for further improvement (JXL for lossless images, or PPMd for text files), and that's far from "most". I'm especially not arguing that PNG is the best.