FFmpeg Lands JPEG-XL Support
Originally posted by Quackdoc View Post
This is plain wrong. You can losslessly compress JPEGs to varying degrees, sometimes 5%, sometimes 50%. cjxl is open source, so you can test it directly, or build FFmpeg if you wish to try it. I have a 23 MB JPEG compressed losslessly to 15 MB without even using a too-slow preset. I would consider that "huge savings".
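For the sizes quoted there, the saving works out as follows (a quick check using the post's round numbers):

```python
before_mb, after_mb = 23, 15          # JPEG -> JXL sizes reported above
saving = 100 * (1 - after_mb / before_mb)
print(f"{saving:.0f}% smaller")       # → 35% smaller
```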
Originally posted by Quackdoc View Post
some more real nice goodies like progressive rendering.
Originally posted by arzeth View Post
https://github.com/archlinux/svntogi...s/imlib2/trunk
Since yesterday, imlib2 in [testing] now supports JPEG-XL too! Now I am able to use feh (an image viewer) to view .jxl!
I've just run a lossless-compression comparison on a very highly detailed 1280x720 2D image (non-overclocked Ryzen 5 2600; all of these packages recompiled with my CFLAGS):
Code:
optipng -o7 a.png                                        # → 1 170 748 bytes, ~2 minutes IIRC
avifenc -j 12 -s 0 --lossless a.png losles.avif          # → 1 072 339 bytes, 13 327 ms
cwebp -mt -m 6 -q 100 -lossless on a.png -o losles.webp  # → 937 850 bytes, 6 432 ms
cwp2 -q 100 -effort 9 a.png -o losles.wp2                # → 869 486 bytes, 10 minutes, only 1 thread used; just compiled from git
cjxl a.png -q 100 -e 8 losles8.jxl                       # → 851 817 bytes, 5 300 ms, only 2 threads?
cjxl a.png -q 100 -e 9 losles9.jxl                       # → 828 897 bytes, 50 748 ms (9.5x slower)
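Taking the byte counts above at face value, the relative savings can be tallied with a quick script (a sketch; the sizes and labels are copied from the results above, with the optipng output as the baseline):

```python
# Output sizes reported in the benchmark above, in bytes.
sizes = {
    "optipng -o7":        1_170_748,
    "avifenc --lossless": 1_072_339,
    "cwebp -lossless":      937_850,
    "cwp2 -q 100":          869_486,
    "cjxl -e 8":            851_817,
    "cjxl -e 9":            828_897,
}

baseline = sizes["optipng -o7"]
for name, size in sizes.items():
    saving = 100 * (1 - size / baseline)
    print(f"{name:<20} {size:>9,} bytes  {saving:5.1f}% smaller than the PNG")
```

For this image, cjxl -e 9 comes out roughly 29% smaller than the optimized PNG.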
Originally posted by curfew View Post
Instead of spewing technical jargon and meaningless crap,
What I said is technically accurate, and is based on having spent more time inside IJG and libjpeg-derived source code than probably anyone else in this thread.
Originally posted by curfew View Post
you could just read the JPEG XL FAQ and see how they spin it in there:
"The JPEG image is based on the discrete cosine transform (DCT) of 8x8 with fixed quantization tables. JPEG XL offers a much more robust approach, including variable DCT sizes ranging from 2x2 to 256x256 as well as adaptive quantization, of which the simple JPEG DCT is merely a particular case."
Originally posted by curfew View Post
"As a result, you do not need to decode JPEGs to pixels to convert them to JPEG XLs. Rather than relying on the JPEG internal representation (DCT coefficients), utilize JPEG XL directly.
Originally posted by curfew View Post
Even though only the subset of JPEG XL that corresponds to JPEG is used, the converted images would be 20 percent smaller."
The way Skeevy420 was talking about it seemed to give the impression that lossless conversion from JPEG would give the full advantages advertised about the format.
Originally posted by Joe2021 View Post
I have to admit that I made the very same prediction, but reality proved me wrong. I converted a huge JPG archive to JPEG XL and it's now reduced to 82% of the former size.
Note that "reduced to 82% of the former size" is an 18% reduction, which is right in line with the ~20% savings quoted in their "JPEG lossless conversion" FAQ. So I'm just trying to figure out what, if anything, makes your case special.
Originally posted by Joe2021 View Post
The point is: You can convert it back with the same tool and you will get the bit-identical original JPEG! Same sha256 fingerprint.
Originally posted by Joe2021 View Post
So, as this is a bidirectional lossless conversion
Originally posted by Joe2021 View Post
Big kudos to the JPEG-XL people!
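That round-trip fingerprint check is easy to script; a minimal sketch (the file names and the cjxl/djxl invocations are illustrative, not from the post):

```python
import hashlib

def sha256(path):
    """Hash a file in chunks so large archives aren't loaded into RAM at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Round trip: original.jpg -> out.jxl (cjxl) -> roundtrip.jpg (djxl),
# then compare fingerprints:
# assert sha256("original.jpg") == sha256("roundtrip.jpg")
```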
Originally posted by coder View Post
That's merely a "computationally cheap" conversion. Not doing a full decode doesn't make it inherently lossless. Even if it's reversible, that still doesn't make it lossless. For it to be truly lossless, it would have to decode to the exact same pixel values, which isn't going to happen if you're changing the DCT size and colorspace.
Code:
/tmp ❯ cjxl john-french-SNW4DWZEy8I-unsplash.jpg test.jxl
JPEG XL encoder v0.6.1 [AVX2,SSE4,SSSE3,Scalar]
Read 1920x1280 image, 71.7 MP/s
Encoding [Container | JPEG, lossless transcode, squirrel | JPEG reconstruction data], 2 threads.
Compressed to 236958 bytes (0.771 bpp).
1920 x 1280, 27.71 MP/s [27.71, 27.71], 1 reps, 2 threads.
Including container: 237445 bytes (0.773 bpp).
/tmp ❯ djxl test.jxl decoded.jpg
JPEG XL decoder v0.6.1 [AVX2,SSE4,SSSE3,Scalar]
Read 237445 compressed bytes.
Reconstructed to JPEG.
1920 x 1280, 21.17 MP/s [21.17, 21.17], 2.34 MB/s [2.34, 2.34], 1 reps, 2 threads.
Allocations: 349 (max bytes in use: 4.304357E+07)
/tmp ❯ md5sum decoded.jpg
9d2f4ca592f572678a7442fbb2b7617f  decoded.jpg
/tmp ❯ md5sum john-french-SNW4DWZEy8I-unsplash.jpg
9d2f4ca592f572678a7442fbb2b7617f  john-french-SNW4DWZEy8I-unsplash.jpg
Originally posted by Quackdoc View Post
who knows, not me for sure. but the potential is there. 5 years from now, 10 years. how old is jpeg now, 20 years old? I think this is a smart "future proofing" step.
1Bx1B would be enough to hold an image of the Earth's hemisphere at about 53 pixels/m, which I think is significantly higher-res than the best-quality data you'd find on Google Earth. And outside heavily populated areas or other points of interest, most data on Google Earth is far lower-res.
It would also occupy about 3 Exabytes of memory, if decoded to RGB @ 8bpc. For it to be a practical on-disk representation, the file format would need some support for tiles and indexing. Otherwise, you're better off storing tiles as separate files.
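The 3 EB figure checks out; a quick sketch of the arithmetic, assuming 3 bytes per pixel for 8bpc RGB with no alpha:

```python
side = 10**9                 # a 1B x 1B image
pixels = side * side         # 10**18 pixels
decoded_bytes = pixels * 3   # RGB at 8 bits per channel, no alpha
exabytes = decoded_bytes / 10**18
print(exabytes)              # → 3.0
```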
Originally posted by Quackdoc View Post
for instance kickstarter had to override a change from their CDN to use gifs instead of avifs since it broke firefox compat.
Originally posted by arun54321 View Post
[the same cjxl / djxl / md5sum transcript shown in full above]
Also, either I'm misreading that output, or you didn't give us enough information to derive the compression ratio of the .jxl file over the .jpg: the transcript shows the .jxl size (237445 bytes) but not the original .jpg's size.
Comment
-
Originally posted by coder View Post
Okay, you missed my point. What I meant is that you should then decode both the .jpg and .jxl versions to something like BMP and compare those.
Also, either I'm misreading that output or you didn't tell us enough information to derive the compression ratio of the .jxl file, over the .jpg.
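A minimal sketch of that pixel-level comparison, assuming both versions have first been decoded to binary PPM with command-line tools (e.g. djxl for the .jxl and djpeg for the .jpg; the file names here are placeholders):

```python
def ppm_pixels(path):
    """Read a binary (P6) PPM and return (width, height, raw RGB payload).

    Assumes the simple three-line header ("P6", "width height", "maxval")
    with no comment lines, which is what djxl and djpeg emit.
    """
    with open(path, "rb") as f:
        magic = f.readline().strip()
        assert magic == b"P6", f"not a binary PPM: {magic!r}"
        width, height = map(int, f.readline().split())
        int(f.readline())              # maxval, e.g. 255; not needed further
        return width, height, f.read(width * height * 3)

# Pixel-exact comparison of the two decodes:
# print(ppm_pixels("from_jxl.ppm") == ppm_pixels("from_jpg.ppm"))
```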