FFmpeg Lands JPEG-XL Support


  • skeevy420
    replied
    Originally posted by cl333r View Post

    That's cool, but I wonder if any software is actually creating, or would want to create, images this large? The Hubble telescope probably creates a sequence of images, and when viewing they're glued together with software for dynamic interaction (scrolling, zooming) to save RAM and whatnot. I imagine 99.9999% of programmers would agree that you shouldn't store a 1Bx1B image, as that would trash your resources even on a supercomputer, unless you have a really good reason not to use a tiled image. And what reason/scenario would there be to have a 1Bx1B image? (I'm not arguing, just trying to figure out a legit use case.)
    Hey, I'm wondering that too -- both whether that kind of software exists and what you'd take a picture of. A giant panorama of the universe? DNA and cells? Hardware limitations mean it'll be made out of stitched-together images... I don't think there's a 1Bn MP camera.

    There is no "probably": the Hubble takes multiple black-and-white photos at various wavelengths between UV and IR, stacks them, and then gets false colorization in post-processing... with upwards of 1 million seconds of exposure time if you add all the stacks together. I'm not really sure how the astro-scientists view and manipulate the individual shots or do the post-processing. I'm just getting into that hobby myself -- like, still buying gear to get into it -- currently debating between an electronic focuser and a focal reducer for my C8 SCT.



  • cl333r
    replied
    Originally posted by skeevy420 View Post
    WebP is 16Kx16K 14 bit, AVIF* is 7,680 x 4,320 12 bit, and JPEG XL is 1Bx1B 24 bit. That's K as in thousand and B as in billion.
    That's cool, but I wonder if any software is actually creating, or would want to create, images this large? The Hubble telescope probably creates a sequence of images, and when viewing they're glued together with software for dynamic interaction (scrolling, zooming) to save RAM and whatnot. I imagine 99.9999% of programmers would agree that you shouldn't store a 1Bx1B image, as that would trash your resources even on a supercomputer, unless you have a really good reason not to use a tiled image. And what reason/scenario would there be to have a 1Bx1B image? (I'm not arguing, just trying to figure out a legit use case.)



  • cl333r
    replied
    Originally posted by Joe2021 View Post
    I converted a huge JPG archive to JPEG XL and it's now reduced to 82% of the former size.
    The point is: You can convert it back with the same tool and you will get the bit-identical original JPEG! Same sha256 fingerprint.
    What was the exact tool/command line that you used?



  • billyswong
    replied
    Now the question is: when will Firefox support it outside Nightly? They implemented AVIF fast, but only the static image part, which is quite pointless since AVIF's strength is in replacing animated GIFs. Then they wasted time arguing whether JPEG-XL is worth implementing and got stuck there.



  • Joe2021
    replied
    Originally posted by coder View Post
    Any compression gains you get would be small or negligible, without modifying the content. Lossless conversion means you're not changing the block structure, base DCT transform, or quantization coefficients. That only leaves entropy & bitstream encoding, and there are no huge savings there.
    Wrong!

    I have to admit that I made the very same prediction, but reality proved me wrong. I converted a huge JPG archive to JPEG XL and it's now reduced to 82% of the former size.

    The point is: You can convert it back with the same tool and you will get the bit-identical original JPEG! Same sha256 fingerprint.

    So, as this is a bidirectional lossless conversion, there is no reason to hesitate to take advantage of it; you can revert the decision later if you regret it for some reason. No-brainer!

    Try it for yourself!
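
    If you want to reproduce it, here is a rough sketch of the round trip. This assumes the reference cjxl/djxl tools from libjxl are installed and that their current default of lossless JPEG transcoding (keeping the JPEG reconstruction data) applies; "photo.jpg" is just a placeholder:

    import hashlib
    import subprocess

    def sha256(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # cjxl transcodes JPEG input losslessly by default, keeping the
    # reconstruction data needed to restore the original file.
    subprocess.run(["cjxl", "photo.jpg", "photo.jxl"], check=True)

    # Asking djxl for a .jpg output rebuilds the original JPEG bitstream.
    subprocess.run(["djxl", "photo.jxl", "restored.jpg"], check=True)

    print(sha256("photo.jpg") == sha256("restored.jpg"))  # should print True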

    Big kudos to the JPEG-XL-People!



  • skeevy420
    replied
    Originally posted by arun54321 View Post

    What do you mean? jxl is not computationally heavy to encode.
    I just mean in regard to max resolutions and bit depth. WebP is 16Kx16K 14 bit, AVIF* is 7,680 x 4,320 12 bit, and JPEG XL is 1Bx1B 24 bit. That's K as in thousand and B as in billion. Full-resolution JXL is just magnitudes larger than AVIF. That's what I mean when I say that you have to be using some pretty hardcore computing devices to fully utilize JXL compared to WebP or AVIF... the kinds of computing and imaging devices in use by scientists and rich professionals with very high-end gear.

    *AVIF can max out at 65Kx65K using multiple AVIF images stitched together in a container and is known for having artifacts at the seams -- it's worth mentioning.
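
    To put a number on "magnitudes larger", here is some back-of-the-envelope arithmetic for the raw, uncompressed pixel data at those maximums (assuming 3 channels at the quoted per-channel bit depths, and taking 1B as 10^9 -- purely illustrative, not any tool's actual memory use):

    def raw_size_bytes(width, height, channels, bits_per_channel):
        # Uncompressed pixel data only; no headers, metadata, or tiling.
        return width * height * channels * bits_per_channel // 8

    print(raw_size_bytes(16_000, 16_000, 3, 14) / 2**30, "GiB")   # WebP max: ~1.25 GiB
    print(raw_size_bytes(7_680, 4_320, 3, 12) / 2**30, "GiB")     # AVIF single item: ~0.14 GiB
    print(raw_size_bytes(10**9, 10**9, 3, 24) / 2**60, "EiB")     # JPEG XL max: ~7.8 EiB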



  • cl333r
    replied
    Originally posted by matel View Post
    WebP has been a relief, but it does not support HDR, and compression could be better
    Fair enough. FWIW, WebP2 [1] has these goals, among other things:
    - more efficient lossy compression (~30% better than WebP, as close to AVIF as possible)
    - full 10bit architecture (HDR10)

    [1] https://chromium.googlesource.com/codecs/libwebp2/



  • curfew
    replied
    Originally posted by coder View Post
    Any compression gains you get would be small or negligible, without modifying the content. Lossless conversion means you're not changing the block structure, base DCT transform, or quantization coefficients. That only leaves entropy & bitstream encoding, and there are no huge savings there.
    Instead of spewing technical jargon and meaningless crap, you could just read the JPEG XL FAQ and see how they spin it in there:

    "The JPEG image is based on the discrete cosine transform (DCT) of 8x8 with fixed quantization tables. JPEG XL offers a much more robust approach, including variable DCT sizes ranging from 2x2 to 256x256 as well as adaptive quantization, of which the simple JPEG DCT is merely a particular case."

    "As a result, you do not need to decode JPEGs to pixels to convert them to JPEG XLs. Rather than relying on the JPEG internal representation (DCT coefficients), utilize JPEG XL directly. Even though only the subset of JPEG XL that corresponds to JPEG is used, the converted images would be 20 percent smaller."

    Interestingly enough, they seem to directly disagree with most of your assessments.



  • matel
    replied
    Originally posted by cl333r View Post

    It's more like WebP is much better than JPEG/PNG/GIF, it's backed by Google, and it's supported by all relevant browsers, whereas all the other new image formats aren't as open, aren't backed or supported enough, or are in their infancy or whatnot. That's why it's reasonable to believe that WebP is the future (IIRC WebP2 is in the works), but it doesn't mean other new image formats won't coexist.
    This is what the project says in their FAQ:

    WebP has been a relief, but it does not support HDR, and compression could be better. AVIF has been a significant step into the future, but it lacks efficiency in encoding images. Both WebP and AVIF have been derived from video formats, while JPEG XL has an image architecture from the start in mind.
    Note also the big involvement of Google in the creation of JPEG XL:

    Google's Pik proposal and Cloudinary's FUIF proposal served as the foundation. All three companies have authors and developers responsible for creating JPEG XL.
    For my own websites I already made the decision to not invest my time in WebP or AVIF. I expect the time to look into this will be in 2023 or 2024. I expect a very fast adoption of this new format once the tools are ready. Maybe we'll even see built-in support in Android later this year.



  • Quackdoc
    replied
    Originally posted by OneTimeShot View Post
    Shouldn’t touch this until we know the patent situation.
    The reference software is licensed under Apache 2.0, which includes a royalty-free patent grant, so assuming no patent trolls crop up, there shouldn't be an issue.

