Libjpeg 9 Does Better Lossless JPEG Compression


  • XorEaxEax
    replied
    Originally posted by ssvb View Post
    This seems to be WebP
    About WebP, IIRC it was based upon the VP8 video codec. Now with the new VP9 video codec due out soon(ish), which reportedly brings a ton of improvements, has there been any word on WebP being updated/enhanced to make use of them?

  • ssvb
    replied
    Originally posted by ultimA View Post
    I read through the .doc; the way I see it, they basically took the principle of M/S coding used for stereo audio and adapted it for image coding. The technique is neither new nor high-tech, but kudos to them for realizing its applicability (no other well-known image format has done it, AFAIK).
    This seems to be WebP. Of course, that's if it qualifies as a "well-known image format". But apparently Guido Vollbeding from JPEG 9 actually came up with a somewhat better implementation of this particular filter: http://sourceforge.net/mailarchive/m...sg_id=30353476
    So indeed, he surely deserves credit for that, and we may end up with better open-source image codecs as a result. I just think that JPEG should not be the subject of such experiments. We need a good balance between "stable" (traditional baseline JPEG and PNG formats) and "bleeding edge" (WebP and friends).

  • ultimA
    replied
    I read through the .doc; the way I see it, they basically took the principle of M/S coding used for stereo audio and adapted it for image coding. The technique is neither new nor high-tech, but kudos to them for realizing its applicability (no other well-known image format has done it, AFAIK).
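    To illustrate the principle (a minimal sketch in C of the generic mid/side idea, not the actual transform libjpeg-9 uses): the forward step stores a difference and a rough average, and both are exactly invertible in integer arithmetic:
    Code:
    #include <stdio.h>

    /* Generic reversible average/difference ("mid/side") step, applied here to
     * two correlated colour samples instead of left/right audio channels.
     * Illustration only -- not the actual libjpeg-9 colour transform. */
    static void ms_forward(int a, int b, int *mid, int *side)
    {
        *side = a - b;             /* difference: small for correlated samples */
        *mid  = b + *side / 2;     /* roughly the average of a and b           */
    }

    static void ms_inverse(int mid, int side, int *a, int *b)
    {
        *b = mid - side / 2;       /* undoes the forward step exactly          */
        *a = *b + side;
    }

    int main(void)
    {
        int r = 200, g = 190, mid, side, r2, g2;

        ms_forward(r, g, &mid, &side);
        ms_inverse(mid, side, &r2, &g2);
        printf("mid=%d side=%d -> r=%d g=%d\n", mid, side, r2, g2);
        return 0;
    }
    The win is that the "side" channel tends to sit near zero for natural images, so it entropy-codes much better than the raw channels do.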

    However, sadly they not only broke the image format, they also broke the ABI of the library. They could have just introduced new functions for the new functionality and upgraded the decoder (keeping the old interface) to handle the new format. That would have allowed introducing support for the new format gradually, letting existing software at least decode the new images. But no, they just had to break the interface, which I guess also means that libjpeg-9 won't spread widely until most of the existing software is adjusted. Yes, a major version bump is normally a good point to break ABI, but reading through the changelog, I just don't see it justified here.
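    Something along these lines is what I mean by keeping the old interface (a sketch with made-up names only, not the real libjpeg API):
    Code:
    /* Sketch of additive API evolution: the existing entry point keeps its
     * signature, and the new coding mode is reached only through a new,
     * optional symbol that new applications opt into.  Every name below is
     * hypothetical, purely for illustration. */
    typedef struct decoder_state { int flags; } decoder_state;

    /* existing, stable entry point -- signature unchanged across versions */
    int decode_image(decoder_state *d, const unsigned char *buf, unsigned long len)
    {
        (void)buf; (void)len;
        return d->flags;              /* placeholder body */
    }

    /* new, additive entry point; old binaries never reference this symbol,
     * so they keep linking and running against the updated library */
    int enable_v9_lossless(decoder_state *d)
    {
        d->flags |= 1;                /* placeholder: switch the new mode on */
        return 0;
    }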

  • ssvb
    replied
    Just found the paper with this lossless coding proposal linked from the Wikipedia page: http://jpegclub.org/temp/JPEG_9_Lossless_Coding.doc
    It has some compression-ratio comparison tables showcasing libjpeg-9 lossless coding allegedly outperforming the competition. I decided to also give WebP a try (using libwebp-0.2.0.tar.gz):
    Code:
    $ wget http://www.r0k.us/graphics/kodak/kodak/kodim01.png
    $ cwebp -m 6 -lossless kodim01.png -o kodim01.webp
    $ ls -l kodim01.webp 
    -rw-r--r-- 1 ssvb ssvb 504672 Jan 14 23:17 kodim01.webp
    The very first file from the set compresses to 504672 bytes with WebP, which seems to be significantly better than the 574K reported for libjpeg-9 on the same file. I have not run the rest of the files through; that might be a good exercise for somebody else.

    In any case, the paper says: "In April/May 2012, a new feature was found and implemented in the IJG software which significantly improves the lossless compression of continuous-tone color images, outperforming other currently popular methods and thus making the new feature very attractive for practical application. A development version with the new feature is currently presented by InfAI Leipzig and IJG, and is planned for release as a new major IJG version 9 in January 2013."
    But for a new lossless image compression method developed in 2012/2013, totally ignoring WebP and maybe some other modern codecs seems a bit unfair, doesn't it?

    Just let the aging JPEG and PNG formats keep providing the best compatibility with existing and future software. That's their best feature today, and they really have nothing else to offer.

  • TheLexMachine
    replied
    Well, I was obviously misinterpreting something else I was looking at regarding the updates, though I am going to look into this SmartScale stuff to see what it brings to the table, if anything. As an owner of a high-end photography setup, I like to see what's going on with the latest stuff.

  • ssvb
    replied
    Originally posted by TheLexMachine View Post
    Based on the development activity, it looks like libjpeg-turbo is about to get an update in the near future to bring it up to par with libjpeg9.
    I'm not so sure about this. Quoting part of the libjpeg-turbo maintainer's message from http://sourceforge.net/mailarchive/f...eg-turbo-users: "As far as adoption, IMHO the only major reason why an x86- or ARM-focused project would prefer the upstream code at this point is that said project wanted to implement SmartScale, and IMHO, without any reproducible metrics demonstrating the usefulness of that format, anyone who's implementing it in their software is doing so only out of a desire to be on the bleeding edge and not for any definable need. My question all along has been: why did the IJG use libjpeg as a platform for this new format? Why didn't they implement it in a completely new library? The only answer I can come up with is that they're trying to put the cart before the horse. libjpeg was originally a reference implementation meant to ease the adoption of an accepted industry standard format. IMHO, the current IJG is, instead, building upon the existing reputation of libjpeg to attempt to drive adoption of a format that has not been accepted as a standard yet. They're further using the existing reputation of libjpeg to distribute propaganda against the very standards committee that has not accepted their new format (and now, since jpeg-8d, they're using their position to distribute propaganda against us as well.) That was never the intent of libjpeg."

    A good feature of the old established image formats such as JPEG or PNG is that they are standardized and compatible with a lot of existing browsers, embedded devices, hardware accelerators, etc. Now libjpeg-9 can create files which are not standard-conforming JPEGs and simply will not work with other compliant implementations. And if we are to invent some new and incompatible format, it really has to compete against the other newcomers (such as WebP).

  • TheLexMachine
    replied
    Originally posted by XorEaxEax View Post
    Very interesting, I wonder if there have been improvements to the lossy compression as well?

    Have to say though, displaying a link which appears to be to the source of the information but which instead links straight back to Phoronix is beyond cheap in my opinion.
    Nothing improved there. The most you can really do with the lossy compression scheme at this point is to speed the JPEG encoding and decoding process up, which is what libjpeg-turbo does with its SSE acceleration. Unfortunately, few programs use libjpeg-turbo from what I've seen. There are incompatibilities between JPEGs created with libjpeg or libjpeg-turbo and image editing/viewing programs that use the other library, so many editing/viewing programs continue to use libjpeg to maintain compatibility, since it's been around longer. Firefox uses libjpeg-turbo for its hardware-accelerated web browsing, and the performance difference between the two is very obvious. Based on the development activity, it looks like libjpeg-turbo is about to get an update in the near future to bring it up to par with libjpeg9.
    Last edited by TheLexMachine; 14 January 2013, 02:50 PM.

  • Michael
    replied
    Originally posted by ov1d1u View Post
    Now I remember why I was using an AdBlocker on Phoronix
    It's an ad bug with the network; refresh and it should go away. It will keep popping up until the ad provider removes the wrong ad tag from the network.

  • TheLexMachine
    replied
    So we've got improvements that are technically good on paper but worthless in practice, and that break older JPEG software decoders, which might mean that such images cannot be displayed by hardware devices such as Blu-ray players, TVs with USB or flash memory ports, and stand-alone media players, which use custom-made JPEG decoders or unspecified versions of libjpeg in their embedded Linux code. Wonderful. I think I'm going to have to test this new code out to see exactly what's going on and what the ramifications are. Here's the right link.
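    One quick way to see which files will trip up older decoders, without relying on any particular libjpeg build, is to check which Start-Of-Frame marker a file declares. A rough sketch in C using only the standard JPEG marker layout (nothing libjpeg-9-specific is assumed here):
    Code:
    #include <stdio.h>

    /* Scan a .jpg file's marker stream and report its Start-Of-Frame type.
     * Typical decoders handle SOF0 (baseline) and often SOF2 (progressive);
     * anything else is a hint the file may not decode on older or embedded
     * implementations. */
    int main(int argc, char **argv)
    {
        FILE *f;
        int c, marker;

        if (argc < 2 || !(f = fopen(argv[1], "rb"))) {
            fprintf(stderr, "usage: %s file.jpg\n", argv[0]);
            return 1;
        }
        if (fgetc(f) != 0xFF || fgetc(f) != 0xD8) {      /* expect SOI first */
            fprintf(stderr, "not a JPEG file\n");
            return 1;
        }
        while ((c = fgetc(f)) != EOF) {
            if (c != 0xFF)
                continue;                                /* resync to a marker */
            while ((marker = fgetc(f)) == 0xFF)
                ;                                        /* skip fill bytes    */
            if (marker == EOF)
                break;
            /* SOF markers are 0xC0..0xCF except DHT (0xC4), JPG (0xC8), DAC (0xCC) */
            if (marker >= 0xC0 && marker <= 0xCF &&
                marker != 0xC4 && marker != 0xC8 && marker != 0xCC) {
                printf("frame type: SOF%d (marker 0xFF%02X)\n", marker - 0xC0, marker);
                break;
            }
            if (marker == 0x01 || (marker >= 0xD0 && marker <= 0xD9))
                continue;                                /* standalone marker  */
            {
                int hi = fgetc(f), lo = fgetc(f);        /* big-endian length  */
                if (hi == EOF || lo == EOF)
                    break;
                fseek(f, (hi << 8 | lo) - 2, SEEK_CUR);  /* skip the payload   */
            }
        }
        fclose(f);
        return 0;
    }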

  • XorEaxEax
    replied
    Very interesting, I wonder if there have been improvements to the lossy compression as well?

    Have to say though, displaying a link which appears to be to the source of the information but which instead links straight back to Phoronix is beyond cheap in my opinion.
