Libjpeg 9 Does Better Lossless JPEG Compression


  • Libjpeg 9 Does Better Lossless JPEG Compression

    Phoronix: Libjpeg 9 Does Better Lossless JPEG Compression

    Version 9 of the libjpeg library from the Independent JPEG Group has been released. This version of the JPEG library is said to noticeably improve the lossless JPEG compression support even to the point that libjpeg can now output compressed lossless JPEGs of smaller size than PNG images...


  • #2
    Please fix the link; there's a www.phoronix.com before the real URL :P



    • #3


      Now I remember why I was using an AdBlocker on Phoronix



      • #4
        Very interesting. I wonder if there have been improvements to the lossy compression as well?

        I have to say, though, that displaying a link which appears to lead to the source of the information but which instead links straight back to Phoronix is beyond cheap, in my opinion.



        • #5
          So we've got improvements that look good on paper but are worthless in practice, and that break older JPEG software decoders. That might mean such images cannot be displayed by hardware devices such as Blu-ray players, TVs with USB or flash-memory ports, and stand-alone media players, which use custom-made JPEG decoders or unspecified versions of libjpeg in their embedded Linux code. Wonderful. I think I'm going to have to test this new code to see exactly what's going on and what the ramifications are. Here's the right link.
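          One rough way to check whether such files will upset older decoders is sketched below: inspect which SOF (Start Of Frame) marker a file carries. Baseline JPEGs use FFC0, progressive ones FFC2, and arithmetic-coded ones FFC9/FFCA; a marker an embedded decoder doesn't recognize is a red flag. This is only a byte-level check, not full bitstream validation, and the synthesized sample.jpg below merely stands in for a real file from the new encoder.

```shell
# List the SOF (Start Of Frame) markers found in a JPEG file.
# FFC4 (DHT), FFC8 (reserved) and FFCC (DAC) fall in the FFCx range
# but are not frame markers, so they are filtered out.
sof_markers() {
    od -An -v -tx1 "$1" | tr -s ' ' '\n' | grep -v '^$' |
        awk '$0 ~ /^c[0-9a-f]$/ && prev == "ff" && $0 != "c4" && $0 != "c8" && $0 != "cc" { print "ff" $0 } { prev = $0 }'
}

# No real JPEG ships with this sketch, so synthesize the first bytes of
# a baseline file (SOI + SOF0 markers) just to exercise the check.
printf '\xff\xd8\xff\xc0' > sample.jpg
sof_markers sample.jpg   # prints: ffc0
```

          Comparing the output for a stock baseline encode against a file produced with the new lossless mode should make any frame-marker difference visible at a glance.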



          • #6
            Originally posted by ov1d1u View Post


            Now I remember why I was using an AdBlocker on Phoronix
            It's an ad bug with the network; refreshing should make it go away until the ad provider removes the wrong ad tag from the network.
            Michael Larabel
            https://www.michaellarabel.com/



            • #7
              Originally posted by XorEaxEax View Post
              Very interesting. I wonder if there have been improvements to the lossy compression as well?

              I have to say, though, that displaying a link which appears to lead to the source of the information but which instead links straight back to Phoronix is beyond cheap, in my opinion.
              Nothing improved there. The most you can really do with the lossy compression scheme at this point is to speed up JPEG encoding and decoding, which is what libjpeg-turbo does with its SSE acceleration. Unfortunately, few programs use libjpeg-turbo from what I've seen. There are incompatibilities between JPEGs created with libjpeg and libjpeg-turbo and the image editing/viewing programs that use one or the other, so many editing/viewing programs continue to use libjpeg to maintain compatibility, since it's been around longer. Firefox uses libjpeg-turbo for its hardware-accelerated web browsing, and the performance difference between the two is very obvious. Based on the development activity, it looks like libjpeg-turbo is about to get an update in the near future to bring it up to par with libjpeg9.
              Last edited by TheLexMachine; 14 January 2013, 02:50 PM.



              • #8
                Originally posted by TheLexMachine View Post
                Based on the development activity, it looks like libjpeg-turbo is about to get an update in the near future to bring it up to par with libjpeg9.
                I'm not so sure about this. Quoting a part of libjpeg-turbo maintainer's message from http://sourceforge.net/mailarchive/f...eg-turbo-users: "As far as adoption, IMHO the only major reason why an x86- or ARM-focused project would prefer the upstream code at this point is that said project wanted to implement SmartScale, and IMHO, without any reproducible metrics demonstrating the usefulness of that format, anyone who's implementing it in their software is doing so only out of a desire to be on the bleeding edge and not for any definable need. My question all along has been: why did the IJG use libjpeg as a platform for this new format? Why didn't they implement it in a completely new library? The only answer I can come up with is that they're trying to put the cart before the horse. libjpeg was originally a reference implementation meant to ease the adoption of an accepted industry standard format. IMHO, the current IJG is, instead, building upon the existing reputation of libjpeg to attempt to drive adoption of a format that has not been accepted as a standard yet. They're further using the existing reputation of libjpeg to distribute propaganda against the very standards committee that has not accepted their new format (and now, since jpeg-8d, they're using their position to distribute propaganda against us as well.) That was never the intent of libjpeg."

                A good feature of old, established image formats such as JPEG and PNG is that they are standardized and compatible with a lot of existing browsers, embedded devices, hardware accelerators, etc. Now libjpeg-9 can create files which are not standard-conforming JPEGs and simply will not work with other, compliant implementations. And if we are to invent a new and incompatible format, it really has to compete against the other newcomers (such as WebP).



                • #9
                  Well, I was obviously misinterpreting something else I was looking at regarding the updates, though I am going to look into this SmartScale stuff to see what, if anything, it brings to the table. As an owner of a high-end photography setup, I like to see what's going on with the latest stuff.



                  • #10
                    Just found the paper with this lossless coding proposal linked from the wikipedia page: http://jpegclub.org/temp/JPEG_9_Lossless_Coding.doc
                    It has some compression-ratio comparison tables, in which libjpeg-9 lossless coding allegedly outperforms the competition. I decided to also give WebP a try (using libwebp-0.2.0.tar.gz):
                    Code:
                    $ wget http://www.r0k.us/graphics/kodak/kodak/kodim01.png
                    $ cwebp -m 6 -lossless kodim01.png -o kodim01.webp
                    $ ls -l kodim01.webp 
                    -rw-r--r-- 1 ssvb ssvb 504672 Jan 14 23:17 kodim01.webp
                    The very first file from the set compresses to 504672 bytes with WebP, which seems to be significantly better than the 574K reported for libjpeg-9 on the same file. I have not run the rest of the files through; that might be a good exercise for somebody else.
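                    To put that one data point in perspective, here is the same comparison as a percentage (assuming the paper's 574K means 574 × 1024 bytes; if it means 574,000 the gap is slightly smaller):

```shell
# Compare the measured WebP size against the 574K libjpeg-9 figure
# reported in the paper for the same image (574K taken as 574 * 1024).
webp=504672
jpeg9=$((574 * 1024))
awk -v a="$webp" -v b="$jpeg9" \
    'BEGIN { printf "WebP is %.1f%% smaller (%d vs %d bytes)\n", (1 - a / b) * 100, a, b }'
# prints: WebP is 14.1% smaller (504672 vs 587776 bytes)
```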

                    In any case, the paper says: "In April/May 2012, a new feature was found and implemented in the IJG software which significantly improves the lossless compression of continuous-tone color images, outperforming other currently popular methods and thus making the new feature very attractive for practical application. A development version with the new feature is currently presented by InfAI Leipzig and IJG, and is planned for release as a new major IJG version 9 in January 2013."
                    But for a new lossless image compression method developed in 2012/2013, totally ignoring WebP and perhaps other modern codecs seems a bit unfair, doesn't it?

                    Just let the aging JPEG and PNG formats keep providing the best compatibility with existing and future software. That's their best feature today, and they really have nothing else to offer.
