Guetzli: Google Rolls Out A New JPEG Encoder


  • #11
    One wonders why another JPEG compressor is needed...

    Comment


    • #12
      Originally posted by dh04000
      One wonders why another jpeg compressor is needed...
      As the first paragraph of the article says,

      Google has announced Guetzli, not a German cookie, but rather a new open-source algorithm for creating high-quality JPEGs that are 35% smaller than currently available methods.
      (emphasis added)

      Comment


      • #13
        So is this anything more significant than a different quantization table?

        Comment


        • #14
          Originally posted by dh04000
          One wonders why another jpeg compressor is needed...
          And is this the second one from Google alone that does this, at about 30% smaller than normal for an order of magnitude more processing?

          Comment


          • #15
            So, at an order of magnitude slower, that would extend compression on the types of JPEG images I deal with from 'a flicker' to 'a blink'. For 35% better compression, that sounds good to me!

            Comment


            • #16
              As a person living in Switzerland, I find the name "Guetzli" for this algorithm deeply embarrassing. Maybe comparable to an algorithm that would be called ho-di-ho-aye-ho. If you consider that Google moved some development to Switzerland purely for tax reasons, that makes it all the more embarrassing. Sorry for being Swiss is all that comes to my mind.

              Comment


              • #17
                I can remember in the early days of JPEG, back when CPUs were so much more wimpy than they are today, a company called Storm Technologies released a hardware card to do JPEG encoding/decoding. They also invented a feature they called “JPEG++”, which, as I understood it, dynamically varied the compression over the image, to lose a few more bits where it wouldn’t be noticed, and put back a few more where it would, to try to achieve a better compression ratio overall.
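The JPEG++ idea described above, spending fewer bits where detail won't be missed and more where it would, can be sketched as block-wise adaptive quantization. A minimal illustration (all names and thresholds here are mine, not Storm Technologies' actual scheme):

```python
# Rough sketch of region-adaptive quantization in the spirit of "JPEG++":
# flat 8x8 blocks get a coarser quantizer (fewer bits), busy blocks a
# finer one. Illustrative only; not Storm's actual algorithm.

def block_variance(block):
    """Variance of one 8x8 block (list of 64 pixel values)."""
    mean = sum(block) / len(block)
    return sum((p - mean) ** 2 for p in block) / len(block)

def choose_quant_scale(block, low=100.0, high=1000.0):
    """Map local activity to a quantizer scale; a smaller scale means
    finer quantization (more bits spent on that block)."""
    v = block_variance(block)
    if v < low:
        return 4.0   # flat region: quantize coarsely, spend few bits
    if v < high:
        return 2.0   # moderate detail
    return 1.0       # busy region: quantize finely, spend more bits

# A flat block vs. a high-contrast checkerboard-like block:
flat = [128] * 64
busy = [0, 255] * 32
print(choose_quant_scale(flat))  # 4.0 (coarse)
print(choose_quant_scale(busy))  # 1.0 (fine)
```

A real encoder would feed the chosen scale into the DCT quantization step per block; this just shows the "vary compression over the image" decision itself.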

                Comment


                • #18
                  Originally posted by cynic
                  I read somewhere that it requires huge amounts of memory for such encoding
                  807 kB → 676 kB
                  400 MB RAM
                  Only one core used and slow, slow, slow.
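For what it's worth, a quick back-of-envelope check on those numbers (taking the units at face value) puts that particular result at roughly a 16% saving, well short of the headline 35%:

```python
# Quick check on the figures quoted above (807 kB in, 676 kB out).
before_kb = 807
after_kb = 676
saving = (before_kb - after_kb) / before_kb
print(f"{saving:.1%} smaller")  # 16.2% smaller
```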

                  Comment


                  • #19
                    So what, you have all the time in the world to encode a JPEG server-side!
                    If it takes 2 GB of RAM and 10 seconds of processing, once, to get a 0.5 second page-load reduction for 80% of mobile users, it's worth it IMHO.
                    The bandwidth savings are noteworthy as well, especially if you are as big as Google!
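That trade-off is easy to quantify: the encode cost is paid once, while the saving accrues on every page view, so it pays for itself after a handful of views. Plugging in the figures from the comment above (purely illustrative numbers):

```python
# One-time encode cost vs. cumulative load time saved,
# using the figures from the comment above.
encode_cost_s = 10.0      # paid once, server-side
saving_per_view_s = 0.5   # page-load reduction per affected view
mobile_share = 0.8        # fraction of views that benefit

# Views needed before total user time saved exceeds the encode cost:
break_even_views = encode_cost_s / (saving_per_view_s * mobile_share)
print(break_even_views)  # 25.0
```

Twenty-five page views to break even on time alone, before counting the bandwidth saved on every single view thereafter.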

                    Comment


                    • #20
                      Originally posted by prazola

                      807 kB → 676 kB
                      400 MB RAM
                      Only one core used and slow, slow, slow.
                      What you are saying is that there is room for improvement in encoding speed. Cool! In the near(ish) future we can expect smaller JPEGs, faster!

                      Comment
