Google's Jpegli Offers ~35% Compression Improvement For High Quality JPEGs


  • #21
    Originally posted by AmericanLocomotive View Post
    We all understand that large companies have different divisions, often embarking on different research projects. But those divisions are usually being driven in the same direction. In Google's case, it seems the different divisions are all over the place, going in all different directions, all the time. Google will kill off a product, and then spin up a near-identical product a couple of months later with a different name and a slightly different feature set.

    Remember at one point, 4-5 years ago, Google had like three different remote-video-conferencing products at the same time? ...and the one they stuck with (Google Meet) had the worst feature set out of all of them.
    Remember, this is Google we're talking about! They're into R&D of ADVANCED computing technologies AT SCALE.
    So they just use R&D approaches that can SCALE to handle NP-hard problems, WHATEVER the cost.

    They started out with a roadmap to ramp their literal(!) room full of monkeys with keyboards to one-googol (the number) primate-bits scale.
    Sure, there's no actual uniform direction to what their individual monkey-developers are working on; that's the point! --
    They could repeat themselves, duplicate effort, whatever, but they're just as likely to randomly go and try something truly unexpected and groundbreaking!

    Not only will they eventually find A GOOD solution, they'll find EVERY good (and bad!) solution, and in the meantime we'll be unwilling, unwitting alpha-testers of them all!

    Then as technology advanced, they've started embracing quantum computing and now have data centers full of quantum-nano-monkeys working alongside Maxwell's demons!

    Again, yes, it seems like they're working on any given problem in all different for-x, against-x, sometimes-x, somewhat-x directions, but that's the beauty of the wavefunction! When they finally get to trying to solve every possible problem in every possible way simultaneously, the wavefunction will collapse and, like Dijkstra's shortest-path algorithm, the ONE TRUE SOLUTION (you know, to life, the universe, and everything) will emanate!

    See, that is the beauty of THE GOOGLE!

    Comment


    • #22
      Originally posted by JEBjames View Post

      I'm not sure I'm understanding this right...

      They are using some JPEG XL magic under the hood... but only in the encoder. They keep the SAME plain-vanilla, 100% compatible JPEG format as the output? i.e. better compression/quality like JPEG XL... but the same JPEG format? Only the compressor part needs to be changed?

      All existing JPEG software will be able to open these files because it's "the same old jpeg format"?

      Am I totally missing something? This sounds way better than adding yet another file format? What's the downside here?
      It's more or less just coding stuff that was lifted from libjxl; it's not specific to JXL in any way. mozjpeg is good, but it's not the best thing in the world. The downside is that you gain none of the JXL feature set: things like lossless encoding (which is actually really good with JXL), XYB under the hood (though you can get some of the benefit using an ICC profile), support for larger image sizes, multiple layers, animation, and more. Also, it's still not quite as good as JPEG XL when it comes to compression.
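      To make the compatibility point from the question concrete, here's a minimal sketch (assuming the cjpegli binary from a libjxl build is on PATH and input.png exists; file names are illustrative) that encodes with jpegli and then opens the result with an ordinary JPEG decoder:

      ```python
      # Encode with jpegli's cjpegli CLI, then decode with a stock JPEG
      # decoder (Pillow) to show the output is a plain .jpg file.
      import subprocess
      from PIL import Image

      subprocess.run(["cjpegli", "input.png", "jpegli_out.jpg", "-q", "90"],
                     check=True)

      # No JXL support needed on the decode side.
      with Image.open("jpegli_out.jpg") as im:
          print(im.format, im.size, im.mode)  # e.g. JPEG (800, 600) RGB
      ```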

      Comment


      • #23
        Originally posted by AmericanLocomotive View Post
        So Google is trying to bring some Jpeg-XL technologies back to regular Jpeg, after they basically killed Jpeg-XL.

        I'd also like to point out that Google's own office suite (Docs, Slides, Sheets, etc.) STILL does not support Google's own WebP format. So you can't even make a lightweight slide presentation using Google's own image format, which has been out for 6+ years now.

        I swear this company has no real direction or guidance. It just seems to be thousands of different teams all doing their own thing, and when one team starts costing Google too much money, they kill it off.
        As a former Google employee (I left circa 2017), I can tell you for a fact that that's exactly how it works. Each major engineering area is basically self-driven, each competing for budget/resources while trying not to call too much attention to itself (as that usually causes bad things to happen, like layoffs/reorgs/etc.).

        Comment


        • #24
          All this negativity for such a big improvement with no downsides. JPEG is the most-used format on the web; everything is compatible with it. Imagine MP3 getting a 30% boost.

          Back in the day when I built websites, I used tools like optipng to squeeze out much less than 30%; for JPEG there wasn't really much you could do apart from stripping unnecessary metadata and sacrificing quality.
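          That old workflow amounts to something like this minimal sketch (assuming Pillow; the file names are made up, and the quality setting is the "sacrificing quality" part):

          ```python
          # Strip metadata and re-save with optimized Huffman tables.
          from PIL import Image

          with Image.open("photo.jpg") as im:
              # exif=b"" drops the EXIF block; optimize=True asks libjpeg
              # for optimized entropy tables, a small lossless win.
              im.save("photo_slim.jpg", format="JPEG",
                      quality=85, optimize=True, exif=b"")
          ```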

          Comment


          • #25
            Originally posted by ElectricPrism View Post
            We need a joke of a file format that makes a 0% effort to be compressed -- like __Bitmap Master Race__ except with Alpha and some other features.

            Then make the files too large for AI's to process without crashing, and make it harder for 3rd parties to effectively copy the files.

            As an added bonus, for Chrome being a total boner about JPEG-XL, we should destroy their place in the world as Overlords of the Internet. Anti-trust lawsuits by the EU, a 3rd-option browser __not funded by Google__ -- a complete shift of techies away from HTTP/S to GEMINI or some other alternative that busts their balls by design. Whatever it takes.
            Wow, you are so desperate, yet thick as a brick. The whole internet is based first and foremost on HTTP/S. Besides the fact that it's just impossible to move the whole internet to anything else, you'd only screw every browser that's not based on Chromium. If you were switching to anything else, it would need to be standardized first, which is a long and tedious process. And when Google knows a technology will take over in the foreseeable future, they will be the first to implement it in an experimental state. Google just has the manpower to implement anything they want. Apple may have it too, just not the will to use it. But Firefox, not to mention any other competition, doesn't. They will be the last to support such huge changes.

            Comment


            • #26
              Originally posted by mirmirmir View Post
              What do we say about this, boys?
              Can one officially call shenanigans in this situation?

              Comment


              • #27
                Originally posted by AmericanLocomotive View Post
                20-35% better compression ratio is a huge amount - like monumental. Imagine telling someone at Instagram or Imgur that they could essentially increase their bandwidth limits and storage capacity by ~25% for free, without losing any image quality, just by switching to JPEG XL.
                Don't get me wrong, I want JPEG XL to succeed. But JPEG is 30 years old (1992); 30% isn't that much, objectively. I was promised 50% countless times with JPEG 2000, WebP and others. If you compare AV1 to MPEG-1 (1991), you'd laugh at 30%. I'd say even Opus (2012) is >30% better than MP3 (1991). To be fair, encoders today are generally more advanced. If people invested time into an awesome MP3 encoder, I'm sure they'd squeeze some 10% from it as well.

                And I'm especially disappointed in that 30%, because >10% of it comes from better entropy coding that is actually in the old JPEG spec; it just isn't used because of (long-expired) patents. The original JPEG standard was just that good.
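                You can measure that entropy-coding headroom yourself; a minimal sketch, assuming libjpeg-turbo's jpegtran is installed and photo.jpg is an ordinary Huffman-coded JPEG (the file name is made up):

                ```python
                # Losslessly re-encode with the arithmetic coder from the
                # original JPEG spec and compare file sizes.
                import os
                import subprocess

                subprocess.run(["jpegtran", "-arithmetic",
                                "-outfile", "photo_arith.jpg", "photo.jpg"],
                               check=True)

                before = os.path.getsize("photo.jpg")
                after = os.path.getsize("photo_arith.jpg")
                print(f"{before} -> {after} bytes "
                      f"({100 * (before - after) / before:.1f}% smaller)")
                ```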

                I still hope Google changes their mind and integrates XL. But I bet if Facebook+Insta+... pushed Google to integrate XL, they would. The cost of encoding and storing all images twice for compatibility (which they'd have to do for the next >10 years if XL were supported today) is non-negligible as well.

                Comment


                • #28
                  Originally posted by Mathias View Post
                  If people invested time into an awesome MP3 encoder, I'm sure they'd squeeze some 10% from it as well.
                  They have, though? lame -V 4 (~165 kbps) is nearly transparent. You needed -V 2 (~190 kbps) for that a couple of decades ago, and even then it was a lot better than encoders from a few years prior.
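                  Easy enough to check at home; a minimal sketch, assuming the lame CLI is installed and track.wav exists (names are illustrative), that encodes both presets for a side-by-side listen:

                  ```python
                  # Encode the same source at both VBR presets and compare sizes.
                  import os
                  import subprocess

                  for preset in ("4", "2"):
                      out = f"track_v{preset}.mp3"
                      subprocess.run(["lame", "-V", preset, "track.wav", out],
                                     check=True)
                      print(out, os.path.getsize(out), "bytes")
                  ```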

                  Comment


                  • #29
                    Originally posted by Delta_44 View Post
                    Useless.
                    Just use AVIF
                    AVIF comes from a video codec (AV1). I'll use it for animations, but for images and photos JPEG XL is better, ideally with lossless compression.
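                    For the lossless case, a minimal sketch assuming libjxl's cjxl CLI is installed (photo.png is made up); -d 0 requests distance 0, i.e. mathematically lossless pixels:

                    ```python
                    # Losslessly encode a PNG to JPEG XL and report the size change.
                    import os
                    import subprocess

                    subprocess.run(["cjxl", "photo.png", "photo.jxl", "-d", "0"],
                                   check=True)
                    print(os.path.getsize("photo.png"), "->",
                          os.path.getsize("photo.jxl"), "bytes")
                    ```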

                    Comment


                    • #30
                      Originally posted by andyprough View Post
                      So now Google wants to tell us all about the benefits of jpeg-xl? How rich. And how very google.
                      They just wanted to tell us about their own JPEG XL. It seems that if it's not Google's, it doesn't matter.

                      Comment
