WebKitGTK 2.23.90 Adds Support For JPEG2000, More Touchpad Gestures


    Phoronix: WebKitGTK 2.23.90 Adds Support For JPEG2000, More Touchpad Gestures

    It missed the GNOME 3.32 Beta by a week, but out today is the WebKitGTK 2.23.90 release, the downstream of the WebKit web layout engine focused on GTK integration and used by the likes of GNOME Web (Epiphany)...

  • #2
    If I recall correctly from my compression course at uni, JPEG 2000 is a big improvement over JPEG, both in file size and in quality (and, of course, in the ratio between the two).
    But I thought the issue with JPEG 2000 was its patents, and that those were the sole reason it never became widely adopted. How has WebKit solved that issue?
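    For anyone who wants to sanity-check the size claim locally, here is a minimal sketch assuming Pillow built with OpenJPEG support; "photo.png" is a hypothetical input file:

    # Encode the same image as baseline JPEG and as JPEG 2000, then
    # compare file sizes. Writing .jp2 requires Pillow compiled with
    # OpenJPEG support.
    import os
    from PIL import Image

    img = Image.open("photo.png").convert("RGB")

    img.save("out.jpg", quality=85)            # baseline JPEG
    img.save("out.jp2", quality_mode="rates",  # JPEG 2000 at roughly
             quality_layers=[20])              # 20:1 compression

    for path in ("out.jpg", "out.jp2"):
        print(path, os.path.getsize(path), "bytes")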

    • #3
      Originally posted by Azpegath
      But I thought the issue with JPEG 2000 was its patents, and that those were the sole reason it never became widely adopted. How has WebKit solved that issue?


      • #4
        This seems like a bad idea. JPEG 2000 decoders do not get nearly as much attention as other image decoders, so chances are there are quite a few more vulnerabilities in JPEG 2000 libraries than in JPEG or PNG ones.

        • #5
          Originally posted by Azpegath
          If I recall correctly from my compression course at uni, JPEG 2000 is a big improvement over JPEG, both in file size and in quality (and, of course, in the ratio between the two).
          But I thought the issue with JPEG 2000 was its patents, and that those were the sole reason it never became widely adopted. How has WebKit solved that issue?
          It also turns out that real-world JPEG 2000 encoders, aside from some exotic proprietary ones that never gained much adoption, tend to produce worse results than good JPEG encoders. The main thing JPEG 2000 improves is blocking artifacts, but it trades overall blurriness for that, so I'm not sure the tradeoff is worth it.

          Your uni course probably should have shown you that you cannot predict the quality of a codec from the sophistication and overall goodness of the decoder.
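
          One rough way to check this empirically is to encode the same source at a matched rate with each codec and compare PSNR against the original. A minimal sketch, again assuming Pillow built with OpenJPEG ("photo.png" is a hypothetical input):

          # Decode each output back and measure PSNR against the source.
          import numpy as np
          from PIL import Image

          src = np.asarray(Image.open("photo.png").convert("RGB"), dtype=np.float64)

          def psnr(ref, test):
              mse = np.mean((ref - test) ** 2)
              return 10 * np.log10(255.0 ** 2 / mse)

          Image.open("photo.png").convert("RGB").save("a.jpg", quality=60)
          Image.open("photo.png").convert("RGB").save("a.jp2",
                                                      quality_mode="rates",
                                                      quality_layers=[25])

          for path in ("a.jpg", "a.jp2"):
              dec = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
              print(path, "PSNR = %.2f dB" % psnr(src, dec))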

          • #6
            I just checked out the Midori browser home page. I remember this browser: it was a lightweight browser that used the WebKitGTK engine. Something is really weird about the home page now. The site has a strange marketing feel, and there are lots of spelling and grammar mistakes.

            It links through to this foundation: https://www.astian.org/

            Check the bottom of the page: I think the four people pictured are completely made up. They all look like teenagers paid to do model shots. Their names don't match the pictures, and not all the links provided (e.g. the Facebook links) match the names and pictures either.

            My spidey senses are tingling.

            I was hoping to see that Midori had been revived by a bunch of passionate devs. What do I find? What looks like it could be some fly-by-night company that has taken over a dead software project to inject it with malware. (That's speculation; I don't have time to research further right now.)

            • #7
              Originally posted by microcode
              This seems like a bad idea. JPEG 2000 decoders do not get nearly as much attention as other image decoders, so chances are there are quite a few more vulnerabilities in JPEG 2000 libraries than in JPEG or PNG ones.
              Well, that's a passing problem if the format sees more widespread use. The whole "if you build it"...

              • #8
                Originally posted by microcode
                It also turns out that real-world JPEG 2000 encoders, aside from some exotic proprietary ones that never gained much adoption, tend to produce worse results than good JPEG encoders. The main thing JPEG 2000 improves is blocking artifacts, but it trades overall blurriness for that, so I'm not sure the tradeoff is worth it.

                Your uni course probably should have shown you that you cannot predict the quality of a codec from the sophistication and overall goodness of the decoder.
                Well, I think you're attacking a straw man; I never said such a thing. But anyhow, my point was that the underlying algorithm makes a better result possible, based on the comparisons that I've seen.
                The quality of the current _worst_ implementations cannot be used as a benchmark for the method; that is a bit unfair.

                But your comment on the reduction of blocking artifacts at the cost of overall blurriness is very informative, thank you.

                • #9
                  The main thing JPEG 2000 improves is blocking artifacts, but it trades overall blurriness for that, so I'm not sure the tradeoff is worth it.
                  Not quite; JPEG 2000 supports region-of-interest coding, where less quantization is applied to the portions of the image that are visually important (as determined by the encoder). That is, only the portions of the image that would otherwise become blocky in original JPEG end up blurred in JPEG 2000, not the whole image.
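
                  One way the standard achieves this (the MAXSHIFT method) is by scaling the ROI's wavelet coefficients up before quantization, so the same quantizer keeps more precision there. A toy sketch of the scaling idea only, not a real codec; the array sizes, shift, and step size below are made up for illustration:

                  import numpy as np

                  coeffs = np.random.randn(8, 8) * 100   # stand-in wavelet coefficients
                  roi = np.zeros((8, 8), dtype=bool)
                  roi[2:5, 2:5] = True                   # hypothetical region of interest

                  shift = 4    # ROI coefficients are upshifted by 4 bits
                  step = 32.0  # uniform quantization step size

                  # Encoder: upshift ROI samples, then quantize everything uniformly.
                  scaled = np.where(roi, coeffs * 2**shift, coeffs)
                  quantized = np.round(scaled / step)

                  # Decoder: dequantize, then undo the shift for ROI samples, which
                  # recovers them at an effectively 2**shift finer step.
                  recovered = np.where(roi, quantized * step / 2**shift, quantized * step)

                  print("mean abs error, ROI:", np.abs(recovered - coeffs)[roi].mean())
                  print("mean abs error, background:", np.abs(recovered - coeffs)[~roi].mean())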

                  • #10
                    Originally posted by Azpegath
                    Well, I think you're attacking a straw man; I never said such a thing. But anyhow, my point was that the underlying algorithm makes a better result possible, based on the comparisons that I've seen.
                    Sorry if I came off as passive-aggressive, or like I was criticizing you.

                    Originally posted by Azpegath
                    The quality of the current worst implementations cannot be used as a benchmark for the method; that is a bit unfair.
                    Yeah, I agree, but the median JP2 file is made with an encoder that produces worse end results most of the time than the median JPEG encoder. A similar (probably temporary, in this case) situation exists with AV1 right now: it is generally impractical to use the libaom AV1 encoder unless you are Google and can allocate literally ten thousand machines to a single video for many hours just to try things out, so the median AV1 frame is encoded with rav1e. rav1e beats x264 by quite a margin, but it currently produces worse quality than VP9 from libvpx (which is where the median VP9 frame comes from).

                    Furthermore, in the case of JPEG, all of the very best encoders are free and open source, whereas passable JPEG 2000 encoders are relatively expensive.

                    Originally posted by Azpegath
                    But your comment on the reduction of blocking artifacts at the cost of overall blurriness is very informative, thank you.
