Mozilla Eyes Removal Of Theora Support In Firefox


  • #21
    Originally posted by pong View Post

    And JPG-XL and ...
    I know this is probably a joke but,

    jpegxl already has a pretty competent Rust decoder; I've even used it in a couple of personal projects: https://github.com/tirr-c/jxl-oxide. It does have some rather large limitations, like no threading (though it does have SIMD), and the decoder will happily run until it hits an OOM, but other than that it's actually pretty fast for what it is, and really flexible. AV1 has a port of dav1d to Rust (https://github.com/memorysafety/rav1d) that's doing pretty well here (and yes, it's a port of dav1d: the code was transpiled, not rewritten from the ground up). Vorbis and many other audio and image codecs have open-source Rust decoders as well.

    I'm pretty sure that in the future we will see browsers and other security-conscious applications requiring decoders be written in Rust, Zig, Go, etc. to be considered for inclusion. Certainly not now, and maybe not soon, but eventually I can easily see it.



    • #22
      Originally posted by schmidtbag View Post
      I don't understand the point of removing it. I'm sure the codebase for it is quite minimal (I have doubts that there's even anything explicitly mentioning Theora in Firefox's code) and if it works, why not just leave it? If it stops working and nobody is there to notice, does it matter?
      I'm all for removing old unnecessary crap but not when the existence of the feature is inconsequential.
      Here is what was said when removing the Theora codec from the open-source Dæmon game engine (the one used by the Unvanquished game) 3 years ago:

      The immediate motivation for this arose because while updating the external_deps build, I noticed the Theora project seems kind of dead. When compiling, I saw a GCC warning that the decoder code indexes out of bounds. Although this was fixed in 2012, there hasn't been a new release since 2010, so we didn't have the fix.
      The problem with Theora is that the library is dead and there is no usable release. The last release was 10 years ago and contains the very bad out-of-bounds indexing bug. This is blocking me from creating a new build for the external_deps libraries (which will fix the macOS Catalina crashing bug, among others).
      I guess one big concern is for Wikipedia, which uses a lot of OGV Theora videos. Killing Theora will also effectively kill OGV, since OGV only officially supports Theora as a video codec as far as I know. WebM with VP8 or VP9 is better anyway, and they likely have the hardware infrastructure to re-encode the videos.
      Last edited by illwieckz; 02 November 2023, 05:08 AM.



      • #23
        Thanks for citing the context. That is sad, and bad release hygiene with respect to security / quality bugs on Theora's part, so I can see why they'd consider it abandonware.
        I would think the "minimum standard" for actively shipped code (in the browser or in Theora itself) should be passing any high-quality, security-oriented static code analysis tests that may evolve to flag potential error cases in the code. That's an ever-mutating target as the SCA tool checks get better over time, so some ongoing clean-up / lint maintenance is typically needed.
        It is sad they had a serious index OOB error to begin with, though. With a moderate amount of care that should have been avoided by manual consistency / safety checks put in by the original devs, who shouldn't have needed an SCA lint to tell them to religiously guard against OOB array indexing in C/C++ in all cases.

        Re: Wikipedia et al. site support / risk of breakage --
        when Chrome announced dropping Theora, there was a comment in that Phoronix forum thread saying that Wikipedia already (always has?) wraps its Theora media playback in this OSS player, which is already multi-platform and implements a software decoder that doesn't depend on the browser's built-in Theora support (which isn't always, or maybe ever, used / needed with this software-decoder player AFAICT).
        So because of the player component below, deployed from the web server, any website has a potentially easy way to keep serving Theora videos successfully to mobile and multi-platform desktop browser clients; I assume basic JS / HTML5 support is all that's needed on the browser side, though I didn't delve into the dependencies / system requirements.

        I could be paraphrasing / inferring some of that incorrectly from memory; q.v. the comments in the other Chrome/Theora thread, the OSS player, and Wikipedia's setup if you want to verify whether there are gaps / support issues etc. at a deeper level.

        ogv.js -- JavaScript media player using Ogg/Vorbis/Theora/Opus/WebM libs compiled with Emscripten: https://github.com/bvibber/ogv.js
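
        As a minimal sketch of that fallback approach (TypeScript; this assumes the ogv.js script is loaded and exposes the OGVPlayer class described in its README, and "clip.ogv" is a placeholder URL):

        Code:
        // Probe native Theora support; fall back to the ogv.js JS/WASM player if absent.
        declare class OGVPlayer extends HTMLElement {
          src: string;
          play(): void;
        }

        const probe = document.createElement('video');
        if (probe.canPlayType('video/ogg; codecs="theora, vorbis"') !== '') {
          const video = document.createElement('video'); // native decode path
          video.src = 'clip.ogv'; // placeholder URL
          video.controls = true;
          document.body.appendChild(video);
        } else {
          const player = new OGVPlayer(); // mimics the <video> element API
          player.src = 'clip.ogv';
          document.body.appendChild(player);
          player.play();
        }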


        Originally posted by illwieckz View Post

        Here is what was said when removing the Theora codec from the open-source Dæmon game engine (the one used by the Unvanquished game) 3 years ago:

        I guess one big concern is for Wikipedia, which uses a lot of OGV Theora videos. Killing Theora will also effectively kill OGV, since OGV only officially supports Theora as a video codec as far as I know. WebM with VP8 or VP9 is better anyway, and they likely have the hardware infrastructure to re-encode the videos.



        • #24
          Thanks for the information!
          Actually it was ignorance of the details, not a joke; my first reaction to seeing the outcry over Chrome dropping JPEG-XL was:
          "OK, yeah, but isn't there an easy / performant / good WASM / Rust / JavaScript / whatever work-around to decode / render / transcode JPEG-XL for browsers that don't support it?", since that does seem like a practical and obvious alternative.
          I hadn't seen anyone mention the existence of good Rust / WASM / whatever decoders, but if such existed it seemed an inexplicable gap not to have been filled already.

          Your comment about the lack of threading is interesting, and indeed a noteworthy limitation depending on what exactly the lack entails. My initial gut reaction is that if it has SIMD and efficient single-threaded performance, then a large percentage of JPEG-XL files should be decodable serially with SIMD (or even, to some extent, without it, e.g. on non-AVX-512 CPUs) quickly enough to be mostly suitable; by comparison, plain JPEGs were decoded on single-threaded, single-core, non-GPGPU platforms for ~20 years before good multi-core CPUs and heavily multithreaded browsers became the client norm. The overhead of launching and managing multiple decoder threads for a single image is probably not worth it except for medium-to-large or high-complexity images.

          I hope the browser can launch different threads to decode different images on the same page / different pages simultaneously, though, since that would be a much more significant limitation if absent. I also hope the decoding thread is just some worker thread and not the main UI / browser-engine thread, which would make EVERYTHING stutter / delay.
          OTOH there are some producers of HUGE images (science imagery data sets, NASA/... composites, geographical data, visualizations, whatever is spitting out >> 8k*8k images), and for those I'd probably want MT + SIMD + GPU-accelerated decode, indeed.
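
          On the worker-thread point: the standard way to keep decoding off the UI thread is a Web Worker; a minimal sketch (TypeScript; "decode-worker.js" and its decodeJxl() function are hypothetical stand-ins for a WASM-backed decoder):

          Code:
          // Main thread: fetch the bytes, transfer them to a worker, paint the result.
          const worker = new Worker('decode-worker.js'); // hypothetical worker script

          async function showImage(url: string, canvas: HTMLCanvasElement): Promise<void> {
            const bytes = await (await fetch(url)).arrayBuffer();
            worker.onmessage = (e: MessageEvent) => {
              const { width, height, rgba } = e.data; // raw RGBA pixels from the worker
              canvas.width = width;
              canvas.height = height;
              const ctx = canvas.getContext('2d')!;
              ctx.putImageData(new ImageData(new Uint8ClampedArray(rgba), width, height), 0, 0);
            };
            worker.postMessage({ bytes }, [bytes]); // transfer the buffer, don't copy it
          }

          // decode-worker.js would be roughly:
          //   self.onmessage = async (e) => {
          //     const { width, height, rgba } = await decodeJxl(e.data.bytes); // hypothetical
          //     self.postMessage({ width, height, rgba }, [rgba]);
          //   };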

          I wonder whether using the WASM sandbox-related APIs directly is nice for implementing portable decoders (e.g. multi-browser, and usable outside browsers), or whether some kind of wrapper API layer / "design pattern" above the WASM sandbox API is a better idea, to make the API adaptations and abstractions suit different sandboxes, browser / non-browser cases, and different higher-layer client API interfaces. Codecs should be as portable / plug-and-play as possible, without too much painful glue logic to integrate them.

          Originally posted by Quackdoc View Post
          I know this is probably a joke but,

          jpegxl already has a pretty competent Rust decoder; I've even used it in a couple of personal projects: https://github.com/tirr-c/jxl-oxide. It does have some rather large limitations, like no threading (though it does have SIMD), and the decoder will happily run until it hits an OOM, but other than that it's actually pretty fast for what it is, and really flexible. AV1 has a port of dav1d to Rust (https://github.com/memorysafety/rav1d) that's doing pretty well here (and yes, it's a port of dav1d: the code was transpiled, not rewritten from the ground up). Vorbis and many other audio and image codecs have open-source Rust decoders as well.

          I'm pretty sure that in the future we will see browsers and other security-conscious applications requiring decoders be written in Rust, Zig, Go, etc. to be considered for inclusion. Certainly not now, and maybe not soon, but eventually I can easily see it.



          • #25
            The word "Eyes" being right above the big eye in the theora image keeps tripping my brain up.



            • #26
              Theora was never widely used. Wikipedia probably accounts for 95% of all (still-viewed) Theora videos on the web, and those are usually 720p and below. Using a WASM decoder for this legacy stuff is not a huge issue. On the other hand, a next-generation codec (image/video/audio) *has* to be as fast as possible. There is a JXL WASM decoder, but it is super slow compared to native code. Also, every website using JXL would have to ship the decoder WASM, so you quickly diminish any size advantage over plain JPEG. (OK, using a big CDN will mostly eliminate that size overhead, but the performance will still suffer.)
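
              On the shipping-cost point, the usual mitigation is to load the decoder lazily and only for browsers that need it; a sketch (TypeScript; the probe file, CDN URL, and decodeJxl export are all hypothetical placeholders):

              Code:
              // Detect native JXL support by asking the browser to decode a tiny probe file.
              async function supportsJxl(): Promise<boolean> {
                try {
                  const blob = await (await fetch('/probe.jxl')).blob(); // placeholder probe image
                  await createImageBitmap(blob); // rejects if the format can't be decoded
                  return true;
                } catch {
                  return false;
                }
              }

              async function ensureJxlDecoder(): Promise<void> {
                if (await supportsJxl()) return; // native support: ship zero extra bytes
                // Placeholder CDN URL; a shared, cacheable copy amortizes the download cost.
                const { decodeJxl } = await import('https://cdn.example.com/jxl-decoder.js');
                // ...then rewrite <img src="*.jxl"> elements onto canvases via decodeJxl()...
              }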



              • #27
                Originally posted by pong View Post
                Thanks for the information!
                Actually it was ignorance of the details, not a joke; my first reaction to seeing the outcry over Chrome dropping JPEG-XL was:
                "OK, yeah, but isn't there an easy / performant / good WASM / Rust / JavaScript / whatever work-around to decode / render / transcode JPEG-XL for browsers that don't support it?", since that does seem like a practical and obvious alternative.
                I hadn't seen anyone mention the existence of good Rust / WASM / whatever decoders, but if such existed it seemed an inexplicable gap not to have been filled already.

                Your comment about the lack of threading is interesting, and indeed a noteworthy limitation depending on what exactly the lack entails. My initial gut reaction is that if it has SIMD and efficient single-threaded performance, then a large percentage of JPEG-XL files should be decodable serially with SIMD (or even, to some extent, without it, e.g. on non-AVX-512 CPUs) quickly enough to be mostly suitable; by comparison, plain JPEGs were decoded on single-threaded, single-core, non-GPGPU platforms for ~20 years before good multi-core CPUs and heavily multithreaded browsers became the client norm. The overhead of launching and managing multiple decoder threads for a single image is probably not worth it except for medium-to-large or high-complexity images.

                I hope the browser can launch different threads to decode different images on the same page / different pages simultaneously, though, since that would be a much more significant limitation if absent. I also hope the decoding thread is just some worker thread and not the main UI / browser-engine thread, which would make EVERYTHING stutter / delay.
                OTOH there are some producers of HUGE images (science imagery data sets, NASA/... composites, geographical data, visualizations, whatever is spitting out >> 8k*8k images), and for those I'd probably want MT + SIMD + GPU-accelerated decode, indeed.

                I wonder whether using the WASM sandbox-related APIs directly is nice for implementing portable decoders (e.g. multi-browser, and usable outside browsers), or whether some kind of wrapper API layer / "design pattern" above the WASM sandbox API is a better idea, to make the API adaptations and abstractions suit different sandboxes, browser / non-browser cases, and different higher-layer client API interfaces. Codecs should be as portable / plug-and-play as possible, without too much painful glue logic to integrate them.
                When it comes to WASM, both libjxl and jxl-oxide can be compiled to WASM. Multithreading can be a bit of a pain, so a lot of WASM stuff is single-threaded anyway. jxl-oxide has recently had a lot of perf work done, so it is something I want to test, but my JavaScript knowledge is rusty and out of date, so it's not a high priority for me. You do want multithreaded decode for JXL, since image sequences and large images can hurt quite a bit. But yes, the vast majority of JXL images I have, even the 23164x18480 Wind Waker test image, decode fine (albeit slowly) (you can find this image by searching "windwaker 35x dolphin").
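
                One concrete reason WASM multithreading is painful: threads need SharedArrayBuffer, which browsers only enable on cross-origin-isolated pages, so the serving site has to opt in with COOP/COEP headers. A minimal runtime check (TypeScript; the headers in the comment are the real requirements, the warning text is just illustrative):

                Code:
                // WASM threads require SharedArrayBuffer, which is only available when the
                // page is served with:
                //   Cross-Origin-Opener-Policy: same-origin
                //   Cross-Origin-Embedder-Policy: require-corp
                if (!self.crossOriginIsolated) {
                  console.warn('Not cross-origin isolated; WASM decoder will run single-threaded');
                }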



                • #28
                  Originally posted by Danny3 View Post
                  That's OK, unlike the shitty response they gave when Google removed JPEG-XL!

                  But it would be nice if they instead focused on improving support for AV1 and VP9 to the maximum extent possible.

                  And when will they finally start implementing HDR support, so we can see videos on YouTube and other sites properly, like we see them on our TVs? I still can't believe they invested so much in implementing VR support instead of HDR support: VR is used far less and is harder to use, since very few people have VR headsets, compared to the many HDR-capable displays people have.
                  Mozilla probably thought that if VR took off, it would be really bad for them not to have support ready. To be honest, HDR is a gimmick similar to VR and not widely supported either. Your fake-HDR display most likely has its fake HDR disabled by default and you don't even know it; enabling it would result in awful picture quality and you'd hate your life.

                  HDR support in Firefox is not really needed.



                  • #29
                    Originally posted by theuserbl View Post
                    What happens if there are Theora videos on some websites? Then they can no longer be viewed in either browser.
                    Are you actually this pathetic or just pretending to be? If you stumble upon an ancient website with videos that your browser doesn't support natively, just download them to your computer and use some other software for playback. I'm pretty sure WMV, for example, is much more popular than Theora, and no browser ever supported that video format.



                    • #30
                      Originally posted by theuserbl View Post
                      What happens if there are Theora videos on some websites? Then they can no longer be viewed in either browser.
                      Not particularly well-maintained websites, mind you. Ogg/Theora files would be unnecessarily large and make the site slow to load. I find the quoted stats quite convincing.

                      A browser is not a museum. Features that lie effectively dormant are an attack vector and a maintenance burden. By contrast, migrating to a more modern codec is not that hard.

