Servo Driving Modularity To Support Different JavaScript Engines


  • #21
    Originally posted by bug77 View Post
    This is probably a stupid question, but why does a layout engine need to integrate with a JS engine in the first place? I know there's Ajax, but even that is just a collection of events.
    Probably because JS can alter the DOM and CSS - adding new elements to the original HTML, changing styles, and so on.



    • #22
      Originally posted by bug77 View Post
      JS can run wild and then tell the layout engine what to do. But that doesn't mean the layout engine needs to speak JS, they could be decoupled via some events. Yes, I get it, events would be async, they would need to be standardized, so the JS code can talk to various layout engines transparently. Oh well...
      I think, historically speaking, browsers tended to focus a fair amount on the performance of DOM manipulation, which is typically easier to achieve with tighter integration. Modularity wasn't particularly important, because each browser just used its own JS engine, and that wasn't something that got replaced frequently.

      That type of code is pretty old in a lot of browser projects, too, so a lot of the time it was just them trying to get things working back in the '00s, and it hasn't been revisited since, other than to add new features on top of what's already there.
      Last edited by smitty3268; 16 April 2024, 02:16 AM.
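The decoupling idea quoted above can be sketched as a one-way event queue. This is purely a hypothetical illustration, not how Servo or any real browser is structured; the names (`DomEvent`, `script_mutates`, `layout_drain`) are invented. It also shows where the cost hides: writes decouple cleanly, but any synchronous read the script performs would need a round-trip, which is part of why engines historically preferred tight integration.

```rust
use std::sync::mpsc::{channel, Receiver, Sender};

// A couple of invented mutation events the script side could emit.
#[derive(Debug, Clone, PartialEq)]
enum DomEvent {
    SetText { node_id: u32, text: String },
    SetAttr { node_id: u32, name: String, value: String },
}

// The script side only holds a Sender, so it never touches layout state directly.
fn script_mutates(tx: &Sender<DomEvent>) {
    tx.send(DomEvent::SetText { node_id: 7, text: "hello".into() })
        .unwrap();
}

// The layout side drains the queue on its own schedule.
fn layout_drain(rx: &Receiver<DomEvent>) -> Vec<DomEvent> {
    rx.try_iter().collect()
}
```

Mutations flow one way just fine; it's synchronous reads like `offsetWidth` or `getComputedStyle`, which force the script to wait for layout's answer, that make a pure event interface painful.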



      • #23
        Grr, here was me thinking that the whole of Servo was going to be Rust-based; now it'll just be bindings to V8. Do they want Firefox/SpiderMonkey to fail, or are they just planning for it?



        • #24
          Originally posted by FireBurn View Post
          Grr, here was me thinking that the whole of Servo was going to be Rust-based; now it'll just be bindings to V8. Do they want Firefox/SpiderMonkey to fail, or are they just planning for it?
          What? SpiderMonkey isn't Rust in the first place; it's an old codebase. If you want Servo to be Rust-based, this work is absolutely critical, not just preferable. Critical. It is impossible to get what you want unless this work is done. V8 is used as an example because it's the biggest script engine.

          Modularity is always a good thing: SpiderMonkey has issues, V8 has issues, and having the flexibility to swap between them is a great thing.
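A sketch of what "swap between them" could look like at the code level. This is hypothetical: `ScriptEngine`, `eval`, and the mock types are invented for illustration and are not Servo's actual embedding API (the real interface is far larger, covering GC rooting, DOM bindings, and so on).

```rust
// Hypothetical sketch only: ScriptEngine, eval, and the mock engines are
// invented names, not Servo's actual embedding API.
trait ScriptEngine {
    fn name(&self) -> &'static str;
    fn eval(&mut self, source: &str) -> Result<String, String>;
}

struct MockSpiderMonkey;
struct MockV8;

impl ScriptEngine for MockSpiderMonkey {
    fn name(&self) -> &'static str { "SpiderMonkey" }
    fn eval(&mut self, source: &str) -> Result<String, String> {
        Ok(format!("[sm] evaluated {} bytes", source.len()))
    }
}

impl ScriptEngine for MockV8 {
    fn name(&self) -> &'static str { "V8" }
    fn eval(&mut self, source: &str) -> Result<String, String> {
        Ok(format!("[v8] evaluated {} bytes", source.len()))
    }
}

// The layout side programs against the trait alone, so either engine
// (or a future one) can be dropped in without touching layout code.
fn run(engine: &mut dyn ScriptEngine, source: &str) -> String {
    format!("{}: {}", engine.name(), engine.eval(source).unwrap())
}
```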



          • #25
            Originally posted by Quackdoc View Post
            Modularity is always a good thing, spidermonkey has issues, v8 has issues. Having the flexibility to swap between them is a great thing.
            I wonder how flexible this would be:
            • at compile time
            • at package time
            • as a library at start time
            • dynamically, as an extension, at runtime
            With the last option, I don't expect both engines to be loaded at the same time with the ability to switch on the fly.
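Assuming an engine-agnostic trait exists, the first and third options in the list map naturally onto standard Rust mechanisms: a Cargo feature flag for compile-time selection, and a boxed trait object for start-time selection. All names here (`ScriptEngine`, the `v8-engine` feature, `engine_from_config`) are invented for illustration. The last option, loading an engine as a dynamic extension at runtime, is the hardest, since it would need a stable (likely C) ABI across the library boundary.

```rust
// Invented stand-ins; a real engine trait would be much larger.
trait ScriptEngine {
    fn name(&self) -> &'static str;
}
struct SpiderMonkeyStub;
struct V8Stub;
impl ScriptEngine for SpiderMonkeyStub {
    fn name(&self) -> &'static str { "spidermonkey" }
}
impl ScriptEngine for V8Stub {
    fn name(&self) -> &'static str { "v8" }
}

// Compile time: a Cargo feature picks exactly one engine; the other is
// never built, so there is no runtime dispatch cost.
#[cfg(feature = "v8-engine")]
type DefaultEngine = V8Stub;
#[cfg(not(feature = "v8-engine"))]
type DefaultEngine = SpiderMonkeyStub;

// Start time: both engines are compiled in and one is chosen from
// configuration when the browser boots, behind a trait object.
fn engine_from_config(choice: &str) -> Box<dyn ScriptEngine> {
    match choice {
        "v8" => Box::new(V8Stub),
        _ => Box::new(SpiderMonkeyStub),
    }
}
```

"Package time" would just be the compile-time variant decided by whoever builds the distribution package.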



            • #26
              Originally posted by bug77 View Post
              But that would only require supporting the small subset of JS that handles the registering (if that).

              JS can run wild and then tell the layout engine what to do. But that doesn't mean the layout engine needs to speak JS, they could be decoupled via some events. Yes, I get it, events would be async, they would need to be standardized, so the JS code can talk to various layout engines transparently. Oh well...
              The problem with JS is that the majority of the engine complexity is owed to the language itself, not the APIs; those are actually the easiest bits to implement. The problem is that JavaScript builds classes via prototyping - the ability to modify an existing class at runtime - and everything is weakly typed. These two aspects combined make writing compilers a living hell, and even if you only want the bare basics, it's the core nature of the language itself that's a pain in the ass... It's why V8 has such a radical JIT compilation pipeline, with something like a dozen internal sub-compilers. You could make a minimal engine that runs interpreted, but the performance would be so atrocious it would make no sense to use.
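A rough sketch of the guard-plus-deoptimization pattern this paragraph describes, written in Rust purely for illustration. Real engines attach a "shape" (hidden class) to each object; JIT-compiled code is specialized for one shape and bails out to the interpreter when a prototype or property mutation changes it. All names here are invented.

```rust
use std::collections::HashMap;

// Invented stand-ins for a JS engine's internals: every object carries a
// "shape" id (a hidden class); JIT-compiled code is specialized for one shape.
#[derive(Clone, Copy, PartialEq)]
struct ShapeId(u64);

struct JsObject {
    shape: ShapeId,
    props: HashMap<String, f64>,
}

struct Deopt; // the guard failed: the object no longer matches the compiled shape

// The "compiled" fast path: valid only while the shape assumption holds.
fn fast_get_x(obj: &JsObject, expected: ShapeId) -> Result<Option<f64>, Deopt> {
    if obj.shape == expected {
        Ok(obj.props.get("x").copied())
    } else {
        Err(Deopt)
    }
}

// The generic interpreter path: always correct, always slower.
fn slow_get(obj: &JsObject, key: &str) -> Option<f64> {
    obj.props.get(key).copied()
}

// Guard, then fall back: the bail-out dance a prototype mutation forces.
fn get_x(obj: &JsObject, expected: ShapeId) -> Option<f64> {
    match fast_get_x(obj, expected) {
        Ok(v) => v,
        Err(Deopt) => slow_get(obj, "x"),
    }
}
```

Because the script can invalidate the shape assumption at any moment, the engine has to carry both paths plus the guards forever, which is a large part of the complexity being described.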

              When it comes to APIs, there's also just not a lot you can leave out; you'd be shocked how many of them affect the layout of a site. Even leaving out something as mundane as the History API can break sites that use it to decide which templates to load. I've seen portfolio sites where the layout responds to a background video, and even YouTube runs full-on shaders for its ambience effect. There's just no minimal implementation that wouldn't break a huge number of websites, especially the juggernauts, which would make it pointless for testing, because complex sites are what you need to test the most. About the best you can do is drop APIs that require explicit permissions and have them always return denied, but at that point you've already written 95% of a full ES6 (ECMAScript 2015) implementation.

              It seems silly that something so atrociously complex is part of a relatively simple markup standard, but that's the messy evolution of the internet fer ya.
              Last edited by Kver; 16 April 2024, 11:46 AM.



              • #27
                Originally posted by Mathias View Post
                I wonder if this could be used as a guide on how to port V8 to Firefox.
                I'm not saying it should be done. But it is *interesting* how this small non-profit tries to compete with the biggest players, while all others gave up on their own development... I can totally see them switching to V8 or Blink or both for financial reasons...
                It's fine for Servo to do that, but there's no point in Firefox doing it, in my opinion. It would make web standards irrelevant and ruin one of the selling points of Firefox: it is one of the few remaining independent web browsers on the planet, not just Chromium's Blink and V8 with a different coat of paint. If V8 is better somehow, then we should improve SpiderMonkey.



                • #28
                  Originally posted by Kver View Post

                  The problem with JS is that the majority of the engine complexity is owed to the language itself, not the APIs; those are actually the easiest bits to implement. The problem is that JavaScript builds classes via prototyping - the ability to modify an existing class at runtime - and everything is weakly typed. These two aspects combined make writing compilers a living hell, and even if you only want the bare basics, it's the core nature of the language itself that's a pain in the ass... It's why V8 has such a radical JIT compilation pipeline, with something like a dozen internal sub-compilers. You could make a minimal engine that runs interpreted, but the performance would be so atrocious it would make no sense to use.

                  When it comes to APIs, there's also just not a lot you can leave out; you'd be shocked how many of them affect the layout of a site. Even leaving out something as mundane as the History API can break sites that use it to decide which templates to load. I've seen portfolio sites where the layout responds to a background video, and even YouTube runs full-on shaders for its ambience effect. There's just no minimal implementation that wouldn't break a huge number of websites, especially the juggernauts, which would make it pointless for testing, because complex sites are what you need to test the most. About the best you can do is drop APIs that require explicit permissions and have them always return denied, but at that point you've already written 95% of a full ES6 (ECMAScript 2015) implementation.

                  It seems silly that something so atrociously complex is part of a relatively simple markup standard, but that's the messy evolution of the internet fer ya.
                  I believe all that. But I still think there should be a better way.



                  • #29
                    Originally posted by ahrs View Post

                    It's fine for Servo to do that, but there's no point in Firefox doing it, in my opinion. It would make web standards irrelevant and ruin one of the selling points of Firefox: it is one of the few remaining independent web browsers on the planet, not just Chromium's Blink and V8 with a different coat of paint. If V8 is better somehow, then we should improve SpiderMonkey.
                    I really don't understand that argument. Who profits from another rendering/JS engine? Web developers? No. The end user (= me) gets a slower engine. (At least when I see a slow webpage and try it in Chromium, it's much faster; maybe the opposite is sometimes true as well.) It's not like Mozilla can push new web standards by themselves. JPEG XL, anyone?

                    On the other hand, Mozilla could spend the money on other things - the reasons I use Firefox, like the privacy features.

                    The only problem I see with an all-Blink/V8 solution is if Google removes useful stuff or makes things really hard, like the Manifest V3 situation with extensions. If the rendering and JavaScript engines are modular enough, that shouldn't be a big problem, though at some point it might be easier to roll your own engine than to backport stuff to another project.



                    • #30
                      Originally posted by timofonic View Post
                      What about making a JavaScript engine in Rust too? JavaScript is often the cause of many security issues....
                      Writing a JS engine in rust isn't sufficient when JIT is involved.

                      While a Rust non-JIT JS engine would probably be quite secure, JIT means dynamically generating x86 (etc.) assembly code from the JS code and then executing it. This assembly is highly optimized through a lot of assumptions/predictions about the JS code which aren't universally valid, and which no sane compiler developer would include in a compiler for languages like C/C++/Rust. Part of the reason this works is that, for the weird corner cases where the assumptions don't hold, the generated assembly can be made to fall back to the slower non-JIT interpreter.

                      The end result is that there have been many logic bugs in JavaScript engines' optimization passes that result in unsafe assembly code, memory corruption, and JS engine sandbox escapes. Rust's compiler can protect against that for Rust code compiled up front, but it can't provide any assurances about the JIT'd code your code generates later. There also isn't, in general, a lot you can do to prevent these kinds of logic bugs, even at a theoretical level; it took a long time for just the theory behind Rust's borrow checker to emerge.
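To make the point concrete, here is a hypothetical sketch of why the borrow checker stops helping at the JIT boundary. The emitter below is ordinary checked Rust, but the bytes it produces are opaque data to the compiler; actually running them requires an `unsafe` cast (plus mapping the page executable via `mmap`/`VirtualAlloc`, omitted here), and a logic bug in the emitter becomes arbitrary machine-code corruption that no Rust compiler pass can catch. `call_jit` is shown but deliberately never invoked.

```rust
// Emit x86-64 machine code for `mov eax, imm32 ; ret` into a plain byte buffer.
// This part is fully checked by the Rust compiler.
fn emit_return_const(value: i32) -> Vec<u8> {
    let mut code = vec![0xB8]; // mov eax, imm32
    code.extend_from_slice(&value.to_le_bytes());
    code.push(0xC3); // ret
    code
}

// Executing the buffer is where safety ends: the transmute below is outside
// anything the borrow checker can reason about. (Not called in this sketch,
// because the buffer would also need to be mapped executable first.)
#[allow(dead_code)]
unsafe fn call_jit(code: *const u8) -> i32 {
    let f: extern "C" fn() -> i32 = std::mem::transmute(code);
    f()
}
```

If `emit_return_const` had an off-by-one in its encoding, Rust would still compile it happily; the bug would only surface as corrupted execution at runtime, which is exactly the class of JIT bug described above.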

