Servo Driving Modularity To Support Different JavaScript Engines
Originally posted by bug77: JS can run wild and then tell the layout engine what to do. But that doesn't mean the layout engine needs to speak JS; they could be decoupled via some events. Yes, I get it, events would be async, and they would need to be standardized so the JS code can talk to various layout engines transparently. Oh well...
That type of code is pretty old in a lot of browser projects, too, so a lot of the time it was just them trying to get things working back in the '00s, and it hasn't been revisited since, other than to add new features on top of what's already there.
Last edited by smitty3268; 16 April 2024, 02:16 AM.
Originally posted by FireBurn: Grr, here was me thinking that the whole of Servo was going to be Rust-based; now it'll just be bindings to V8. Do they want Firefox/SpiderMonkey to fail, or are they just planning for it?
Modularity is always a good thing: SpiderMonkey has issues, V8 has issues. Having the flexibility to swap between them is a great thing.
Originally posted by Quackdoc: Modularity is always a good thing, SpiderMonkey has issues, V8 has issues. Having the flexibility to swap between them is a great thing.
- at compile time
- at package time
- as a library at start time
- dynamically, as an extension at runtime
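The levels in that list mostly come down to where the engine choice is bound. As a minimal sketch, a common interface can be selected behind a Cargo feature at compile/package time, or behind a trait object at start time. The trait name `JsEngine` and the stub engine types below are invented for illustration; Servo's actual abstraction layer differs.

```rust
/// Hypothetical common interface every script engine backend would implement.
trait JsEngine {
    fn name(&self) -> &'static str;
    fn eval(&self, source: &str) -> String;
}

// Stub backends standing in for real engine bindings.
struct SpiderMonkey;
struct V8;

impl JsEngine for SpiderMonkey {
    fn name(&self) -> &'static str { "SpiderMonkey" }
    fn eval(&self, source: &str) -> String {
        format!("[sm] evaluated {} bytes", source.len())
    }
}

impl JsEngine for V8 {
    fn name(&self) -> &'static str { "V8" }
    fn eval(&self, source: &str) -> String {
        format!("[v8] evaluated {} bytes", source.len())
    }
}

/// Start-time selection: a trait object picked from configuration.
/// At compile/package time the same trait could instead sit behind
/// `#[cfg(feature = "v8")]` so that only one backend is built in.
fn engine_from_config(name: &str) -> Box<dyn JsEngine> {
    match name {
        "v8" => Box::new(V8),
        _ => Box::new(SpiderMonkey),
    }
}

fn main() {
    let engine = engine_from_config("v8");
    println!("{}: {}", engine.name(), engine.eval("1 + 1"));
}
```

Dynamic loading as an extension at runtime would need the same trait plus a stable ABI boundary (e.g. `dlopen` and a C-compatible vtable), which is considerably more work than the first three options.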
Originally posted by bug77: But that would only require supporting the small subset of JS that handles the registering (if that). JS can run wild and then tell the layout engine what to do. But that doesn't mean the layout engine needs to speak JS; they could be decoupled via some events. Yes, I get it, events would be async, and they would need to be standardized so the JS code can talk to various layout engines transparently. Oh well...
When it comes to APIs, there's also just not a lot you can leave out; you'd be shocked how many affect the layout of a site. Even leaving out something as mundane as the History API can break sites that use it to detect templates to be loaded, and I've seen portfolio sites where the layout responds to a background video; even YouTube runs full-on shaders for its ambience effect. There's just no minimal implementation that wouldn't break a huge number of websites, especially the juggernauts, which would make it pointless for testing, because complex sites are what you need to test the most. About the best you can do is drop the APIs that require explicit permissions and just have them always return denied, but at that point you've already written 95% of a full ES6 implementation.
It seems silly that something so atrociously complex is part of a relatively simple markup standard, but that's the messy evolution of the internet fer ya.
Last edited by Kver; 16 April 2024, 11:46 AM.
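For what bug77's event-based decoupling might look like in the abstract: the script side only emits standardized events over a channel, and the layout side consumes them without ever linking against a JS engine. The event names and fields below are invented purely to illustrate the shape of such a boundary.

```rust
use std::sync::mpsc;

// Hypothetical standardized event schema between a script engine and a
// layout engine. Any engine that can produce these events would work.
#[derive(Debug, PartialEq)]
enum LayoutEvent {
    SetStyle { node: u32, property: String, value: String },
    Reflow,
}

/// The layout side: drains the event queue without knowing anything
/// about which JS engine (if any) produced the events.
fn apply_events(events: mpsc::Receiver<LayoutEvent>) -> usize {
    let mut applied = 0;
    for event in events {
        match event {
            LayoutEvent::SetStyle { node, property, value } => {
                // A real layout engine would mutate its tree here.
                let _ = (node, property, value);
                applied += 1;
            }
            LayoutEvent::Reflow => applied += 1,
        }
    }
    applied
}

fn main() {
    let (tx, rx) = mpsc::channel();
    // The "script engine" side only needs to know the event schema.
    tx.send(LayoutEvent::SetStyle {
        node: 1,
        property: "color".into(),
        value: "red".into(),
    }).unwrap();
    tx.send(LayoutEvent::Reflow).unwrap();
    drop(tx); // close the channel so the consumer loop terminates
    println!("applied {} events", apply_events(rx));
}
```

Kver's objection still applies: the hard part is not this plumbing but standardizing a schema rich enough to cover the enormous API surface that can influence layout.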
Originally posted by Mathias: I wonder if this could be used as a guide on how to port V8 to Firefox.
I'm not saying it should be done. But it is *interesting* how this small non-profit tries to compete with the biggest players while all the others gave up on their own development... I can totally see them switching to V8 or Blink or both for financial reasons...
Originally posted by Kver: The problem with JS is that the majority of the engine complexity is owed to the language itself, not the APIs. Those are actually the easiest bits to implement. The problem is that JavaScript builds classes via prototyping (the ability to modify an existing class at runtime) and everything is weakly typed. These two aspects combined make writing compilers a living hell, and even if you want just the bare basics, it's the core nature of the language itself that's a pain in the ass... It's why V8 has such a radical JIT compilation pipeline, with something like a dozen internal sub-compilers. You could make a minimal engine that runs interpreted, but the performance would be so atrocious it would make no sense to use.
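Kver's point about runtime class mutation is why engines lean on shape-based caching: they speculate that an object's layout is stable and cache lookups against it, with a guard and an invalidation path for when prototypes change underneath them. The following is a toy sketch of that idea, not V8's actual design; the names `JsObject`, `InlineCache`, and `shape_id` are invented here.

```rust
use std::collections::HashMap;

/// Toy dynamic object: `shape_id` is bumped whenever its layout or
/// prototype chain changes, invalidating any cached speculation.
struct JsObject {
    shape_id: u64,
    props: HashMap<String, f64>,
}

/// Toy inline cache: remembers the shape it last saw so repeated
/// lookups can take a fast path while the speculation holds.
struct InlineCache {
    cached_shape: u64,
    hits: u32,
    misses: u32,
}

impl InlineCache {
    fn get(&mut self, obj: &JsObject, key: &str) -> Option<f64> {
        if obj.shape_id == self.cached_shape {
            self.hits += 1; // fast path: the shape guard held
        } else {
            self.misses += 1; // slow path: re-learn the new shape
            self.cached_shape = obj.shape_id;
        }
        obj.props.get(key).copied()
    }
}

fn main() {
    let mut obj = JsObject { shape_id: 1, props: HashMap::new() };
    obj.props.insert("x".into(), 42.0);
    let mut ic = InlineCache { cached_shape: 1, hits: 0, misses: 0 };

    let _ = ic.get(&obj, "x"); // speculation holds
    obj.shape_id = 2;          // runtime prototype/layout mutation
    let _ = ic.get(&obj, "x"); // guard fails, cache must recover

    println!("hits={} misses={}", ic.hits, ic.misses);
}
```

Multiply this pattern across every property access, call site, and arithmetic operation and you get a sense of why a fast JS engine accumulates so much machinery.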
Originally posted by ahrs: It's fine for Servo to do that, but there is no point in Firefox doing it, in my opinion. It has the effect of making web standards irrelevant and ruins one of the selling points of Firefox. It is one of the few remaining independent web browsers on the planet; it is not just Chromium's Blink and V8 with a different coat of paint. If V8 is somehow better, then we should improve SpiderMonkey.
On the other hand, Mozilla could spend the money on other things, the reasons I use Firefox, like the privacy features.
The only problem I see with an all-Blink/V8 solution is if Google removes useful stuff or makes things really hard, as with Manifest V3 extensions. If the rendering and JavaScript engines are modular enough, that shouldn't be a big problem. Though at some point it might be easier to roll your own engine than to backport stuff to another project.
Originally posted by timofonic: What about making a JavaScript engine in Rust too? JavaScript is often the cause of many security issues...
While a non-JIT JS engine written in Rust would probably be quite secure, JIT means dynamically generating x86 (etc.) assembly code from JS code and then executing it. This assembly is highly optimized through a lot of assumptions/predictions about the JS code that aren't universally valid, and that no sane compiler developer would build into a compiler for languages like C/C++/Rust. Part of the reason this works is that, for the weird corner cases where the assumptions don't hold, the generated assembly can be made to fall back to the slower non-JIT interpreter.
The end result is that there have been many logic bugs in JavaScript engines' optimization passes that result in unsafe assembly code, memory corruption, and JS-engine sandbox escapes. Rust's compiler can protect against that for Rust code compiled up front, but it can't provide any assurances for arbitrary JIT'd code that your code generates later. There also isn't, in general, a lot you can do to prevent these kinds of logic bugs, even at a theoretical level; it took a long time for just the theory behind Rust's borrow checker to emerge.
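The guard-plus-bailout pattern described above can be sketched in safe Rust. A "JIT'd" fast path speculates that both operands are integers; when the type guard fails, it deoptimizes to the generic interpreter path. All names here (`Value`, `add_speculative`, etc.) are invented for illustration. The key point survives the simplification: in a real engine the fast path is machine code generated at runtime, so a buggy or missing guard sits outside anything Rust's compiler can check.

```rust
/// Toy dynamic value, standing in for a JS engine's boxed values.
#[derive(Debug, PartialEq)]
enum Value {
    Int(i64),
    Str(String),
}

/// Generic "interpreter" path: handles every case, slowly.
fn add_generic(a: &Value, b: &Value) -> Value {
    match (a, b) {
        (Value::Int(x), Value::Int(y)) => Value::Int(x + y),
        // Crude stand-in for JS's string-concatenation fallback.
        _ => Value::Str(format!("{:?}{:?}", a, b)),
    }
}

/// "JIT'd" path: speculates that both operands are integers, protected
/// by a type guard. `None` models a deopt back to the interpreter.
fn add_speculative(a: &Value, b: &Value) -> Option<Value> {
    if let (Value::Int(x), Value::Int(y)) = (a, b) {
        Some(Value::Int(x + y)) // guard held: take the fast path
    } else {
        None // guard failed: bail out
    }
}

/// Dispatcher: try the speculative path, fall back on deopt.
fn add(a: &Value, b: &Value) -> Value {
    add_speculative(a, b).unwrap_or_else(|| add_generic(a, b))
}

fn main() {
    println!("{:?}", add(&Value::Int(2), &Value::Int(3)));           // fast path
    println!("{:?}", add(&Value::Int(2), &Value::Str("a".into()))); // deopt
}
```

The engine bugs mentioned above are typically the inverse of this sketch: an optimization pass wrongly concludes a guard is unnecessary and emits the fast path unguarded, so attacker-controlled values reach code that assumed a different type.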