After half a year, what now?

Wu Yu Wei

At the beginning of this year, I was planning to contribute to Servo and hoped to restore it to its former glory. But after diving into the codebase and understanding its landscape, I accepted the cruel reality that this is impossible unless Servo can be adequately funded again, just as previous Servo members advised. So I will not continue the series of Servo posts. However, I hope this post can lay out the situation the project faces, take stock of what we still have, and look at where we can explore further.

What happened after the Mozilla layoffs?

After the Mozilla layoffs in 2020, most of the Servo team was let go, AFAIK. Despite the project being moved under the Linux Foundation, the foundation only covers its CI/CD costs. So technically speaking, the project was already in maintenance mode at that point. But several parts of Servo had already been integrated into Firefox, and some people at Mozilla still maintain them to this day. I was hoping that if we could fill in the missing pieces of the puzzle, we could get a minimum viable browser or a webview library. However, it didn't go as easily as I thought it would. Some blockers are just too difficult for ordinary contributors to refactor.

Unable to upgrade SpiderMonkey

Servo, as a project born at Mozilla, uses SpiderMonkey as its JavaScript engine, with mozjs and rust-mozjs as Rust wrappers on top of it. Over the last two years, contributors have tried to upgrade it to the latest version, but every attempt failed. Servo still uses version 88 today, while SpiderMonkey has already published 108 at the time of this post. And the engine itself is also a problem, IMHO. Its types, GC, and design affect not only the script layer but also the rest of Servo. Although Rust doesn't require any garbage collection, the JavaScript in the script layer needs it. In order to use SpiderMonkey's GC, Servo has to create a crazy amount of types and lints to control it. This makes handling DOM types extremely verbose, which in turn affects the whole style crate and layout crate, since they all have to deal with these types. I once considered switching to V8 instead, which has a well-maintained crate from Deno. But after giving its API for dealing with GC and creating JavaScript types a try, I concluded that it would fall into the same scenario mozjs is already in. Unless Rust gets a decent crate for GC, I don't believe we can work with a GC language seamlessly. Shifgrethor is the closest we could get, and I also forked it into elise for testing multi-threaded situations. But it relies on several nightly features.
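
To give a rough feel for where that verbosity comes from, here is a minimal sketch of evaluating a script through mozjs. Everything SpiderMonkey's moving GC can see has to be rooted explicitly; Servo wraps this pattern in generated DOM types and custom lints. The exact module paths and signatures differ between mozjs versions, so treat this as an illustration of the rooting ceremony rather than working Servo code.

```rust
use std::ptr;

use mozjs::jsapi::{JSAutoRealm, JS_NewGlobalObject, OnNewGlobalHookOption};
use mozjs::jsval::UndefinedValue;
use mozjs::rooted;
use mozjs::rust::{JSEngine, RealmOptions, Runtime, SIMPLE_GLOBAL_CLASS};

fn main() {
    let engine = JSEngine::init().expect("failed to initialize SpiderMonkey");
    let runtime = Runtime::new(engine.handle());
    let cx = runtime.cx();
    let options = RealmOptions::default();

    // Every GC thing held from Rust must live in a rooted slot so the moving
    // GC can trace (and relocate) it. Servo has to enforce this discipline
    // across every DOM type, which is where most of the boilerplate comes from.
    rooted!(in(cx) let global = unsafe {
        JS_NewGlobalObject(
            cx,
            &SIMPLE_GLOBAL_CLASS,
            ptr::null_mut(),
            OnNewGlobalHookOption::FireOnNewGlobalHook,
            &*options,
        )
    });
    let _realm = JSAutoRealm::new(cx, global.get());

    // Even reading back a script's result needs its own rooted slot.
    rooted!(in(cx) let mut rval = UndefinedValue());
    runtime
        .evaluate_script(global.handle(), "1 + 1", "example.js", 0, rval.handle_mut())
        .expect("script evaluation failed");
    assert_eq!(rval.get().to_int32(), 2);
}
```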

Spaghetti inside script/style/layout

One of the main goals of Servo is to extract each browser feature into an individual crate. This has succeeded for features like HTML parsing, CSS selector matching, rendering, and so on. But some crates are still tangled together and can't be separated: script, style, and layout. We've already seen how the GC types affect all of them. It's also difficult to tell which document traversals should belong to which crate. The worst part is that the old layout engine is somewhat incorrect. Back in the day, the team wanted to make layout as parallel as possible, but that didn't work well for properties like flex and absolute positioning. When they tried to write a new one, the layoffs happened. So now we have two layout crates: the old one is more complete but wrong for specific properties; the new one has almost nothing but should be correct if only it could be completed. If you try to fix a layout issue, you have to jump around these crates again and again. And the number of open issues is already massive!

Crates that still work well

With huge blockers like those mentioned above, I tend to agree that it will require proper funding to start developing again. In the near future, I feel we should focus on the crates that are already mature and explore further on top of them. Take html5ever and cssparser, for example: they have become the bedrock of practically every modern web-related project in Rust. There are also crates like kuchiki and scraper built on top of them that are easier to use. And I hope the same thing can happen to WebRender.
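
As a tiny illustration of how usable that bedrock already is, here is a sketch using the scraper crate, which builds on html5ever and the cssparser/selectors machinery, to parse a fragment and run a CSS selector over it. This is my own example, not code from Servo itself, and any reasonably recent scraper version should behave the same way.

```rust
// Assumes a recent `scraper` crate in Cargo.toml, e.g. scraper = "0.17".
use scraper::{Html, Selector};

fn main() {
    // html5ever does the actual HTML parsing under the hood.
    let fragment = r#"
        <ul class="links">
            <li><a href="https://servo.org">Servo</a></li>
            <li><a href="https://github.com/servo/html5ever">html5ever</a></li>
        </ul>
    "#;
    let document = Html::parse_fragment(fragment);

    // Selector matching comes from the same selectors/cssparser stack Servo uses.
    let selector = Selector::parse("ul.links a").expect("valid CSS selector");

    for element in document.select(&selector) {
        let text: String = element.text().collect();
        let href = element.value().attr("href").unwrap_or("-");
        println!("{text} -> {href}");
    }
}
```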

WebRender is one of the few crates still maintained by Mozilla, and it is used directly by Firefox! There was a GUI crate called Azul that once used WebRender to build its layout system. Although Azul is no longer maintained, due to personal reasons, I feel we could give that approach another shot and see whether an HTML/CSS layout system can work again. I understand it would be very limited without any scripting functionality, but I hope we can try something other than a DOM layer built around a JavaScript engine. Maybe an API for a document tree could work, or perhaps plain XML parsing, as other GUI frameworks do, is enough. I don't have any conclusion yet, but I haven't come this far just to end the journey here either!