Iconic Doom3 Game Now in Browsers with WebAssembly: Q&A with Gabriel Cuvillier


Gabriel Cuvillier, senior software engineer at Continuation Labs, ported the iconic Doom 3 game to browsers with WebAssembly. The 7-week full-time effort illustrated both the current performance potential of WebAssembly and the parts still missing for it to seamlessly run heavyweight desktop applications and games. InfoQ interviewed Cuvillier on the technical challenges encountered and the lessons for developers thinking about porting desktop applications with WebAssembly.

Doom 3 is a horror first-person shooter video game originally released for Microsoft Windows in 2004. Doom 3 utilizes the id Tech 4 game engine, released under the GNU General Public License in 2011. The game was a critical and commercial success, with more than 3.5 million copies of the game sold.

Screenshot from D3wasm WebAssembly port of Doom3

InfoQ: What drove you to port DOOM3 to browsers with WebAssembly?

Gabriel Cuvillier: Since the general availability of the WebAssembly MVP in major browsers two years ago, I have the feeling that a hype cycle has started around the technology: a lot of praise, nice presentations and talks everywhere, and so on. But, in practical terms, apart from some nice small benchmarks and sample demonstrations, there have been very few real-world use cases publicly studied and shown.

So, in order to convince myself that WebAssembly could fulfill its promises, I decided to move things to the next level and port a real program.

I found the Doom 3 game to be an ideal candidate for this: it is a real-world large C++ program, a former successful AAA video game, with open-sourced code (known to be of very good quality), and at the time the game was released - back in 2004 - it was really bleeding-edge technology in terms of game engine and graphics, known to bring a lot of desktop systems to their knees.

Additionally, the game has a very nice characteristic that was a critical point in my decision to focus on this game specifically: the id Tech 4 engine is probably one of the most advanced game engines that can run on a single thread of execution. Or, put differently, while nowadays most engines are designed for multiple CPU core systems, Doom 3 is one of the last “high-end” games designed to run on a single CPU core system. As multithreading is not yet ready on the Web (mostly due to the Spectre/Meltdown security vulnerabilities), this single-threaded characteristic was a mandatory requirement from the very beginning of my project.

So, by the end of 2018, I decided to port Doom 3 to the Web and ultimately convinced myself that WebAssembly is a technology to seriously consider for the next 10 years.

InfoQ: You mentioned that D3 is a very demanding game to run in a browser. What makes it so? What are the performance drivers which are present in native desktop operating systems and missing today in desktop browsers?

Cuvillier: Doom 3 is a very demanding game in browsers because 1) it is an already very demanding game by itself, and 2) it has to run with additional constraints in the browser that are not present in native builds.

Concerning 1), as I said in the previous answer, it is a very demanding game by itself because it provides a good graphics quality level while running on a single thread of execution. This means that the amount of work that has to be computed sequentially while trying to stay at 16ms per frame is simply massive: unified lighting model with dynamic per-pixel lighting and real-time shadows, skeletal animation, rigid body physics, AI, networking, and so on.

In that context, and concerning 2), the first constraint is that WebAssembly is first and foremost a low-level virtual machine, so in the end you have bytecode interpreted by a dedicated program. This can’t match the performance of a general-purpose CPU executing its own native instruction set! Without a Just-In-Time compiler, the resulting performance hit is at least 2x (an empirical number, of course, based on my personal observations; there may be more accurate academic studies on this topic).

The second constraint is trickier to understand: in the context of the Web, and from the point of view of WebAssembly, everything happening in the “outside world” is more or less tied to… Javascript, and ultimately to the browser and its secure sandbox. The “outside world” includes graphics and audio APIs. So, when the Wasm code calls a graphics API, this does not directly call your graphics card driver as it would in a native build. Instead, a small Javascript layer is involved each time to call the correct “Web API”, and then the browser transforms/forwards the Web API call into a graphics driver call, after having done some checks and validations. Escaping the “secure sandbox” comes at a cost.

And all of this introduces a lot of overhead in the end. So if you have a program that heavily stresses both the CPU and the “outside world” (such as making many graphics API calls per frame), you end up in a very demanding situation.
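The glue layer described above can be sketched as follows. This is a simplified illustration, not the actual code Emscripten generates; the function names (`glClear`, `fakeWebGL`) and the validation logic are hypothetical stand-ins for the real wrappers and browser-side checks.

```typescript
// Sketch of the JS glue layer between Wasm and a Web API (hypothetical names).
// Every "driver" call from the module goes through a JS wrapper that runs
// inside the browser sandbox and validates arguments before forwarding,
// so each of the many per-frame calls pays this crossing cost.

let crossings = 0;

// Stand-in for the browser-side WebGL implementation.
const fakeWebGL = {
  clear: (_mask: number): void => {
    /* real driver work would happen here */
  },
};

// The kind of wrapper generated for each imported function.
function glClear(mask: number): void {
  // Sandbox-side checks before the call is forwarded to the "driver".
  if (!Number.isInteger(mask) || mask < 0) {
    throw new Error("invalid bitmask");
  }
  crossings++;
  fakeWebGL.clear(mask);
}

// Import object handed to WebAssembly.instantiate(): the module's
// "env.glClear" import resolves to the JS wrapper, not to the driver.
const importObject = { env: { glClear } };

// Simulate one frame issuing many graphics calls through the glue.
for (let i = 0; i < 1000; i++) {
  importObject.env.glClear(0x4000);
}
console.log(crossings); // 1000 sandbox crossings for a single frame
```

Each crossing is cheap in isolation, but multiplied by thousands of calls per 16ms frame, the overhead becomes visible.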

InfoQ: Are you aware of other desktop games which have been ported to the browser with WebAssembly? Is the performance profile similar to the one you achieved with DOOM3?

Cuvillier: To my knowledge, as of 2019, no other commercial game as demanding as Doom 3 has yet been ported to the Web. There are some nice tech demos, but no full AAA games. Note that I might be wrong, as I did not check the whole Web! But - shameless plug - another nice porting experiment of a full commercial video game is simply the previous game I ported to WebAssembly: Arx Fatalis. This is a 2002 video game, so quite a bit uglier than D3, but nevertheless very interesting. You may test the demo here:

The story behind this port dates back to earlier experiments I did to run native applications using a technology called Portable Native Client (PNaCl), which is more or less one of the precursors to WebAssembly. It worked nicely, but after Google decided to deprecate the technology in favor of WebAssembly, I decided to migrate the port to WebAssembly as well as a way to learn the new technology.

InfoQ: Among the future features coming to WebAssembly in all modern browsers - SIMD support, dynamic linking, 64-bit addressing, OffscreenCanvas - pick one!

Cuvillier: The most important thing to me now for WebAssembly is not on the list :)

It is the ability to suspend/resume the WebAssembly runtime.

The reason is difficult to explain in a short interview, but basically, you have all your typical synchronous C/C++ code embedded in a fully asynchronous environment. And that frequently does not blend well; some usually simple things can now be difficult to achieve, such as doing synchronous I/O operations.

For example, one has to realize that “synchronously reading a big file from a persistent storage” (such as a game asset from disk for example) is something that is no longer possible on the Web. This very simple thing has been taken for granted since the dawn of programming, and now it is gone! Honestly, that hurts.

Well, you do have some ways to handle the issue, but all of them come at a cost: either you artificially “block everything” (including the browser tab), which is not good practice on the Web; or you rely on costly workarounds such as storing the whole filesystem/assets in RAM (such a waste given the size of our persistent storage solutions), or using specific compiler flags with non-trivial requirements on the code and impacts on the compiled binary.

Of course, rewriting the synchronous code to asynchronous code is the best solution, but with huge codebases such as Doom 3, that’s simply not possible in a reasonable time.
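The “store everything in RAM” workaround mentioned above can be sketched like this. This is a minimal illustration of the idea, not Emscripten’s actual in-memory filesystem; the names (`memfs`, `preload`, `readFileSync`) are hypothetical, and the asset data is stubbed in-memory where a real port would fetch it over the network.

```typescript
// Sketch of the "filesystem in RAM" workaround (hypothetical API).
// Assets are loaded asynchronously once at startup; afterwards the engine's
// synchronous fread-style reads can be served from memory without blocking.

const memfs = new Map<string, Uint8Array>();

// Startup phase: asynchronous preload. Stubbed with in-memory data here to
// keep the sketch self-contained; a real port would `await fetch(path)`.
async function preload(assets: Record<string, Uint8Array>): Promise<void> {
  for (const [path, data] of Object.entries(assets)) {
    memfs.set(path, data);
  }
}

// Game-loop phase: a synchronous read, exactly as the engine code expects.
function readFileSync(path: string): Uint8Array {
  const data = memfs.get(path);
  if (data === undefined) throw new Error(`asset not preloaded: ${path}`);
  return data;
}

// Usage: preload everything before the main loop starts. (The stubbed
// preload populates the map synchronously; a real one would be awaited.)
preload({ "base/pak000.pk4": new Uint8Array([1, 2, 3]) });
console.log(readFileSync("base/pak000.pk4").length); // 3
```

The cost is exactly the one described in the interview: the full asset set lives in RAM for the lifetime of the application.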

So, the ability to suspend/resume the WebAssembly runtime would allow re-introducing some of the synchronous I/O paradigms we have been used to for 25+ years (at least, “synchronous” from the point of view of the C/C++ code).

Fortunately, the people behind the Emscripten project have worked on various compiler-side solutions that could address this problem, with funny names such as “Asyncify” and “Emterpreter”. I used them a bit in the Doom 3 port to ease the porting process. The next iteration will probably be called “Bysyncify”, and I can’t wait to test it.
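A toy way to see what such a suspend/resume transform provides: below, a generator plays the role of the suspendable runtime. This is not the real Asyncify mechanism (which rewrites the compiled code itself); the names `engineCode`, `run`, and `loadAsset` are hypothetical.

```typescript
// The "engine" reads a file in an apparently synchronous style; each yield
// is a suspension point where the JS host performs the async work and then
// resumes the "runtime" with the result.

function* engineCode(): Generator<string, string, Uint8Array> {
  // Looks like blocking, synchronous code from the engine's point of view.
  const header = yield "base/header.bin"; // "synchronous" read
  return `header size: ${header.length}`;
}

// Hypothetical async loader standing in for fetch(): here it just returns
// a buffer whose length equals the path length, to keep the sketch runnable.
async function loadAsset(path: string): Promise<Uint8Array> {
  return new Uint8Array(path.length);
}

// The host drives the generator, doing real async I/O between resumes.
async function run(): Promise<string> {
  const gen = engineCode();
  let step = gen.next();
  while (!step.done) {
    const data = await loadAsset(step.value); // suspend here...
    step = gen.next(data); // ...and resume the "runtime" with the result
  }
  return step.value;
}

run().then((result) => console.log(result)); // "header size: 15"
```

Asyncify performs an analogous transformation on the compiled WebAssembly: it instruments the code so the whole call stack can be unwound at a “blocking” call and rewound once the asynchronous browser operation completes, at some cost in binary size and speed.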

Well, if I really had to pick one item from your list, I’d take OffscreenCanvas. This is not related to WebAssembly itself, though, but rather to the browser implementations, not all of which support this feature yet (Safari in particular). But the truth is that this “pick” is related to the previous point: OffscreenCanvas allows the main WebAssembly code of a graphical application to run in a Web Worker, and in Web Workers the synchronous/asynchronous issues are far less relevant.

InfoQ: You spent 6-7 weeks working full-time on the port, and described the task as fairly complex. In your opinion, should developers start experimenting with porting major applications to the browser, or is it wiser to wait for the technology to mature?

Cuvillier: I find the technology to be very mature by itself, and I am quite impressed by it. For sure, if one needs a CPU-intensive application available on the Web, there is absolutely no reason not to start working on it using WebAssembly. It could be video games, CAD software, or Big Data, for example. I am currently working on a long-term project to illustrate the usage of WebAssembly in 3D CAD software, and I recently did some contract work on an upcoming product related to the last example mentioned. While I can’t disclose more for now, be sure that 2020 will have a bunch of nice Wasm-related surprises :)

Additionally, the people working on the WebAssembly ecosystem - be it on the compilers, Wasm runtimes, or compilation environments - are very skilled in their respective fields. You can expect quite a good software platform in the future.

A word of caution though: the Web with WebAssembly is a difficult-to-learn technology stack, with rough edges, and still quite a fast-moving target. This last point may explain why it is still difficult to find an up-to-date, in-depth book about the platform. A lot can actually be done, but only with sufficient money/time/knowledge resources. But those who are ready to put energy into this surely won’t be disappointed!
