The SFHTML5 group recently held a meeting discussing HTML5 technologies for creating virtual reality experiences – WebGL, WebVR, Three.js, GLAM – and the current status of their implementation in Firefox and Chrome. The idea is to bring the entire web into the VR experience.
Tony Parisi, co-creator of the VRML and X3D ISO standards, gave an introduction to WebVR, a blend of virtual reality and web technologies. Current VR software is mostly game-like, involving a great deal of work creating and manipulating graphics, and most existing VR apps are built with the Unity engine, which few developers have mastered. Parisi's idea is to level the field and let all developers create VR apps using basic web technologies, namely JavaScript and WebGL. Three.js, a JavaScript library for rendering 2D/3D graphics with WebGL, makes it easier to render VR scenes, the code being 3-10x shorter than the corresponding raw WebGL code. Parisi is also working on a different solution called GLAM, a declarative language for creating 3D web content that should make rendering VR even simpler. Rendering a rotating 3D cube with GLAM requires including the glam.js script, declaring the cube in markup, and styling it with CSS-like rules such as:
#photocube {
image:url(../images/flowers.jpg);
rx:30deg;
}
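The CSS-like rules above target an element declared in GLAM's markup. Based on the GLAM project's published examples, the cube itself would be declared roughly like this (a hedged sketch; the exact element vocabulary is the GLAM project's, not confirmed by the session):

```html
<!-- Hedged sketch of GLAM markup: a scene containing the cube that the
     #photocube rules above style and rotate -->
<glam>
  <scene>
    <cube id="photocube"></cube>
  </scene>
</glam>
```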
Parisi mentioned that Chrome (an experimental build from a Blink source code branch) and Firefox (nightly) have begun implementing a head-tracking WebVR API that can track the movement of VR devices (Oculus Rift for now, with other devices to be supported later) and supports stereoscopic 3D rendering of WebGL/CSS3 content. A simpler alternative that already works is to combine Google Cardboard, a smartphone, and a VR app such as the Cardboard app, which uses the smartphone's camera and motion sensors to generate a moving stereoscopic image in the phone's browser with WebGL.
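In the early API shipped in those builds, the head-mounted display and its position sensor were enumerated as separate devices that an app had to pair up. The sketch below illustrates that pairing logic with simplified mock device objects (an assumption: in the actual nightlies, a promise from navigator.getVRDevices() resolved to HMDVRDevice and PositionSensorVRDevice instances sharing a hardwareUnitId):

```javascript
// Hedged sketch: pair a head-mounted display with the position sensor
// belonging to the same physical unit. The `type` field is a stand-in
// for the instanceof checks a real app would do on the device objects.
function pairHmdWithSensor(devices) {
  var hmd = devices.find(function (d) { return d.type === 'hmd'; });
  if (!hmd) return null; // no headset connected

  // The matching motion sensor reports the same hardwareUnitId.
  var sensor = devices.find(function (d) {
    return d.type === 'sensor' && d.hardwareUnitId === hmd.hardwareUnitId;
  });
  return { hmd: hmd, sensor: sensor };
}
```

An app would then poll the sensor each frame for the latest head orientation and feed it into its camera transform.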
During his session, UI/UX design for WebVR, Josh Carpenter, UX Lead Designer for Firefox OS at Mozilla, demonstrated using Oculus Rift with Firefox and outlined some of the characteristics of an open “webby” virtual reality experience that he would like to see happening:
- The ability to seamlessly and safely navigate from place to place via links. This includes controls for navigating links, zooming, scrolling, going back, entering information in a field, etc. In short, the ability to have a browser-like experience with the VR headset on.
- Ease of development. For this purpose, new tools need to be created. The JavaScript-Three.js-WebVR-WebGL-browser toolchain is in its infancy.
- VR experiences that work on any device.
- The ability to automatically convert the world's websites to VR, so that one could navigate to an old-fashioned website and still get a 3D VR experience.
- The commoditization of VR: more devices, embeddability, and better performance.
Carpenter also demonstrated early work on converting regular websites to VR without requiring sites to update their content, and showed a number of transitions to use when moving from one VR site to another. Carpenter sees a lot of potential in designing a VR browser, because it offers a 360° canvas with spatial animations.
In his session, Web Browser VR Internals, Brandon Jones, a WebGL and WebVR developer at Google, provided code samples for a VR application, explaining what it takes to render a VR scene. While VR rendering seems daunting at first sight, Three.js makes it much simpler, as shown in the following code snippet:
<script src="js/effects/VREffect.js"></script>
<script src="js/controls/VRControls.js"></script>

// Normal scene setup, then...
var vrControls = new THREE.VRControls(camera);
var vrEffect = new THREE.VREffect(renderer);

function onEnterVRFullscreen() {
  vrEffect.setFullScreen(true);
}

function onWindowResize() {
  vrEffect.setSize(window.innerWidth, window.innerHeight);
}

function onRequestAnimationFrame() {
  vrControls.update();
  vrEffect.render(scene, camera);
}
Jones admitted, though, that Three.js does not yet treat VR as a first-class citizen and is not optimized for VR rendering, but it does the job. He offered some advice to VR app developers:
- Favor lightweight vertex shaders.
- When hitting performance problems, reduce canvas resolution rather than frame rate.
- Avoid rendering non-view-dependent elements (shadow maps or environment maps) twice.
- Drawing the entire scene for one eye and then the other is heavy on state changes. Depending on the scene, it may be more efficient to draw each object for the left eye, swap the viewport, then draw it for the right eye.
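The last point can be sketched as a render loop that interleaves the two eyes per object instead of making two full scene passes (a hedged illustration: gl, drawObject, and the side-by-side viewport split are stand-ins for an app's actual WebGL context and draw code, not an API Jones presented):

```javascript
// Hedged sketch: for each object, draw the left-eye view, swap the
// viewport to the right half of the canvas, and draw the right-eye view,
// so per-object state (shaders, textures, buffers) is bound only once.
function renderInterleaved(gl, objects, width, height, drawObject) {
  var half = width / 2;
  objects.forEach(function (obj) {
    gl.viewport(0, 0, half, height);    // left half of the canvas
    drawObject(obj, 'left');
    gl.viewport(half, 0, half, height); // right half of the canvas
    drawObject(obj, 'right');
  });
}
```

Whether this wins over two full passes depends on how expensive the scene's state changes are relative to the extra viewport switches, which is why Jones qualified the advice with "depending on the scene."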
Jones also demonstrated rendering a Quake 3 VR scene in Chrome.
Resources: Videos from SFHTML5 Meetup, Getting Started with WebVR API, Mozilla VR – a fully VR-enabled website, Google Cardboard, Chrome VR Experiments, GLAM, Mozilla WebVR Mailing List.