
Cross-Platform Augmented Reality Apps with Unity AR Foundation


Unity, maker of the eponymous game engine, continues to advance its AR Foundation project, which aims to make it easier for developers to create AR apps that run on both iOS and Android. Its latest release adds support for ARKit's ARWorldMap and Unity's Lightweight Render Pipeline.

AR Foundation exposes a common API that aims to cover the core functionality of both Android ARCore and iOS ARKit, making it possible to create AR apps for both platforms from a single code base. After supporting a number of basic AR features in its first release, including plane detection, device position and orientation tracking, and light estimation, Unity is now adding more advanced features to its offering.
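To illustrate the cross-platform API, the following sketch shows a Unity script that reacts to plane detection through AR Foundation's `ARPlaneManager`; the same component works on top of ARCore and ARKit. This is a minimal, hypothetical example, and event and property names follow later AR Foundation releases, so they may differ in the version covered here:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach alongside an ARPlaneManager (e.g. on the AR Session Origin).
// Logs every newly detected plane, regardless of whether the underlying
// provider is ARCore (Android) or ARKit (iOS).
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void Awake()
    {
        planeManager = GetComponent<ARPlaneManager>();
    }

    void OnEnable()  { planeManager.planesChanged += OnPlanesChanged; }
    void OnDisable() { planeManager.planesChanged -= OnPlanesChanged; }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args.added, args.updated, and args.removed list the planes
        // whose state changed this frame.
        foreach (var plane in args.added)
            Debug.Log($"Detected plane at {plane.transform.position}");
    }
}
```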

One of those is support for ARKit's ARWorldMap, which enables the creation of shared or persistent experiences. A shared experience allows multiple users to see and interact with the same AR scene on different devices at the same time, with each user seeing the common virtual environment from their own perspective. ARWorldMap also makes it possible to create persistent AR experiences that can be stored and recreated at some later point in time. Another ARKit feature now supported by AR Foundation is face tracking, which makes it possible to track the movement and expressions of the user's face.
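Saving a world map for a persistent experience looks roughly like the sketch below, modeled on Unity's AR Foundation samples. The API names (`ARKitSessionSubsystem`, `GetARWorldMapAsync`) come from the ARKit-specific extensions and may vary between plugin versions; the `arSession` field and the handling of the serialized bytes are assumptions for illustration:

```csharp
using System.Collections;
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
#if UNITY_IOS
using UnityEngine.XR.ARKit;
#endif

public class WorldMapSaver : MonoBehaviour
{
    [SerializeField] ARSession arSession; // assigned in the Inspector

#if UNITY_IOS
    public IEnumerator SaveWorldMap()
    {
        // ARWorldMap is ARKit-only, so we go through the ARKit subsystem.
        var sessionSubsystem = (ARKitSessionSubsystem)arSession.subsystem;
        var request = sessionSubsystem.GetARWorldMapAsync();
        while (!request.status.IsDone())
            yield return null;

        if (request.status.IsError())
        {
            Debug.LogError($"World map request failed: {request.status}");
            yield break;
        }

        using (var worldMap = request.GetWorldMap())
        {
            // Serialize to bytes; these could be written to disk or sent
            // over the network to share the session with another device.
            NativeArray<byte> bytes = worldMap.Serialize(Allocator.Temp);
            Debug.Log($"Serialized world map: {bytes.Length} bytes");
            bytes.Dispose();
        }
        request.Dispose();
    }
#endif
}
```

A saved map can later be restored on the same or another device, which is what enables both persistence and multi-user sessions.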

It is worth noting that both world map and face tracking support are, for the time being, exclusive to ARKit. Unity plans to add support for the equivalent ARCore features in the future, but has not announced a detailed timeline. You can see the list of currently supported features in the image below.

A new feature supported on both iOS and Android is Unity's Lightweight Render Pipeline. It enables developers to create shaders using Unity's Shader Graph, a visual editor for shaders, and then use them in AR apps.

A couple of other features Unity is working on for AR Foundation are remoting, the ability to stream sensor data from a mobile device to a desktop computer in order to speed up development, and in-editor simulation, which aims to enable testing without a physical device. Both features are scheduled for release during 2019.
