
Google Unveils New Tools and Libraries for Building AI Glasses Apps


With the release of the Android XR SDK Developer Preview 3, Google has introduced two new libraries to help developers create AI Glasses experiences, Jetpack Projected and Jetpack Compose Glimmer. ARCore for Jetpack XR has also been expanded to work with AI Glasses, adding motion tracking and geospatial capabilities.

The new libraries introduced with Android XR SDK Developer Preview 3 allow developers to extend existing mobile apps to interact with AI Glasses by leveraging their built-in speakers, camera, and microphone, and by presenting information through the glasses' display, where available.

There are many scenarios where your app might want to use AI glasses hardware. For example, a video conferencing app could add a UI control that allows the user to switch their video stream from the phone's camera to the AI glasses' camera, offering a first-person point of view.

The first library, Jetpack Projected, enables a host device, such as an Android phone, to project an app's XR experience to AI Glasses using audio and/or video. The library allows apps to check whether the target device has a display and wait for it to become available for use. Before an app can access device hardware, it must request permission at runtime in accordance with the standard Android permission model.
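The flow that paragraph describes — request the runtime permission, check whether the target device has a display, then wait for it to become available — might look roughly like the sketch below. Since Jetpack Projected is still in developer preview, the names `ProjectedContext`, `hasDisplay`, and `awaitDisplay` are illustrative placeholders, not the published API; only the runtime-permission flow uses standard, stable Android APIs.

```kotlin
import android.Manifest
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts
import androidx.lifecycle.lifecycleScope
import kotlinx.coroutines.launch

class GlassesAwareActivity : ComponentActivity() {

    // Standard Android runtime-permission flow: hardware access on the
    // glasses is gated behind the same permission model as on the phone.
    private val requestCamera =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) connectToGlasses()
        }

    fun startProjection() = requestCamera.launch(Manifest.permission.CAMERA)

    private fun connectToGlasses() = lifecycleScope.launch {
        // Hypothetical Jetpack Projected calls (placeholder names): obtain a
        // projected context, check whether the target device has a display,
        // and suspend until that display becomes available for use.
        val projected = ProjectedContext.create(this@GlassesAwareActivity)
        if (projected.hasDisplay) {
            projected.awaitDisplay()
        }
    }
}
```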

You can access AI Glasses hardware from both an AI Glasses activity and a standard app, provided you get a valid projected context. Audio support is straightforward, as the AI Glasses audio device behaves as a standard Bluetooth audio device.

Capturing a photo or video with the glasses' camera is a bit more complex, since it requires instantiating several classes to check hardware availability, set up the camera, and bind the activity lifecycle to it so that it opens and closes with the activity state.
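A rough sketch of those steps, assuming a CameraX-style provider API, could look as follows. The `GlassesCameraProvider` name and its methods are invented for illustration; only the lifecycle-binding pattern, in which the camera follows the activity state, is taken from the article's description.

```kotlin
import androidx.activity.ComponentActivity
import androidx.lifecycle.lifecycleScope
import kotlinx.coroutines.launch

class CaptureActivity : ComponentActivity() {

    private fun startGlassesCapture() = lifecycleScope.launch {
        // 1. Check that the glasses' camera hardware is present and available
        //    (GlassesCameraProvider is a hypothetical placeholder name).
        val provider = GlassesCameraProvider.getInstance(this@CaptureActivity)
        if (!provider.isCameraAvailable()) return@launch

        // 2. Set up the capture use case.
        val capture = provider.buildImageCapture()

        // 3. Bind the use case to the activity lifecycle so the camera
        //    opens and closes with the activity state.
        provider.bindToLifecycle(this@CaptureActivity, capture)
    }
}
```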

Jetpack Compose Glimmer, on the other hand, is a set of UI components and a visual language for creating augmented experiences on AI glasses equipped with a display. The new visual language uses optical see-through to blend visuals with the environment, focusing on clarity, legibility, and minimal distraction. Supported components include text, icons, title chips, cards, lists, and buttons. All components are built on the underlying concept of a surface, which developers can access to create non-standard components.

Glimmer components can be customized using modifiers to adjust layout, appearance, and behavior and can be stacked along the z-axis to provide a sense of depth through the use of shadows. Google also introduced an AI Glasses emulator in Android Studio for UI preview and to simulate user interactions, including touchpad and voice input.
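Putting the two preceding paragraphs together, a Glimmer screen might be sketched like this. The composable names (`Card`, `TitleChip`, `Text`) mirror the component list above but are not verified Glimmer signatures, and the `depth` modifier is a hypothetical stand-in for the z-axis stacking the library supports.

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier

// Sketch only: component and modifier names are illustrative assumptions
// based on the components the preview describes, not the published API.
@Composable
fun NavigationGlance() {
    // Stacked along the z-axis: shadows convey a sense of depth.
    Card(modifier = Modifier.depth(1)) {
        TitleChip(text = "Next turn")
        Text("Turn left onto Main St in 200 m")
    }
}
```

The AI Glasses emulator in Android Studio would then be the natural place to preview such a layout and simulate touchpad and voice input without physical hardware.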

As a final note on the latest Android XR SDK version, Google has expanded ARCore for Jetpack XR, a set of APIs to create augmented experiences that include the possibility of retrieving planar data, anchoring content to a fixed location in space, and more. The latest version adds support for motion tracking, so that glasses respond to user movements, and geospatial pose, allowing content to be anchored to locations covered by Google Street View.

Android XR SDK Preview 3 is available in Android Studio Canary after upgrading to the latest emulator version (36.4.3 Canary or later).
