
Google Unveils AppFunctions to Connect AI Agents and Android Apps


In a move to transform Android into an "agent-first" OS, Google has introduced new early beta features to support a task-centric model in which apps provide functional building blocks users leverage through AI agents or assistants to fulfill their goals.

The foundation for this new model is provided by AppFunctions, a Jetpack API that allows developers to expose self-describing capabilities within their apps for seamless integration with AI agents. By running on-device, these interactions offer improved privacy and faster performance by minimizing network latency.
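The announcement does not include code, but the "self-describing capability" idea can be sketched with a small toy model: an app registers a function together with a machine-readable description, and an agent discovers and invokes it locally. This is not the real androidx.appfunctions API; all names here (`FunctionSpec`, `AppFunctionRegistry`, `findPhotos`) are illustrative.

```kotlin
// Toy model of the AppFunctions pattern: apps expose self-describing
// capabilities that an on-device agent can discover and invoke.
// Illustrative only -- not the actual Jetpack AppFunctions API surface.

data class FunctionSpec(
    val name: String,
    val description: String,
    val parameters: Map<String, String>  // parameter name -> type hint for the agent
)

class AppFunctionRegistry {
    private val handlers =
        mutableMapOf<String, Pair<FunctionSpec, (Map<String, String>) -> List<String>>>()

    // An app registers a capability together with its self-describing spec.
    fun register(spec: FunctionSpec, handler: (Map<String, String>) -> List<String>) {
        handlers[spec.name] = spec to handler
    }

    // The agent reads the specs to match a user request to a capability.
    fun describe(): List<FunctionSpec> = handlers.values.map { it.first }

    // The agent invokes the chosen function locally, with structured arguments.
    fun invoke(name: String, args: Map<String, String>): List<String> =
        handlers[name]?.second?.invoke(args) ?: error("Unknown function: $name")
}
```

A gallery app might register a `findPhotos` spec, and the agent, after matching "pictures of my cat" to that spec, would call `invoke("findPhotos", mapOf("query" to "cat"))` entirely on-device.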

Mirroring how backend capabilities are declared via MCP cloud servers, AppFunctions provides an on-device solution for Android apps. Much like WebMCP, it executes these functions locally on the device rather than on a server.

For example, a user might ask Gemini Assistant to "Show me pictures of my cat from Samsung Gallery". The assistant would interpret the user's request, retrieve the relevant photos, and present them in its own interface. Those images can then persist in context, allowing the user to reference them in follow-up requests, such as editing, sharing, or taking further action.

As not all apps will support AppFunctions, especially at this early stage, Google has also introduced a UI automation platform in Android that provides a fallback when apps aren't integrated. This automation layer makes it possible for users to "place a complex pizza order for their family members with particular tastes, coordinate a multi-stop rideshare with co-workers, or reorder their last grocery purchase" all through the Gemini Assistant without additional developer effort.

In this case the platform does the heavy lifting: developers gain agentic reach with zero code, without committing to a major engineering effort up front.

In its announcement, Google emphasized that privacy and user control are central to the design of AppFunctions. All interactions are built for on-device execution with full user visibility through live view and/or notifications, the ability to manually override the agent's behavior, and mandatory confirmation required for sensitive actions such as purchases.

As noted, AppFunctions and the UI automation platform are still in early beta, currently available on the Galaxy S26 series, with a wider rollout of these features planned for Android 17.
