
Ubuntu Embraces Local AI Instead of Cloud-First OS Integration


Canonical has outlined its AI strategy for Ubuntu, describing it as a deliberate departure from the industry trend towards cloud-centric, AI-first operating systems. Instead, the company says, future Ubuntu releases will focus on local intelligence, modular design, and strict user control.

Canonical plans to integrate AI models into its operating systems over the coming year in what Ubuntu software engineer Jon Seager describes as a "focused and principled manner that favours open weight models" aligned with the company's values. He adds that developers will take particular care to avoid AI slop pull requests that "have been flung at open source projects with little care, consideration or thought".

This integration will encompass both implicit and explicit usage of AI. The former enhances existing OS functionality, such as speech-to-text, while the latter adds AI-native, user-facing features and agentic workflows that users interact with directly, including document authoring and automated troubleshooting.

There are certain tasks for which AI tools are a no-brainer. In these cases, AI tools can work autonomously and produce excellent results - particularly where the work is of a mechanical nature and they’re given the right context. In other cases, they struggle.

One central element of Canonical's approach is its reliance on local models and on-device inference, which Seager notes will be a key enabler for many organizations:

Depending on your industry and customer base, there may be limitations on which models and tools can be used (if any at this point) but that’s where access to local, offline inference and bespoke tools for LLMs to call could be invaluable.

To make it easier for Ubuntu users to run local models, the OS will provide inference snaps, a simplified mechanism for installing models optimized for the machine's hardware.

It’s easier to snap install nemotron-3-nano than juggle Ollama, Huggingface and a sea of model quantisations, and the snap will give you the optimised bits for your particular silicon if that silicon company has contributed them.

As with other snaps, inference snaps will be subject to confinement rules, which restrict their access to the user's machine and data.
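
As a rough sketch of what that workflow might look like, assuming inference snaps behave like ordinary snaps, the commands below install and inspect one; the model name is taken from Seager's example, and the actual package names and channels have not yet been published:

# Install a local model packaged as an inference snap (name from Seager's example; real snaps may differ)
sudo snap install nemotron-3-nano

# List the interfaces the snap is permitted to use under confinement
snap connections nemotron-3-nano

# Review the publisher, channel, and confinement mode before trusting it with data
snap info nemotron-3-nano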

Canonical's announcement sparked some discussion online. On Reddit, some commenters described it as a reasonable and sensible position, while others expressed clear distrust of AI integration in Ubuntu and rejected the idea of it becoming a default feature, warning that such a move could prompt them to leave the OS.

Despite these concerns, Seager noted that there will likely not be a way to disable AI globally:

I don’t think we’ll introduce a “global AI killswitch”, mostly because that’s a very complex thing to do “honestly” given how many different ways people consume software on Ubuntu these days.

However, the OS will enable users to remove any feature they do not like simply by uninstalling the corresponding snaps.
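
In practice, that opt-out would presumably be an ordinary snap removal; a hypothetical example, again borrowing the model name from Seager's quote:

# Remove an unwanted inference snap along with its stored data
sudo snap remove --purge nemotron-3-nano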
