Google Cloud has announced it's contributing a gRPC transport package for the Model Context Protocol (MCP), plugging what the company calls a critical gap for organizations that have already standardized on gRPC across their microservices. MCP, Anthropic's protocol that integrates AI agents with external tools and data, is currently gaining significant traction in enterprise environments.
Right now, MCP ships with JSON-RPC over HTTP as its transport. That works well for natural-language payloads, but it creates a real headache for teams already running gRPC everywhere. Their options? Rewrite services to speak MCP's JSON transport, wire up transcoding proxies, or keep two separate implementations running side by side. None of those are great.
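To make the mismatch concrete, this is roughly the shape of an MCP tools/call request on the current JSON-RPC transport; the tool name and arguments are invented for illustration:

```python
import json

# An MCP "tools/call" request as carried over the current JSON-RPC transport.
# The tool name and arguments below are illustrative, not from a real server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Stockholm"},
    },
}

# Every call pays the cost of serializing this envelope to JSON text.
wire_payload = json.dumps(request)
print(wire_payload)
```

A gRPC transport would carry the same call as a binary Protocol Buffers message instead of this text envelope.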
Spotify's already felt this pain firsthand. Stefan Särne, senior staff engineer and tech lead for developer experience there, laid it out in Google's blog post:
Because gRPC is our standard protocol in the backend, we have invested in experimental support for MCP over gRPC internally. And we already see the benefits: ease of use and familiarity for our developers, and reducing the work needed to build MCP servers by using the structure and statically typed APIs.
The push has community backing too. Developers had been making the case since at least April 2025, when a GitHub discussion (#1144) saw practitioners arguing MCP should be built around gRPC from the start, with some already shipping their own gRPC-based MCP servers in the meantime. A GitHub issue (#966) from July 2025 racked up 43 upvotes, with developers citing high overhead from JSON serialization, inefficient long-polling for resource watches, and a lack of type safety in the API contract. The MCP maintainers have since agreed to support pluggable transports in the SDK, and Google plans to contribute and distribute the gRPC transport package itself.
Swapping JSON for Protocol Buffers under the hood could meaningfully cut network bandwidth and CPU overhead. For shops already running gRPC infrastructure, it means AI agents can talk to existing services without bolting on extra translation layers. Protocol Buffers' structured, typed contracts also line up better with how most backend services are already defined.
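As a sketch of what a typed contract buys, a tool call expressed as a Protocol Buffers service might look something like the following. The service and message shapes here are assumptions for illustration, not the actual transport proposal:

```protobuf
syntax = "proto3";

package mcp.example;

import "google/protobuf/struct.proto";

// Hypothetical sketch: a tool-call RPC with a statically typed contract.
// Field names and service shape are illustrative, not the proposed design.
message CallToolRequest {
  string name = 1;                        // tool to invoke
  google.protobuf.Struct arguments = 2;   // structured arguments
}

message CallToolResponse {
  repeated string content = 1;            // tool output
  bool is_error = 2;
}

service McpService {
  rpc CallTool(CallToolRequest) returns (CallToolResponse);
}
```

With a contract like this, the compiler and generated stubs catch malformed calls that a JSON payload would only surface at runtime.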
But there's a real tension here that the proposal doesn't fully resolve. A Medium analysis comparing MCP and gRPC pointed out that "gRPC's server reflection provides structural information (method names, parameters), but it lacks the semantic, natural-language descriptions (the 'when' and 'why') that LLMs need." MCP was built from the ground up to give AI agents that context: tool descriptions, resource explanations, prompt guidance. gRPC just doesn't do that natively.
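The gap is easiest to see in MCP's own tool listing: each tool carries a natural-language description alongside its input schema, which is exactly what gRPC reflection omits. A minimal illustration, with an invented tool:

```python
# What an MCP server returns from "tools/list": structural information plus
# the natural-language "when and why" an LLM uses to pick the right tool.
# The tool here is invented for illustration.
tool = {
    "name": "get_weather",
    "description": "Look up the current weather for a city. "
                   "Use this when the user asks about weather conditions.",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

# gRPC server reflection would surface roughly this much and no more:
reflection_view = {"method": "GetWeather", "request_fields": ["city"]}

print("MCP exposes a description:", "description" in tool)
print("Reflection exposes one:", "description" in reflection_view)
```

Any gRPC transport for MCP therefore still has to carry that descriptive layer somewhere, rather than relying on reflection alone.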
So the bigger architectural question remains: does MCP bend to fit existing RPC systems like gRPC, or do those systems learn to speak MCP's language? Practitioners are split. Some say forcing JSON-RPC rewrites of perfectly working gRPC services is unnecessary friction. Others argue you can't just slap gRPC on top of an AI-first protocol without adding the semantic layer LLMs actually need to function.
For developers shipping AI agents into production, the practical upside is clear. Companies already deep into gRPC (Google itself uses it "to enable services and offer APIs at a global scale") can now adopt MCP without tearing apart existing service contracts. Google has also launched fully managed remote MCP servers with globally consistent endpoints for its own services; paired with gRPC support, that positions Google Cloud to go directly after enterprises with existing gRPC investments looking to add AI agent capabilities.
The gRPC transport is still in development. Google is working with the MCP community through an active pull request for pluggable transport interfaces in the Python SDK. For developers tracking this, the MCP GitHub repository and contributor channels are where the action is.
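The pluggable-transport work is still in flight, but the general idea is an interface the SDK can target regardless of wire format, with JSON-RPC-over-HTTP and gRPC as interchangeable implementations. A rough sketch of that shape, with invented names (this is not the actual interface under review):

```python
from abc import ABC, abstractmethod

# Invented sketch of a pluggable transport: the SDK talks to an abstract
# send/receive pair, and each wire format supplies its own implementation.
# Class and method names here do not match the real SDK proposal.
class Transport(ABC):
    @abstractmethod
    def send(self, message: dict) -> None: ...

    @abstractmethod
    def receive(self) -> dict: ...

class InMemoryTransport(Transport):
    """Loopback transport, useful only to show the interface in action."""

    def __init__(self) -> None:
        self._queue: list[dict] = []

    def send(self, message: dict) -> None:
        self._queue.append(message)

    def receive(self) -> dict:
        return self._queue.pop(0)

# Code written against Transport never needs to know the wire format.
transport: Transport = InMemoryTransport()
transport.send({"jsonrpc": "2.0", "id": 1, "method": "ping"})
print(transport.receive()["method"])
```

Under a design like this, a gRPC-backed implementation could slot in behind the same interface without touching application code.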