Microsoft Content on InfoQ
-
Azure AI Foundry Agent Service Gains Model Context Protocol Support in Preview
Microsoft's Azure AI Foundry Agent Service now supports the Model Context Protocol (MCP) in preview. The integration lets developers connect agents to external data sources and workflows without writing custom connector code for each one, while retaining enterprise-grade security controls, improving interoperability between agents and the tools they call.
-
Microsoft Introduces Mu: a Lightweight On-Device Language Model for Windows Settings
Microsoft has introduced Mu, a new small-scale language model designed to run locally on Neural Processing Units (NPUs), starting with its deployment in the Windows Settings application for Copilot+ PCs. The model allows users to control system settings using natural language, aiming to reduce reliance on cloud-based processing.
-
Microsoft Enhances Developer Experience with DocumentDB VS Code Extension and Local Emulator
Microsoft has recently released an open‑source DocumentDB extension for Visual Studio Code alongside DocumentDB Local, a lightweight local emulator.
-
Microsoft Azure Enhances Observability with OpenTelemetry Support for Logic Apps and Functions
Microsoft has expanded OpenTelemetry support in Azure Logic Apps and Azure Functions. The open-source observability framework lets these services emit standardized telemetry that can be correlated across platforms, extending diagnostics beyond the built-in telemetry. With simplified configuration, the change moves Azure's offerings toward standardized observability across cloud services.
-
Microsoft Open Sources the GitHub Copilot Chat Extension
At its Build 2025 conference, Microsoft announced plans to open source the code behind the GitHub Copilot Chat extension under the MIT license over the next few months and to refactor core AI capabilities directly into the main VS Code codebase. If completed, the move may affect the ability of current for-pay AI code editors to compete purely on features.
-
Azure AI Search Unveils Agentic Retrieval for Smarter Conversational AI
Microsoft's Azure AI Search has introduced agentic retrieval, a query engine that Microsoft says improves answer relevance in conversational AI by up to 40%. The engine uses conversation history and executes decomposed subqueries in parallel, adapting its search strategy to the conversation. The feature is currently in public preview and is aimed at enterprise knowledge-retrieval scenarios.
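The parallel subquery execution described above can be sketched in a few lines of Python. This is an illustrative sketch only: the `search_fn` callable, the pre-decomposed subqueries, and the merge step are assumptions for the example, not Azure AI Search's actual implementation, which also handles deduplication and re-ranking.

```python
from concurrent.futures import ThreadPoolExecutor


def agentic_retrieve(subqueries, search_fn, max_workers=8):
    """Run decomposed subqueries in parallel and merge their hits.

    search_fn is a hypothetical callable mapping one subquery string
    to a list of result documents (e.g., a wrapper around a search API).
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves the order of the input subqueries
        per_query_hits = list(pool.map(search_fn, subqueries))
    # Flatten in subquery order; a real engine would also
    # deduplicate and re-rank the merged hits.
    return [hit for hits in per_query_hits for hit in hits]


# Example with a stub "index" standing in for a search backend:
stub_index = {
    "pricing tiers": ["doc-pricing"],
    "region availability": ["doc-regions"],
}
results = agentic_retrieve(list(stub_index), stub_index.get)
```

Threads (rather than processes) fit here because each subquery is I/O-bound network work, so parallel requests overlap their wait time.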
-
Microsoft Announces AI Agent and Platform Updates at Build 2025
At its annual developer conference, Build 2025, Microsoft introduced a set of updates focused on expanding the role of AI agents across Windows, GitHub, Azure, and Microsoft 365.
-
Azure AI Foundry Agent Service GA Introduces Multi-Agent Orchestration and Open Interoperability
Microsoft's Azure AI Foundry Agent Service has reached general availability, providing a use-case-agnostic platform for building and managing AI agents at scale. The service supports multi-agent orchestration and integrates with tools such as Logic Apps and SharePoint, letting developers compose agents into larger systems for a range of applications.
-
Microsoft Announced Edit, New Open-Source Command-Line Text Editor for Windows at Build 2025
At its Build 2025 conference, Microsoft announced Edit, a new open-source command-line text editor, to be distributed in the future as part of Windows 11. Edit aims to provide a lightweight, modern, native command-line editing experience similar to Nano and Vim.
-
Azure Logic Apps Introduces "Agent Loop" for Building AI Agents in Enterprise Workflows
At its Build conference, Microsoft unveiled Agent Loop, a new feature in Azure Logic Apps that lets developers embed AI agents into enterprise workflows. Drawing on the platform's more than 1,400 connectors, Agent Loop can power autonomous and conversational agents for tasks such as loan approvals and customer support.
-
Microsoft CTO Details Successes, Challenges, and Commitment to Rust at Rust Nation UK
In a recent talk at Rust Nation UK, Mark Russinovich, Chief Technology Officer for Microsoft Azure, explored the factors driving Rust adoption, provided concrete examples of Rust usage in Microsoft products, and detailed ongoing efforts to accelerate the migration from C/C++ to Rust at Microsoft by leveraging generative AI.
-
Neon Serverless Postgres Now Generally Available as an Azure Native Integration
Microsoft and Neon have made Neon Serverless Postgres generally available as an Azure Native integration. The service offers automatic scaling, instant provisioning, and management from within Azure, positioning it as a cost-effective database option for enterprises and AI startups alike.
-
Microsoft Pledges Deeper European Tech Ties amidst Sovereignty Debate
Microsoft's five digital commitments aim to bolster Europe's tech landscape and digital sovereignty through a 40% expansion of its European cloud and AI infrastructure, enhanced cybersecurity, and a stronger data-privacy framework. With a "European cloud for Europe," Microsoft positions itself as supporting digital resilience, economic competitiveness, and the open-source community.
-
Microsoft Native 1-Bit LLM Could Bring Efficient genAI to Everyday CPUs
In a recent paper, Microsoft researchers described BitNet b1.58 2B4T, the first LLM to be natively trained using "1-bit" (technically, 1-trit) weights, rather than being quantized from a model trained with floating point weights. According to Microsoft, the model delivers performance comparable to full-precision LLMs of similar size at a fraction of the computation cost and hardware requirements.
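The "1-trit" idea can be illustrated with the absmean quantization scheme described in the BitNet papers: each weight is scaled by the mean absolute value of its tensor, then rounded and clipped to {-1, 0, +1}. A trit carries log2(3) ≈ 1.58 bits, hence the "b1.58" in the model name. The sketch below shows the quantization step and why inference then needs no multiplications; it is an illustration of the scheme, not Microsoft's training code.

```python
def absmean_ternary(weights, eps=1e-8):
    """Quantize a list of float weights to {-1, 0, +1}.

    Scales by the mean absolute value (gamma), then rounds and clips.
    Returns the ternary weights plus the scale needed to dequantize.
    """
    gamma = sum(abs(w) for w in weights) / len(weights)
    ternary = [max(-1, min(1, round(w / (gamma + eps)))) for w in weights]
    return ternary, gamma


def ternary_dot(ternary, activations, gamma):
    """Dot product against ternary weights: only adds and subtracts."""
    acc = sum(a for t, a in zip(ternary, activations) if t == 1)
    acc -= sum(a for t, a in zip(ternary, activations) if t == -1)
    return acc * gamma


t, g = absmean_ternary([0.4, -1.2, 0.05, 0.9])  # t == [1, -1, 0, 1]
```

Replacing multiply-accumulate with add/subtract is what lets such models run efficiently on ordinary CPUs, which is the efficiency claim the article summarizes.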
-
Azure Container Apps Serverless GPUs Reach General Availability with NVIDIA NIM Support
Azure has made Serverless GPUs for Azure Container Apps generally available, enabling scalable, on-demand execution of AI workloads on NVIDIA A100 and T4 GPUs. The feature supports NVIDIA NIM microservices to simplify model deployment and management, with Azure handling the underlying infrastructure so developers can focus on their applications across a range of AI scenarios.