Friday, February 6, 2026

Synopsis

Microsoft debuts its in-house Maia 200 AI chip, marking a major step in its efforts to reduce reliance on Nvidia and take on cloud rivals like Google and Amazon.

Article

Microsoft has launched its first custom AI chip, the Maia 200, signaling a major strategic shift in the company’s cloud and AI roadmap. Built for data center workloads, Maia 200 is designed to power large language models and other generative AI tools within Microsoft’s Azure ecosystem.

Unveiled at the company’s Build event, the rollout underscores Microsoft’s intent to reduce dependence on Nvidia’s GPUs and vertically integrate its AI infrastructure. According to the company, the chip is already being used to run GPT models from OpenAI — the same models that underpin ChatGPT and Microsoft Copilot.

The launch puts Microsoft in closer competition with Google and Amazon, both of which have already rolled out custom silicon (TPUs and AWS Inferentia, respectively). “This is about having more control over our AI future,” said Scott Guthrie, Executive Vice President of Microsoft’s Cloud + AI Group, during the launch keynote.

Microsoft also announced a companion chip, Cobalt 100, aimed at general-purpose computing in Azure servers. While Maia 200 is optimized for AI inference and training, Cobalt 100 is focused on performance-per-watt improvements to support scale and cost-efficiency in the cloud.

The Maia and Cobalt chips are being produced on TSMC’s 5nm process and housed in Microsoft-designed infrastructure called “Azure Boost,” a system that bypasses bottlenecks in traditional data center architecture.

Analysts see the move as a strong statement of Microsoft’s intent to compete not just at the software layer of AI but at the hardware level as well. As generative AI becomes increasingly core to cloud offerings, hyperscalers are racing to design, deploy, and scale custom chips that can deliver lower latency and greater compute efficiency.

The Maia rollout also comes at a time when GPU availability remains constrained and cloud costs are under scrutiny. By building its own chips, Microsoft could gain better unit economics, higher reliability, and deeper optimization across software and hardware layers.



©2026 The Enterprise – All Rights Reserved.