
During its Ignite event, Microsoft revealed two custom processors, one for AI and the other for general cloud workloads, both destined for its Azure infrastructure. The chips will begin rolling into Microsoft's data centers early next year; in addition to supporting internal workloads, they will also be accessible to Azure customers.
The first is an AI accelerator called Maia 100, aimed at data centers running products such as Windows Copilot and Bing Chat and, more generally, at managing large language models. The chip is manufactured on a 5-nanometer process at TSMC and packs 105 billion transistors. There is currently no information about its architecture.
“We were excited when Microsoft first shared their designs for the Maia chip, and we've worked together to refine and test it with our models,” said Sam Altman, CEO of OpenAI.
The chip is mounted in custom liquid-cooled racks holding up to four chips. “Maia servers are equipped with a fully customized Ethernet-based network protocol with 4.8 terabits of total bandwidth per accelerator, to enable better end-to-end workload scalability and performance,” Microsoft explained.
The second processor, called Cobalt 100, is instead integrated into Azure data centers: it is an Arm chip, expected to feature 128 Armv9 cores, and Microsoft considers it ideal for running general cloud workloads more efficiently than current hardware solutions. TSMC is also expected to produce Cobalt on 5 nm.
At the moment there is no concrete performance information, but Microsoft is talking about an improvement of up to 40% compared to the current generation of commercial Arm servers deployed in the Azure cloud.
The Redmond company regards the development of custom processors as the latest step in a strategy that, over the years, has led it to optimize the entire Azure infrastructure in order to stand out from Google and Amazon, which have also been developing custom chips for their servers for years.
In this way, Microsoft also reduces its dependence on the traditional supply chain, gaining the option of internally developed solutions when proposals from partners such as NVIDIA are unavailable due to high demand. By designing its own chips, Microsoft finally gains better control over performance targets and overall costs, from energy consumption to the price of individual hardware components.
The arrival of Maia 100 and Cobalt 100 does not mean a farewell to hardware from established vendors like Intel, AMD and NVIDIA, at least not at the moment. At Ignite, Microsoft announced an expansion of its partnerships with NVIDIA and AMD to offer the computing power of the NVIDIA H100 and H200 AI accelerators and the AMD Instinct MI300X in the cloud.