Microsoft debuts its first custom AI chip, cloud CPU

Microsoft (MSFT) is debuting its first AI chip. At its Ignite conference Wednesday, the company said the chip, called the Maia 100, is the first in its planned Azure Maia AI accelerator series. In addition to the Maia 100, the tech giant also unveiled its first custom Arm-based (ARM) Azure Cobalt central processing unit for general purpose cloud computing.

With the two chips, Microsoft is now on par with rivals Google (GOOG, GOOGL) and Amazon (AMZN), which have also developed their own custom chips to run their competing cloud platforms. The company says the chip will be used for both cloud-based training and inferencing of AI models. Training is the process of teaching an AI model from data, and inferencing is running the trained model to serve users in the wild.

“Software is our core strength, but frankly, we are a systems company,” Microsoft’s Rani Borkar, corporate vice president for Azure hardware systems and infrastructure, said in a statement.

“At Microsoft we are co-designing and optimizing hardware and software together so that one plus one is greater than two. We have visibility into the entire stack, and silicon is just one of the ingredients.”

By offering services powered by its own custom silicon, Microsoft said it can provide “huge yields in performance and efficiency.”

The idea here is pretty basic. If you’ve got Microsoft software, it’s going to run better on Microsoft-designed chips. That’s because Microsoft can tune its software and hardware to each other for better performance.

It’s the same reason Nvidia offers its own AI software in addition to its AI chips, or why Apple develops its own chips for the iPhone or Mac. If a company can control both the hardware and software, it can provide better outcomes for users.

Microsoft says it partnered with ChatGPT developer OpenAI to test its Maia 100 accelerator and will use those lessons to build out future chips.

Microsoft's Maia (Microsoft)

“We were excited when Microsoft first shared its designs for the Maia chip, and we’ve worked together to refine and test it with our models,” OpenAI CEO Sam Altman said in a statement. “Azure’s end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers.”

In addition to building the chips themselves, Microsoft says it also built the server boards that house them and the racks those boards sit in. To reduce the heat generated by its AI servers, Microsoft says it created a special cooling feature called a sidekick, which circulates cool liquid through a series of tubes and radiators, absorbing heat more efficiently than fans alone. That should help cut down on power usage as well.

As for the Azure Cobalt chip, Microsoft says it offers a 40% performance boost over the current generation of Arm-based Azure chips.

“At the scale we operate, it’s important for us to optimize and integrate every layer of the infrastructure stack to maximize performance, diversify our supply chain, and give customers infrastructure choice,” Microsoft’s executive vice president of the cloud and AI group, Scott Guthrie, said in a statement.

But just because Microsoft is rolling out its own chips doesn’t mean that it’s ditching Nvidia (NVDA) or AMD (AMD). The company is continuing to offer cloud computing capabilities running Nvidia’s H100 chip and is adding access to the company’s newly announced H200 chip as well. Microsoft says it will also begin offering access to AMD’s MI300 chip next year.

Daniel Howley is the tech editor at Yahoo Finance. He's been covering the tech industry since 2011. You can follow him on Twitter @DanielHowley.
