Microsoft May Go Nuclear to Support Its Energy-Hungry AI

OpenAI’s GPT-3 was reportedly responsible for massive amounts of water and energy usage over the past year, and that demand will only get more intense as the company trains more capable models.

Artificial intelligence has proved a costly endeavor, and not just in terms of money: AI requires massive amounts of energy and water to operate at scale. That hasn’t stopped big tech companies such as Google and Microsoft from putting energy-hungry AI into practically every single one of their consumer and enterprise products. Big daddy Microsoft has been trying to keep its (OpenAI-assisted) lead in the AI rat race, and it may need to grab the fuel rod with both hands if it wants to keep its big AI ambitions going.

And when we say fuel rod, we mean it literally. Microsoft put out a call for a program manager for “Nuclear Technology” on Monday. As first reported by CNBC, the job listing specifically mentions that this new initiative would use “microreactors” and “Small Modular Reactors” to power the data centers behind Microsoft Cloud and AI. Whatever form it takes, the scope of Microsoft’s nuclear push could be “global,” since the company operates Azure data centers all over the world.

Microsoft has spent the last year building generative AI into practically every one of its software products. Most recently, the Redmond, Washington-based company announced an AI Copilot for Windows 11 that acts as a kind of virtual assistant on the desktop. Court documents have shown that Microsoft has been looking for ways to bring more AI capabilities to its Azure cloud platform.

But powering that AI is extraordinarily costly, even more so than the company’s other cloud-based products. Microsoft’s latest sustainability report noted that the company’s water consumption increased 30% year over year, largely in order to keep its AI supercomputers cool. Microsoft has put billions of dollars into a partnership with ChatGPT-maker OpenAI, and the Redmond company is now on the hook to power and cool the hardware behind its partner’s growing appetite for training ever-larger models. Training GPT-3 consumed enough water to fill a nuclear reactor’s cooling tower, according to one recent study.

Studies have shown that AI is responsible for massive amounts of carbon emissions, and OpenAI’s GPT-3 model produced more CO2 emissions than most other large language models. The company’s GPT-4 model is purportedly 1,000 times more powerful than GPT-3.5 and was trained on nearly four times as much data. Running a larger AI model requires several times as much power as a smaller one, and AI companies aren’t slowing down.
