Daily Management Review

If Microsoft Can't Get Enough AI Processors For Its Data Centres, There May Be Service Interruptions

Microsoft is reminding investors that graphics processing units are a crucial component of its rapidly expanding cloud business. In its annual report, released late Thursday, the software maker added language about GPUs to a risk factor covering the disruptions that could occur if it cannot obtain the infrastructure it requires.
The wording reflects rising demand among the leading technology firms for the hardware needed to deliver artificial intelligence capabilities to smaller organisations.
Interest in AI, and more specifically in generative AI, which creates human-like text, speech, video, and images in response to user input, has surged this year since startup OpenAI's ChatGPT chatbot took off. That has benefited GPU makers such as Nvidia and, to a lesser extent, AMD.
“Our datacenters depend on the availability of permitted and buildable land, predictable energy, networking supplies, and servers, including graphics processing units (‘GPUs’) and other components,” Microsoft said in its report for the 2023 fiscal year, which ended June 30.
There are three sentences in the regulatory filing that refer to GPUs. In the report from the prior year, they received zero mentions. Recent annual reports from other significant technology companies, including Alphabet, Apple, Amazon, and Meta, do not contain comparable wording.
As part of a complex agreement, OpenAI uses Microsoft's Azure cloud to run calculations for ChatGPT and multiple AI models. Microsoft has also begun building generative AI, including OpenAI's models, into existing products such as the Bing search engine, Word, and Outlook.
Those efforts, along with the interest in ChatGPT, have left Microsoft seeking more GPUs than it had anticipated.
“I am thrilled that Microsoft announced Azure is opening private previews to their H100 AI supercomputer,” Jensen Huang, Nvidia’s CEO, said at his company’s GTC developer conference in March.
Microsoft has started looking beyond its own data centres to secure enough capacity, signing a deal with Nvidia-backed CoreWeave, which rents out cloud-based GPUs to outside developers.
Microsoft has also spent years developing its own specialised AI processor. The Information reported in April, citing unnamed sources, that Microsoft had accelerated the rollout of its chip in response to the attention being given to ChatGPT. Over the past ten years, Alphabet, Amazon, and Meta have all revealed AI chips of their own.
Amy Hood, Microsoft's finance chief, told analysts on a conference call on Tuesday that the company expects capital expenditures to rise sequentially this quarter to pay for data centres, standard central processing units, networking hardware, and GPUs.
“It’s overall increases of acceleration of overall capacity,” she said.