MICROSOFT’S AI INVESTMENT SOARS: POWERING THE FUTURE BEYOND OPENAI
In the midst of the ongoing artificial intelligence (AI) boom, Microsoft has solidified its position as a key player through its significant investment in OpenAI. However, the software giant’s AI ambitions extend far beyond OpenAI, as it continues to allocate substantial resources to meet the soaring demand for AI-powered services. Recent reports suggest that Microsoft has entered into a multi-year agreement with startup CoreWeave, involving potentially billions of dollars in investment for cloud computing infrastructure.
CoreWeave, which recently secured $200 million in financing, boasts a valuation of $2 billion. The company specializes in providing simplified access to Nvidia’s highly regarded graphics processing units (GPUs), widely recognized as the market’s premier option for running AI models. Sources familiar with the matter reveal that Microsoft signed the CoreWeave deal earlier this year to ensure that OpenAI, the operator of the viral ChatGPT chatbot, has sufficient computing power to cater to its growing needs. OpenAI relies on Microsoft’s Azure cloud infrastructure to support its substantial computational requirements.
Both Microsoft and CoreWeave declined to comment on the specifics of the partnership, underscoring the confidentiality surrounding the agreement. The surge in generative AI interest commenced late last year following OpenAI’s introduction of ChatGPT, a groundbreaking AI system capable of generating sophisticated responses from human input. Numerous companies, including tech giant Google, have since rushed to incorporate generative AI capabilities into their products. Concurrently, Microsoft has been actively releasing chatbots for its own services, such as Bing and Windows.
Given the immense demand for AI infrastructure, Microsoft is seeking additional sources of Nvidia GPU capacity beyond its own data centers. CoreWeave CEO Michael Intrator declined to comment directly on the Microsoft collaboration during a recent interview, but he did say that the company’s revenue has grown substantially from 2022 to 2023. Last month, CoreWeave secured an additional $221 million in funding from hedge fund Magnetar Capital; Nvidia invested $100 million in the previous financing round. Established in 2017, CoreWeave currently employs 160 people.
The success of Nvidia’s GPUs is evident in the company’s stock, which has surged 170% this year. Recently, Nvidia’s market capitalization briefly exceeded $1 trillion, propelled by a July quarter forecast that surpassed Wall Street estimates by over 50%. Colette Kress, Nvidia’s finance chief, anticipates that the company’s growth will be predominantly driven by data center operations, reflecting the rising demand for generative AI and large language models. OpenAI’s GPT-4, a massive language model trained using Nvidia GPUs on extensive online data, forms the foundation of ChatGPT.
Nvidia’s CEO, Jensen Huang, highlighted CoreWeave in his presentation at the Nvidia GTC conference, underscoring the significance of the startup’s role in the AI ecosystem. During an earnings call, Colette Kress specifically referenced CoreWeave, emphasizing the company’s relevance to Nvidia’s growth strategy. CoreWeave’s website proudly declares that its computing power is “80% less expensive than legacy cloud providers.” Among its offerings, the company provides access to Nvidia’s A100 GPUs, which are also available through cloud providers like Amazon, Google, and Microsoft.
CoreWeave also offers the more affordable Nvidia A40 GPUs, which are marketed primarily for visual computing, while the A100 targets AI, data analytics, and high-performance computing. Some CoreWeave clients have turned to the company after struggling to obtain sufficient GPU capacity from the major cloud providers. In some cases, prospects have asked for Nvidia’s A100 or newer H100 GPUs, and CoreWeave has recommended the A40 instead, which offers excellent performance for those workloads.