According to a Bloomberg report, Microsoft has spent hundreds of millions of dollars building a supercomputer for OpenAI's ChatGPT chatbot. In blog posts, the company explains how it built the powerful Azure AI infrastructure that OpenAI uses and how its systems are becoming even more reliable.
What is known
To build the supercomputer, Microsoft linked together thousands of Nvidia GPUs on its Azure platform. In turn, this allowed OpenAI to train ever more powerful models and “unlocked the AI power” of tools like ChatGPT and Bing.
Scott Guthrie, Microsoft's vice president of artificial intelligence and cloud computing, said the company has spent hundreds of millions of dollars on the project. And although this may seem like a drop in the ocean for such a huge company, Microsoft recently extended its multibillion-dollar investment in OpenAI, which clearly demonstrates its willingness to keep developing this area.
Microsoft is already working to make Azure's AI capabilities even more powerful with the launch of its new virtual machines, which use Nvidia's H100 and A100 Tensor Core GPUs, as well as Quantum-2 InfiniBand networking, a project both companies announced last year. Microsoft says this should allow OpenAI and other companies that rely on Azure to train larger and more complex AI models.
Source: The Verge