Microsoft Builds Massive Supercomputer To Power OpenAI's ChatGPT
JAKARTA - Microsoft is pouring hundreds of millions of dollars into building a massive supercomputer that helps power OpenAI's ChatGPT chatbot, Bloomberg reports. In a pair of blog posts published on Monday, March 13, Microsoft explained how it built the robust artificial intelligence infrastructure on Azure that OpenAI uses, and how the system is becoming even more powerful.
To build the supercomputer powering the OpenAI project, Microsoft says it connected thousands of Nvidia graphics processing units (GPUs) on its Azure cloud computing platform. This enables OpenAI to train increasingly powerful models and "unlock the artificial intelligence capabilities" of tools like ChatGPT and Bing.
Scott Guthrie, Microsoft's vice president of artificial intelligence and cloud, said the company is spending several hundred million dollars on the project, according to a statement provided to Bloomberg. While that may seem like a drop in the ocean for Microsoft, which recently extended its multiyear, multibillion-dollar investment in OpenAI, it certainly shows the company is willing to pour even more money into the AI space.
Microsoft is already working to make Azure's AI capabilities even stronger with the launch of new virtual machines that use Nvidia's H100 and A100 Tensor Core GPUs, as well as Quantum-2 InfiniBand networking, a project both companies announced last year. According to Microsoft, this will allow OpenAI and other companies that rely on Azure to train larger, more complex AI models.
"We saw that we needed to build a dedicated cluster focused on enabling large training workloads and OpenAI was one of the early pieces of evidence for that," said Eric Boyd, Microsoft Corporate Vice President for Azure AI, in a statement, quoted by The Verge.
"We worked with them to learn what they were looking for when building their training environment and what they needed," he adds.