JAKARTA - U.S. chip designer and computing company Nvidia Corp said on Wednesday, November 16, that it is working with Microsoft Corp to build "big" computers to handle intense artificial intelligence computing work in the cloud.

The AI computers will run in Microsoft's Azure cloud, using tens of thousands of graphics processing units (GPUs), including Nvidia's most powerful H100 and its A100 chips. Nvidia declined to say how much the deal is worth, but industry sources say each A100 chip costs around $10,000 to $12,000, and the H100 is far more expensive than that.

"We're at an inflection point where AI is coming to the enterprise and getting those services out there that customers can use to apply AI to business use cases to reality," Ian Buck, Nvidia general manager for Hyperscale and HPC told Reuters.

"We are seeing a huge wave of AI adoption ... and the need to apply AI to enterprise use cases," he added.

In addition to selling chips to Microsoft, Nvidia said it will partner with the software and cloud giant to develop AI models. Buck said Nvidia will also become a customer of Microsoft's AI cloud and build AI applications on top of it to offer services to its own customers.

The rapid growth of AI models such as those used for natural language processing has sharply increased the demand for faster and more powerful computing infrastructure.

Nvidia said Azure will be the first public cloud to use its Quantum-2 InfiniBand networking technology, which connects servers at 400 gigabits per second. That speed matters because heavy computational AI work requires thousands of chips to work together across multiple servers.
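For readers curious what "working together across multiple servers" looks like in practice, the sketch below is a minimal, hypothetical illustration of data-parallel training using PyTorch's DistributedDataParallel. It is not Nvidia's or Microsoft's actual stack; the model and batch are stand-in placeholders. Each GPU runs one copy of the model, and gradients are averaged over the network on every training step, which is why interconnect bandwidth like Quantum-2's 400 Gb/s becomes the bottleneck at scale.

```python
# Minimal sketch of multi-GPU data-parallel training (hypothetical example,
# not the actual Nvidia/Microsoft Azure setup described in the article).
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # A launcher such as torchrun starts one process per GPU and sets
    # RANK, LOCAL_RANK, and WORLD_SIZE for each process.
    dist.init_process_group(backend="nccl")  # NCCL rides the fast interconnect
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Stand-in model; a real large language model would be far bigger.
    model = torch.nn.Linear(1024, 1024).to(f"cuda:{local_rank}")
    ddp_model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=1e-3)

    for _ in range(100):
        x = torch.randn(32, 1024, device=f"cuda:{local_rank}")  # stand-in batch
        loss = ddp_model(x).sum()
        optimizer.zero_grad()
        # backward() triggers an all-reduce that averages gradients across
        # EVERY GPU in the job -- this synchronization traffic is what makes
        # high-bandwidth server-to-server networking so important.
        loss.backward()
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with something like `torchrun --nproc_per_node=8 train.py` on each machine, this same pattern scales from one server to the thousands of networked GPUs the article describes.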

