Microsoft Releases AI Chips To Optimize Artificial Intelligence Services And Compete With Amazon
JAKARTA - On Wednesday, November 15, Microsoft announced two custom-designed computing chips, saying that, faced with the high cost of delivering artificial intelligence services, it is bringing a key technology in-house.
Microsoft said it does not plan to sell the chips, but will use them to power its own subscription software offerings and as part of its Azure cloud computing service.
At the Ignite developer conference in Seattle, Microsoft introduced a new chip, called Maia, to speed up AI computing tasks and to underpin its "Copilot" service, priced at 30 US dollars per month for business software users, as well as offerings for developers who want to build custom AI services.
The Maia chip is designed to run large language models, the type of AI software that underpins Microsoft's Azure OpenAI service, which grew out of Microsoft's collaboration with ChatGPT creator OpenAI.
Microsoft and other big tech companies such as Alphabet are grappling with the high cost of providing AI services, which can be 10 times higher than for traditional services such as search engines.
Microsoft executives have said they plan to address those costs by funneling nearly all of the company's efforts to embed AI in its products through a common set of foundational AI models. The Maia chip, they said, is optimized for that work.
"We think this provides a way for us to provide our customers with a better solution that is faster, lower costs, and higher quality," said Scott airing, Microsoft's cloud group executive vice president and AI.
Microsoft also announced that next year it will offer its Azure customers cloud services running on the latest flagship chips from Nvidia and Advanced Micro Devices (AMD).
"This is not something that replaces Nvidia," said Ben Bajarin, CEO of the firm for Creative Strategies analysts.
He said the Maia chip would allow Microsoft to sell AI services in the cloud until personal computers and cellphones become powerful enough to handle them.
"Microsoft has a very different core opportunity here because they make a lot of money per user for the service," Bajarin said.
The second chip Microsoft announced at the conference was designed as an internal cost saver and as a response to Microsoft's main cloud rival, Amazon Web Services.
Named Cobalt, the new chip is a central processing unit (CPU) built with technology from Arm Holdings. Microsoft said it has already tested Cobalt to power Teams, its business messaging tool.
However, Microsoft said it also wants to sell direct access to Cobalt to compete with Amazon's in-house "Graviton" chip series.
"We are designing our Cobalt solution to ensure that we are very competitive both in terms of performance and performance (compared to Amazon chips)," said Ruis, quoted by VOI from Reuters.
Microsoft provided few technical details that would make it possible to gauge the chips' competitiveness against those of traditional chipmakers. Rani Borkar, corporate vice president for Azure hardware systems and infrastructure, said both chips are made with 5-nanometer manufacturing technology from Taiwan Semiconductor Manufacturing Co.
She added that the Maia chips would be connected with standard Ethernet network cabling, rather than the more expensive custom Nvidia networking technology that Microsoft uses in the supercomputers it built for OpenAI.
"You'll see we're following the standardization route more," Borkar told Reuters.