Amazon Develops Two Special AI Chips to Challenge Nvidia's Grace Hopper
JAKARTA - Amazon is reportedly developing two special chips to train large language models (LLMs) and accelerate the adoption of generative artificial intelligence (AI).
The chips, dubbed Inferentia and Trainium, give Amazon Web Services (AWS) customers an alternative for training their LLMs. The two chips are claimed to compete with Nvidia's superchip, Grace Hopper.
"The whole world wants more chips to perform a generative AI, whether it's a GPU or whether it's an Amazon chip we designed," AWS CEO Adam Selipsky said in an interview with CNBC International, quoted Monday, August 14.
"I think we are in a better position than anyone on Earth to supply the desired capacity of our customers collectively."
Trainium first came to market in 2021, following the release of Inferentia in 2019, which is now in its second generation.
" Machine learning is broken down into these two different stages. So you train the machine learning model and then you run the inference on the trained model," said AWS product VP Matt Wood explaining the usefulness of the two chips.
"Trainium is increasing by about 50 percent in terms of price performance compared to other ways to train machine learning models in AWS," he continued.
Wood said customers can use Inferentia to deliver low-cost, high-throughput, low-latency machine learning inference.
"This is all predictions when you type a command into your generative AI model, that's where everything is processed to give you a response," Wood explained.
However, it is not yet known when the two Amazon AI chips will be marketed globally. As previously reported, Nvidia's Grace Hopper has been announced and is set to be available in 2024.
The chip is a breakthrough accelerated CPU designed to tackle the world's most complex problems, such as LLMs, recommendation systems, and vector databases.