
The artificial intelligence chip startup SambaNova Systems announced a new semiconductor on Tuesday, September 19, designed to let customers run a higher-quality artificial intelligence (AI) model at a lower overall cost.

The SN40L chip is designed to run an AI model more than twice the size of the model behind OpenAI's advanced version of ChatGPT, according to the Palo Alto, California-based company.

"SN40L was built specifically for big language models used in company applications," said SambaNova CEO Rodrigo Liang. "We have built a complete series that allows us to understand well the company's use case."

According to Liang, large companies that want to deploy AI in new ways face different and more complex considerations than those behind consumer software such as ChatGPT.

Security, accuracy, and privacy are all areas where AI technology must be designed differently in order to be useful to enterprise customers.

Nvidia dominates the AI chip market, but the surge in demand sparked by interest in generative AI software has made the sought-after chips difficult for some companies to obtain. Intel, Advanced Micro Devices (AMD), and startups like SambaNova have moved to fill the gap.

The new SambaNova chip can run models with 5 trillion parameters and includes two advanced forms of memory. Memory can sometimes become a bottleneck in AI data processing.

The company says this hardware combination allows customers to run larger AI models without trading off model size for accuracy.

Taiwan Semiconductor Manufacturing Company currently manufactures the chips for SambaNova.

