JAKARTA - Nvidia will reportedly use smartphone-style memory chips in its artificial intelligence (AI) servers, a move that Counterpoint Research says could cause global server memory prices to double by the end of 2026.
The report, released Wednesday, November 19, states that over the past two months the global electronics supply chain has experienced a shortage of older "legacy" memory chips, as many manufacturers shifted their focus to the high-end memory used in semiconductors for AI applications.
However, Counterpoint warns that a new problem is looming. To cut power consumption in its AI servers, Nvidia has decided to switch from DDR5, the standard memory for servers, to LPDDR, the power-efficient memory type commonly found in phones and tablets.
Nvidia is scheduled to release its financial results on Wednesday night.
According to Counterpoint, an AI server requires far more memory chips than a smartphone, so the change is expected to create a sudden surge in demand that the industry is not yet prepared to handle.
Memory manufacturers such as Samsung Electronics, SK Hynix, and Micron are already grappling with a prolonged DRAM supply shortage after cutting production to focus on High Bandwidth Memory (HBM), a key component in the advanced accelerators driving the global AI boom.
Counterpoint warns that scarcity in the lower segment of the memory market could spread to the upper segment, as chipmakers consider diverting more of their fab capacity to LPDDR production to meet Nvidia's demand.
"The biggest risk in sight is in advanced memory. Nvidia's switch to LPDDR makes it a customer as large as a major smartphone manufacturer, a big change for a supply chain that does not easily absorb demand spikes like this," Counterpoint wrote.
The research firm estimates that server memory chip prices will double by the end of 2026.
Rising server memory prices are expected to add to the cost burden for cloud service providers and AI developers, especially as data center budgets are already strained by heavy spending on GPUs and power infrastructure upgrades.