JAKARTA - Micron Technology, Inc., a global leader in memory and storage solutions, announced on Monday, 26 February, that it has begun mass production of its HBM3E (High Bandwidth Memory 3E) solution. Micron's 24GB 8-High HBM3E will be part of the NVIDIA H200 Tensor Core GPU, which is set to begin shipping in the second quarter of 2024.

This milestone puts Micron at the forefront of the industry, powering artificial intelligence (AI) solutions with HBM3E's industry-leading performance and energy efficiency.

With demand for AI continuing to grow, memory solutions that can keep pace with expanding workloads are crucial. Micron's HBM3E addresses this challenge with:

With a pin speed of more than 9.2 gigabits per second (Gb/s), Micron's HBM3E delivers more than 1.2 terabytes per second (TB/s) of memory bandwidth, enabling rapid data access for AI accelerators, supercomputers, and data centers.
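For readers who want to sanity-check that headline figure, here is a minimal back-of-the-envelope sketch (purely illustrative, not Micron's data) that assumes the standard 1024-bit-wide HBM interface per stack, an assumption not stated in the article:

```python
# Back-of-the-envelope check of the bandwidth claim (illustrative only).
# Assumption: a 1024-bit-wide HBM interface, i.e. 1024 data pins per stack.
pin_speed_gbps = 9.2    # gigabits per second per pin, figure quoted in the article
pins_per_stack = 1024   # assumed HBM3E interface width in bits

bandwidth_gb_per_s = pin_speed_gbps * pins_per_stack / 8   # convert bits to bytes
# Prints ~1.18 TB/s; pin speeds above 9.2 Gb/s push this toward the quoted 1.2 TB/s.
print(f"~{bandwidth_gb_per_s / 1000:.2f} TB/s per stack")
```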

Micron's HBM3E leads the industry with power consumption roughly 30% lower than competing offerings. To support growing AI demand and usage, HBM3E delivers maximum throughput at the lowest level of power consumption, improving key data center operating cost metrics.

With a current capacity of 24 GB, Micron's HBM3E allows data centers to scale their AI applications seamlessly. Whether training large neural networks or accelerating inference tasks, Micron's solution provides the necessary memory bandwidth.

"Through this HBM3E achievement, Micron provides a trifecta: market leadership, best industry performance, and a different power efficiency profile," said Summit Sadana, executive vice president and chief business official at Micron Technology.

"The AI workload relies heavily on bandwidth and memory capacity, and Micron is in excellent position to support significant AI growth going forward through leading HBM3E in our industry and HBM4 roadmap, as well as our complete portfolio of DRAM and NAND solutions for AI applications," he added.

Micron developed this industry-leading HBM3E design using its 1-beta technology, advanced through-silicon via (TSV), and other innovations that enable a differentiated packaging solution. A proven leader in memory for 2.5D/3D stacking and advanced packaging technologies, Micron is proud to be a partner in TSMC's 3DFabric Alliance and to help shape the future of semiconductor and system innovation.

Micron is also extending its leadership with the sampling of 36GB 12-High HBM3E in March 2024, which is expected to deliver more than 1.2 TB/s of performance and superior energy efficiency compared to competing solutions.
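As a quick illustration of how the two capacities relate, the sketch below assumes 24 Gb (3 GB) DRAM dies, an assumption consistent with the 24 GB 8-high and 36 GB 12-high figures cited here:

```python
# Illustrative arithmetic only: how stack height maps to capacity, assuming
# 24 Gb (3 GB) DRAM dies, consistent with the figures quoted in the article.
die_capacity_gbytes = 24 / 8   # a 24-gigabit die holds 3 gigabytes
print(f"8-high stack:  {8 * die_capacity_gbytes:.0f} GB")   # 24 GB, the part used in the H200
print(f"12-high stack: {12 * die_capacity_gbytes:.0f} GB")  # 36 GB, the sample planned for March 2024
```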

Micron is a sponsor of NVIDIA GTC, the global AI conference starting on March 18, where the company will share more about its industry-leading AI memory portfolio and roadmaps.

