SK Hynix has announced that its 16-layer HBM3E DRAM will be available in 2025, with samples expected in 1H25.
The company’s CEO, Kwak Noh-Jung, announced the development of the 16-layer HBM3E memory, with a capacity of 48Gbytes, at the SK Group’s AI Summit in Seoul, Korea.
“We stacked 16 DRAM chips to realize 48Gbyte capacity and applied advanced MR-MUF [mass reflow-molded underfill] technology proven for mass production. In addition, we are developing hybrid bonding technology as a backup process,” said Kwak in a keynote.
“The 16-layer HBM3E can improve AI learning performance and inference performance by up to 18 and 32 percent, respectively, compared to the 12-layer HBM3E,” he added.
“In the long term, we will commercialize custom HBM and CXL optimized for AI to become a full stack AI memory provider,” he concluded.
Kwak added that SK Hynix is working with the world’s leading foundry, TSMC, to improve the performance of the base die for the next-generation HBM4 standard. This is the logic die at the bottom of the package, on which the stack of DRAM dies is mounted. The goal is to optimize the die to reduce power consumption.
“With our ‘one-team’ partnership, we will deliver the most competitive products and further solidify our position as the HBM leader,” Kwak said.