SK hynix starts mass production of 192GB AI server memory

April 29, 2026

SK hynix has begun mass production of a 192GB SOCAMM2 memory module aimed at next-generation AI servers, paving the way for more efficient, high-performance AI infrastructure.

The new module is built on the company’s LPDDR5X DRAM, manufactured on its 1c nm (sixth-generation 10nm-class) process, and is tailored for AI workloads such as large language model (LLM) training and inference. The launch underscores the growing importance of memory innovation in relieving bottlenecks in AI systems, particularly around power efficiency and bandwidth scaling.

Revolutionizing AI Memory Architecture

SOCAMM2 (Small Outline Compression Attached Memory Module 2) marks a departure from traditional server memory design by repurposing low-power mobile DRAM for data center applications. The module combines a compact form factor with capacity scalability, while its compression-attach connector improves signal integrity and makes modules easier to replace.

SK hynix claims that the 192GB SOCAMM2 provides more than double the bandwidth of conventional RDIMM modules while enhancing power efficiency by over 75%. This makes it well-suited for demanding AI tasks where memory throughput and energy consumption are critical factors.

The product has been optimized for NVIDIA’s Vera Rubin platform, highlighting the deepening alignment between memory suppliers and AI accelerator ecosystems. As AI models grow to hundreds of billions of parameters, memory performance plays an increasingly pivotal role in overall system efficiency.

Addressing AI Infrastructure Challenges

SK hynix anticipates that SOCAMM2 will tackle crucial bottlenecks in AI infrastructure, particularly during LLM training and inference. By enhancing both bandwidth and efficiency, the module aims to accelerate processing speeds across entire systems.

The company also noted a broader shift in the AI market from training toward inference workloads. That shift is fueling demand for memory solutions capable of handling intensive data processing while keeping power consumption low, an area where LPDDR-based modules could offer a competitive advantage.

To cater to global cloud service providers, SK hynix stated that it has already established a stable mass production system and expanded its supply offerings for AI-centric memory solutions.

“By introducing the 192GB SOCAMM2, SK hynix has set a new benchmark for AI memory performance,” remarked Justin Kim, President & Head of AI Infra (CMO) at SK hynix. “We are committed to solidifying our position as the most reliable AI memory solution provider through close collaboration with our global AI clientele.”

As hyperscalers and AI chip manufacturers race to expand their infrastructure, innovations like SOCAMM2 underscore how memory is evolving into a strategic differentiator in the AI era.

