Micron set to supply SOCAMM chips to Nvidia ahead of Samsung, SK Hynix

Micron has obtained Nvidia’s approval for the AI chip, while the Korean chipmakers have yet to get the nod

Micron SOCAMM is a modular LPDDR5X memory solution for AI servers in data centers (Courtesy of Micron)
Eui-Myung Park
Jun 10, 2025 (GMT+09:00) uimyung@hankyung.com

In a move that could reshape the global memory industry’s competitive dynamics, Nvidia Corp. has chosen US chipmaker Micron Technology Inc. as its first supplier of the next-generation memory solution known as SOCAMM.

SOCAMM, short for small outline compression attached memory module, is a new type of high-performance, low-power memory for AI servers in data centers.

The technology is closely watched in the industry, with some dubbing it the “second HBM,” or high-bandwidth memory, given its critical role in enabling AI acceleration.

According to people familiar with the matter on Tuesday, Nvidia commissioned all three major memory makers – Samsung Electronics Co., SK Hynix Inc. and Micron – to develop SOCAMM prototypes. And Micron was the first to receive Nvidia’s approval for mass production, leveraging its edge in low-power DRAM performance to outpace its larger rivals.

Nvidia is the world's top AI chip designer

Two South Korean chipmakers, Samsung and SK Hynix, which are the world’s two largest memory manufacturers, have developed SOCAMM chips but have yet to obtain Nvidia’s certification, sources said.

A NEW BATTLEGROUND IN AI MEMORY

Unlike HBM, which is vertically stacked and tightly integrated with the graphics processing unit (GPU), SOCAMM supports the central processing unit (CPU) and plays a key supporting role in optimizing AI workloads.

The first SOCAMM modules, designed by Nvidia and based on stacked LPDDR5X chips, will feature in Nvidia’s upcoming AI accelerator platform, Rubin, slated for release next year.

SOCAMM employs wire bonding with copper interconnects to link 16 DRAM chips per module – a contrast to HBM’s through-silicon vias. The copper-based structure enhances heat dissipation, critical for performance and reliability in AI systems.

(Graphics by Dongbeom Yun)

Micron claims its latest LPDDR5X chips are 20% more power-efficient than those of its competitors, a key factor in Nvidia’s decision.

Each AI server will house four SOCAMM modules – 256 DRAM chips in total – further amplifying the importance of thermal efficiency.
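The two chip counts quoted above (16 per module, 256 per server) only reconcile if the server-level figure counts individual DRAM dies inside stacked packages. As a hedged illustration, assuming each of the 16 wire-bonded packages per module is a four-die stacked LPDDR5X part (an assumption for this sketch, not a figure stated by Nvidia or Micron), the arithmetic works out as:

```python
# Reconciling the per-module and per-server chip counts.
# Assumption (not from the article): each of the 16 DRAM packages
# on a SOCAMM module is a 4-die stacked LPDDR5X part, so the
# server-level "256 chips" counts individual dies.

packages_per_module = 16   # DRAM packages wire-bonded per module
dies_per_package = 4       # assumed LPDDR5X stack height
modules_per_server = 4     # SOCAMM modules per AI server

dies_per_server = packages_per_module * dies_per_package * modules_per_server
print(dies_per_server)  # 256
```

Under that assumption, four modules carry 64 packages and 256 dies in total, which is why thermal efficiency compounds at the server level.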

Analysts point to Micron’s late adoption of extreme ultraviolet (EUV) lithography as an unexpected advantage.

Unlike Samsung and SK Hynix, which used EUV to improve yields and densities, Micron focused on architectural innovation, which enabled superior heat management capabilities, they said.

MICRON’S STRATEGIC COMEBACK

The mass production of SOCAMM chips signals a broader resurgence for Micron, which has historically lagged behind its Korean rivals in the advanced DRAM market.

Nvidia CEO Jensen Huang discusses the company's new GPU released in 2024 (Courtesy of Yonhap)

Analysts said SOCAMM’s scalability means it could feature in a broader range of Nvidia products, including its upcoming personal supercomputer project, “DIGITS.”

This is not the only recent win for Micron.

The company is known to have supplied a leading share of LPDDR5X chips to Samsung for its Galaxy S25 series smartphones – a rare displacement of Samsung’s own semiconductor division.

In 2023, Micron also supplied the initial batch of LPDDR5X chips for Apple’s iPhone 15 lineup.

Micron’s enhanced capabilities in low-heat memory are also expected to boost its position in the fiercely competitive HBM segment.

As memory makers gear up for HBM4, which involves stacking 12 or even 16 DRAM layers, thermal management has become an increasingly critical differentiator.

Micron HBM3E 12H 36GB is now designed into the NVIDIA HGX B300 NVL16 and GB300 NVL72 platforms (Courtesy of Micron)

“Micron may be a latecomer in HBM, but with superior heat control and US-based credibility, the company is well-positioned to make rapid gains,” said an executive at a semiconductor equipment supplier.

The Boise-based company is doubling down on its manufacturing footprint.

Micron has committed $14 billion in capital expenditures this year, including new HBM fabs in Singapore, Japan, Taiwan and New York State.

“Such aggressive investment suggests that Micron may already have secured long-term supply contracts from major hyperscalers,” said an industry executive.

Write to Eui-Myung Park at uimyung@hankyung.com

In-Soo Nam edited this article.
