Micron set to supply SOCAMM chips to Nvidia ahead of Samsung, SK Hynix
Micron has obtained Nvidia’s approval for the AI chip, while the Korean chipmakers have yet to get the nod
In a move that could reshape the global memory industry’s competitive dynamics, Nvidia Corp. has chosen US chipmaker Micron Technology Inc. as its first supplier of the next-generation memory solution known as SOCAMM.
SOCAMM, short for small outline compression attached memory module, is a new type of high-performance, low-power memory for AI servers in data centers.
The technology is closely watched in the industry, with some dubbing it the “second HBM,” or high-bandwidth memory, given its critical role in enabling AI acceleration.
According to people familiar with the matter on Tuesday, Nvidia commissioned all three major memory makers – Samsung Electronics Co., SK Hynix Inc. and Micron – to develop SOCAMM prototypes. And Micron was the first to receive Nvidia’s approval for mass production, leveraging its edge in low-power DRAM performance to outpace its larger rivals.

Two South Korean chipmakers, Samsung and SK Hynix, which are the world’s two largest memory manufacturers, have developed SOCAMM chips but have yet to obtain Nvidia’s certification, sources said.
A NEW BATTLEGROUND IN AI MEMORY
Unlike HBM, which is vertically stacked and tightly integrated with the graphics processing unit (GPU), SOCAMM supports the central processing unit (CPU) and plays a key supporting role in optimizing AI workloads.
The first SOCAMM modules, designed by Nvidia and based on stacked LPDDR5X chips, will feature in Nvidia’s upcoming AI accelerator platform, Rubin, slated for release next year.
SOCAMM employs wire bonding with copper interconnects to link 16 DRAM chips per module – a contrast to HBM’s through-silicon vias. The copper-based structure enhances heat dissipation, critical for performance and reliability in AI systems.

Micron claims its latest LPDDR5X chips are 20% more power-efficient than those of its competitors, a key factor in Nvidia’s decision.
Each AI server will house four SOCAMM modules – 256 DRAM chips in total – further amplifying the importance of thermal efficiency.
Analysts point to Micron’s late adoption of extreme ultraviolet (EUV) lithography as an unexpected advantage.
Unlike Samsung and SK Hynix, which adopted EUV to improve yields and densities, Micron focused on architectural innovation, enabling superior heat management, they said.
MICRON’S STRATEGIC COMEBACK
The mass production of SOCAMM chips signals a broader resurgence for Micron, which has historically lagged behind its Korean rivals in the advanced DRAM market.

Analysts said SOCAMM’s scalability means it could feature in a broader range of Nvidia products, including its upcoming personal supercomputer project, “DIGITS.”
This is not the only recent win for Micron.
The company is known to have supplied a leading share of LPDDR5X chips to Samsung for its Galaxy S25 series smartphones – a rare displacement of Samsung’s own semiconductor division.
Micron also supplied the initial batch of LPDDR5X chips for Apple’s iPhone 15 lineup, launched in 2023.
Micron’s enhanced capabilities in low-heat memory are also expected to boost its position in the fiercely competitive HBM segment.
As memory makers gear up for HBM4, which involves stacking 12 or even 16 DRAM layers, thermal management has become an increasingly critical differentiator.

“Micron may be a latecomer in HBM, but with superior heat control and US-based credibility, the company is well-positioned to make rapid gains,” said an executive at a semiconductor equipment supplier.
The Boise-based company is doubling down on its manufacturing footprint.
Micron has committed $14 billion in capital expenditures this year, including new HBM fabs in Singapore, Japan, Taiwan and New York State.
“Such aggressive investment suggests that Micron may already have secured long-term supply contracts from major hyperscalers,” said an industry executive.
Write to Eui-Myung Park at uimyung@hankyung.com
In-Soo Nam edited this article.