SK Hynix ships world’s first 12-layer HBM4 samples early

The South Korean high-bandwidth memory chip giant originally planned to ship the samples in early 2026

SK Hynix booth design at GTC 2025 in San Jose, CA (Courtesy of SK Hynix)
Chae-Yeon Kim
Mar 19, 2025 (GMT+09:00) why29@hankyung.com

South Korean memory chip giant SK Hynix Inc. has begun delivering samples of the world’s most advanced next-generation high-bandwidth memory chip, the 12-layer HBM4, to its key customers ahead of its original schedule, further widening its lead over rivals.

The HBM pioneer, a core supplier to artificial intelligence chip leader Nvidia Corp. and other AI chipmakers, announced on Wednesday that it had delivered the first HBM4 samples to its customers for quality evaluation six months earlier than its original timeline, which was early 2026.

SK Hynix said it plans to complete preparations for mass production of the 12-layer HBM4 within the second half of this year so that supply can begin immediately.

The faster-than-planned sample supply comes in response to mounting pressure from customers, especially Nvidia, for faster, more efficient memory for next-generation AI applications amid an intensifying AI race.

Last November, Nvidia CEO Jensen Huang asked SK Hynix to supply 12-layer HBM4 chips about six months before the Korean chipmaker’s original schedule.  

The sixth-generation high-performance memory chip is expected to be fitted into Vera Rubin, Nvidia’s next-generation GPU architecture and the successor to the Blackwell AI chip.

On Tuesday, Huang said at the Nvidia GPU Technology Conference (GTC) 2025 – dubbed the “Super Bowl of AI” – that Blackwell Ultra is slated for the second half of 2025, while the Rubin AI chip is expected to launch in late 2026.

Nvidia CEO Jensen Huang at GTC 2025 (Courtesy of Getty Images)

The HBM4 is also the first HBM chip developed in collaboration between SK Hynix and the world’s top foundry player, Taiwan Semiconductor Manufacturing Co. (TSMC).

Last December, the Korean chipmaker announced that it would produce the HBM4 chip on a 3-nanometer node.

It said it would adopt TSMC’s advanced logic process for the HBM4’s base die, which sits at the bottom of the HBM stack, connects the stack to a graphics processing unit and acts as its brain.

MOST ADVANCED HBM

SK Hynix’s latest innovation boasts the industry’s best capacity and speed, which are essential for AI processors that handle vast amounts of data, said the company.

Its 12-layer HBM4 is the industry’s first HBM capable of processing more than 2 terabytes (TB) of data per second. This means it can process data equivalent to more than 400 full-HD movies (5 GB each) in a second, more than 60% faster than the previous generation, the HBM3E, the company explained.

The Korean chipmaker also adopted its advanced mass reflow molded underfill (MR-MUF) process, which helps prevent chip warpage and maximizes product stability by improving heat dissipation, to achieve a capacity of 36 GB, the highest among 12-layer HBM products.

SK Hynix HBM4 (Courtesy of SK Hynix)

SK Hynix has long been a leader in HBM development, holding an estimated 50% share of the global HBM chip market.

It succeeded in mass-producing the 8- and 12-layer HBM3E in 2024, becoming the world’s first to supply the 12-high HBM3E to Nvidia.

AT GTC 2025

The Korean chipmaker will showcase its HBM and other memory products for AI data centers, as well as memory solutions for on-device and automotive applications, at GTC 2025, a global AI conference taking place March 17-21 in San Jose, CA.

It will present the 12-high HBM3E and SOCAMM2, or small outline compression attached memory module, for AI servers, as well as a model of the 12-high HBM4.

SK Hynix's HBM3E, the fifth-generation HBM chip, on display at an AI exhibition in October 2024

HBM technology, which stacks multiple dynamic random access memory (DRAM) chips vertically to accelerate data transfer speeds, has become a cornerstone of the AI revolution.

Nvidia, the dominant force in AI processing, has leaned heavily on HBM technology to enhance its high-end GPUs, which power large-scale AI models such as OpenAI’s ChatGPT.

Rivals Samsung Electronics Co. and Micron Technology Inc. are also working on next-generation HBM solutions, aiming to capture a share of the growing AI infrastructure market.

SK Hynix’s crosstown rival Samsung, in particular, recently outlined aggressive expansion plans for its HBM capacity, but it trails SK Hynix in both technological prowess and schedule.

Write to Chae-Yeon Kim at why29@hankyung.com


Sookyung Seo edited this article.
