SK Hynix’s HBM chip orders fully booked; 12-layer HBM3E in Q3: CEO

The chipmaker downplayed potential HBM oversupply, saying capacity expansion is in lockstep with demand growth

By Eui-Myung Park, May 02, 2024 (GMT+09:00)

5 Min read

SK Hynix CEO Kwak Noh-jung unveils its HBM chip development roadmap at a press conference at the company's headquarters on May 2, 2024

ICHEON, Gyeonggi Province – SK Hynix Inc., the world’s second-largest memory chipmaker after Samsung Electronics Co., said on Thursday its capacity to make high-bandwidth memory (HBM) chips is almost fully booked through next year, underscoring surging demand for semiconductors used in artificial intelligence devices.

“In terms of production, our HBM chips have already sold out for this year and are almost sold out for 2025,” SK Hynix Chief Executive Kwak Noh-jung said during a press conference at the chipmaker’s headquarters in Icheon, southeast of Seoul.

To solidify its HBM market leadership, he said, the company plans to provide samples of its next-generation 12-layer HBM3E chip to clients in May and begin mass production in the third quarter.

Just last week, SK Hynix said it plans to complete the development of next-generation 12-layer HBM3E chips by the end of the third quarter and produce them in large quantities from 2025.

Analysts said the South Korean chipmaker is advancing its HBM chip development roadmap to widen the gap with fast followers, including crosstown rival Samsung.

SK Hynix CEO Kwak Noh-jung (second from left) and company executives take questions during a press conference at SK Hynix's headquarters on May 2, 2024

Earlier this week, Kim Jae-june, Samsung’s memory business vice president, said the company had begun mass production of its 8-layer HBM3E chips for generative AI chipsets and that sales revenue from the chips would start coming in at the end of the second quarter.

The Samsung executive also said the company plans to start making the fifth-generation 12-layer version as early as the second quarter.

The SK Hynix CEO said demand for high-speed, high-capacity and low-power memory chips will “grow exponentially” in the coming years as the use of AI chips is rapidly spreading from data centers to on-device AI systems, including smartphones and autonomous vehicles.

“With the rapid increase in the number of parameters to improve AI performance and the expansion of AI service providers, the HBM market is set to grow at a rate unseen before,” he told reporters.

SK DISMISSES HBM OVERSUPPLY CONCERNS

The CEO said the chip industry’s investment in HBM differs from past practice because capacity is expanded in lockstep with demand, following consultations with customers on volume and chip specifications.

SK Hynix's HBM3E, the extended version of the HBM3 DRAM chip

“After HBM4, the manufacturing of HBM chips will be custom-tailored, meaning the risk of an industry oversupply will significantly decrease,” he said.

According to the company, AI memory chips, including HBM and high-capacity DRAM modules, are expected to account for 61% of the global memory market by value in 2028, up from 5% in 2023.

“We expect annual demand growth for HBM to average 60% over the mid to long term,” said an SK Hynix executive at the press conference.

In response to Samsung’s forecast that its accumulated HBM sales from 2016 to the end of this year could reach some $10 billion, the SK Hynix CEO said: “We cannot give exact figures, but we expect our HBM sales to come in that range or slightly above it.”

MR-MUF TECH FOR UP TO 16-LAYER HBM CHIPS

SK Hynix executives emphasized the importance of MR-MUF, an advanced chip packaging technology for HBM chips.

“There is an industry view that MR-MUF technology may have limitations in high-level stacking, but that is not the case,” said Choi Woo-jin, vice president and head of Packaging & Test (P&T) at SK Hynix.

SK Hynix's HBM3E AI chip supplied to Nvidia

“We are already mass-producing 12-layer HBM3 products with advanced MR-MUF technology. We expect the technology to be the most suitable to control chip bending and an excellent solution for stacking with a high-temperature, low-pressure method,” he said.

MR-MUF technology reduces chip stacking pressure by 6% compared with previous processes and cuts processing time, resulting in a fourfold rise in productivity and a 45% improvement in heat dissipation, according to SK Hynix.

“We’re on the right track to developing 16-layer HBM chips with our MR-MUF technology. We’re also proactively considering using hybrid bonding technology,” Choi said.

LEADING THE PACK

CEO Kwak said SK Hynix, which joined the SK Group in 2012, has been investing heavily in advancing its DRAM technology even when its rivals were drastically cutting such investment.

Although Samsung is the world’s top memory chipmaker, it lags behind crosstown rival SK Hynix, which has been leading the pack in the HBM DRAM segment for years.

Last week, SK Hynix said it swung to a profit in the first quarter with record quarterly sales, emerging as the latest chipmaker to confirm solid recovery in the memory semiconductor market.

A rendering of SK Hynix's M15X fab (Courtesy of Yonhap)

The company attributed its stronger-than-expected results to robust sales of its premium AI products, including HBM chips, as well as a rebound in NAND flash memory chips.

Among memory chipmakers, SK Hynix is the biggest beneficiary of the explosive increase in AI adoption: it dominates the production of HBM, which is critical for generative AI computing, and is the top HBM supplier to Nvidia Corp., which controls 80% of the AI chip market.

To hold its lead, SK Hynix early last month announced a $3.87 billion investment to build an advanced chip packaging plant in the US state of Indiana, which will house an HBM chip line as well as research and development facilities for AI products.

It also announced a collaboration with Taiwan Semiconductor Manufacturing Co. (TSMC), the world’s top contract chipmaker, to develop next-generation AI chips, called HBM4.

In late April, the company said it would spend 20 trillion won to build a new HBM DRAM plant at a site originally designated for a NAND facility in Korea to ride the AI wave.

Write to Eui-Myung Park at uimyung@hankyung.com


In-Soo Nam edited this article.