
SK Hynix, TSMC tie up to stay ahead of Samsung for HBM supremacy

The collaboration between the two rivals comes as global chipmakers race to secure their places amid the AI boom

By Jeong-Soo Hwang, Apr 19, 2024 (GMT+09:00)


South Korea’s SK Hynix Inc. said on Friday it is partnering with Taiwan Semiconductor Manufacturing Co. (TSMC) to jointly develop next-generation chips for artificial intelligence as the two chipmakers push to strengthen their positions in the fast-growing AI chip market.

The collaboration between the two competitors is also aimed at solidifying their leadership in the high-bandwidth memory (HBM) segment against fast followers, including Samsung Electronics Co., analysts said.

Although Samsung is the world’s top memory chipmaker, it lags crosstown rival SK Hynix, which has been leading the pack in the HBM DRAM segment for years.

SK Hynix, the world's second-largest memory chipmaker, said in a statement it has signed a memorandum of understanding with TSMC, the world's largest foundry, or contract chip manufacturer, to collaborate on producing next-generation HBM chips.

SK Hynix's HBM3E, the extended version of the HBM3 DRAM chip

SK Hynix dominates the production of HBM, which is critical for generative AI computing, while the Taiwanese foundry player’s advanced packaging technology helps HBM chips and graphics processing units (GPUs) work together efficiently.

Through the initiative, SK Hynix said it plans to proceed with its development of HBM4, or the sixth generation of the HBM family, slated to be mass-produced in 2026.

"We expect a strong partnership with TSMC to help accelerate our efforts for open collaboration with our customers and develop the industry's best-performing HBM4," said Justin Kim, president and head of AI Infra at SK Hynix. "With this cooperation in place, we will strengthen our market leadership as the total AI memory provider further by beefing up competitiveness in the space of the custom memory platform."

The announcement comes as global chipmakers race to take advantage of the AI boom, which is driving demand for logic semiconductors such as processors. SK Hynix and TSMC are key suppliers to Nvidia Corp., the leader in the AI chip market.

TSMC is the world's top foundry player

INITIAL FOCUS ON BASE DIE

SK Hynix said the two companies will initially focus on improving the performance of the base die at the very bottom of the HBM package.

HBM is made by stacking a core DRAM die on top of a base die through processing technology called Through Silicon Via (TSV). The base die is connected to the GPU, which controls HBM chips.

SK Hynix said it has used its own proprietary technology to make base dies up to HBM3E, the fifth-generation HBM, but plans to adopt TSMC’s advanced logic process for HBM4’s base die so additional functionality can be packed into the limited space.

That will help SK Hynix produce customized HBM that meets a wide range of customer demands for performance and power efficiency, it said.

The structure of HBM DRAM chip packaging (Courtesy of AMD)

The two companies will also collaborate to optimize the integration of SK Hynix's HBM with TSMC's 2.5D packaging process, known as CoWoS, while cooperating on common customer requests related to HBM.

“Looking ahead to the next-generation HBM4, we’re confident that we will continue to work closely in delivering the best-integrated solutions to unlock new AI innovations for our common customers,” said Kevin Zhang, senior vice president of TSMC’s Business Development and Overseas Operations Office.

SAMSUNG TURNS UP THE HEAT

A laggard in the sought-after HBM chips, Samsung has vowed to spend heavily to develop next-generation AI chips.

Last month, Kyung Kye-hyun, head of Samsung's semiconductor business, said the company is developing a next-generation AI chip, Mach-1, and aims to unveil a prototype by year-end.


Samsung said in February it developed HBM3E 12H, the industry's first 12-stack HBM3E DRAM and the highest-capacity HBM product to date. Samsung said it will start mass production of the chip in the first half of this year.

HBM has become an essential part of the AI boom, as it offers much faster data processing speeds than traditional memory chips.

Nvidia said last month it is qualifying Samsung's HBM chips for use in its products.

Currently, SK Hynix, Samsung and Micron Technology Inc. are capable of providing HBM chips that can be paired with powerful GPUs, like Nvidia's H100 systems, used for AI computing.

Market tracker TrendForce estimates SK Hynix is likely to secure a 52.5% share of the global HBM market this year, followed by Samsung at 42.4% and Micron at 5.1%.

Earlier this month, SK Hynix said it would spend $3.87 billion to build advanced AI chip packaging and research and development facilities in Indiana, which would be the first of its kind in the US and the company’s first overseas HBM chip plant.

Write to Jeong-Soo Hwang at hjs@hankyung.com


In-Soo Nam edited this article.