Korean innovators at CES 2024

Samsung to double HBM chip production to lead on-device AI chip era

The world’s No. 1 memory maker has put its latest advanced AI chip series on display at CES 2024

By Jeong-Soo Hwang | Jan 12, 2024 (GMT+09:00)

▲ Prepared for the AI Revolution! Explore Samsung Semiconductor's Exclusive CES 2024 Showcase!


LAS VEGAS – Samsung Electronics Co., the world’s largest memory chipmaker, plans to more than double its high bandwidth memory (HBM) chip production volume as it aims to take the lead in the artificial intelligence chip segment.

Han Jin-man, executive vice president responsible for Samsung's US semiconductor business, said on Thursday the company is pinning high hopes on high-capacity memory chips, including the HBM series, to lead the fast-growing AI chip segment.

“We will raise our HBM chip production volume by 2.5 times this year compared to last year’s output. The pace will continue with another twofold increase next year,” he told reporters during a media session at CES 2024.

“Memory chips will play the leading role in the AI era. Samsung will not be influenced by the industry’s ups and downs. We will steadily expand our investment in the growth sector,” he said.

Han is the highest-level Samsung executive to unveil the company’s HBM chip production plans for this year and next.

Han Jin-man, executive VP of Samsung's US semiconductor business, outlines the chipmaker's HBM chip business plans at CES 2024


The Suwon, South Korea-based chipmaker has been striving to enhance its presence in the HBM segment, in which it is falling behind its crosstown rival SK Hynix Inc.

HBM is a high-capacity, high-performance memory chip, demand for which is soaring as it is used to power generative AI services like ChatGPT, high-performance data centers and machine learning platforms.

The HBM series of DRAM is the talk of the town these days as electronics makers are unveiling products equipped with on-device AI technology, which enables customized and personalized AI functions on smartphones and other smart gadgets.

HBM3, one of the most advanced such chips currently available, is said to offer 12 times the capacity and 13 times the bandwidth of GDDR6, the latest graphics DRAM product.

Samsung's advanced DRAM chips

According to market tracker TrendForce, the global HBM market is forecast to grow to $8.9 billion by 2027 from an estimated $3.9 billion this year.

Last October, Samsung said it plans to unveil next-generation HBM4 chips in 2025 as part of its push to lead in the AI chip sector.

MEMORY-FOUNDRY TURNKEY SERVICE

Samsung said it aims to raise its HBM competitiveness by offering its clients a turnkey service, in which the company packages graphics processing units (GPUs) made by Samsung Foundry and HBM chips into a single chipset.

“We’re positively considering producing next-generation HBM chips not at the memory process but at the foundry process to maximize business efficiency as we do both memory and foundry,” Han said at CES 2024.

Samsung Electronics Chairman Jay Y. Lee (third from left) visits a Samsung chip packaging line in Korea on Feb. 17, 2023

Leading chipmakers such as foundry leader Taiwan Semiconductor Manufacturing Co. (TSMC) and Intel Corp. are fiercely competing in advanced packaging, which enhances chip performance without having to shrink circuits to ever-finer nanometer nodes through ultra-fine processing, a route that is technologically challenging and more time-consuming.

In November, industry sources said Samsung plans to unveil an advanced three-dimensional (3D) chip packaging technology in 2024 to better compete with its rivals.

SAMSUNG’S NEW AI MEMORY CHIPS AT CES 2024

At this year’s electronics show, Samsung is showcasing several of its latest memory chips, either under development or already being supplied to clients.

The HBM series of DRAM is the talk of the town at CES as they are in growing demand for use in generative AI devices

To meet growing demand from generative AI chip users, the company has put on display 12-nanometer 32-gigabyte double data rate 5 (DDR5) DRAM chips; Shinebolt, its HBM3E chip; and CMM-D, a Compute Express Link (CXL) DRAM module.

For on-device AI functions, Samsung is showcasing LPDDR5X-PIM, an advanced processing-in-memory DRAM chip that can process data on its own, much like a central processing unit (CPU).

Samsung's H-Cube chip packaging solution

The company is also showing off its 2.5D chip packaging technologies, the H-Cube and I-Cube series, at CES.

“From 2025, chip demand will exceed supply. Barring the unexpected, client orders will rise significantly,” Han said.

Write to Jeong-Soo Hwang at hjs@hankyung.com

In-Soo Nam edited this article.