
Samsung establishes HBM team to up AI chip production yields

“HBM leadership is coming to us,” Kyung Kye-hyun, Samsung's semiconductor business chief says

By Mar 29, 2024 (GMT+09:00)


Samsung in February unveiled HBM3E 12H, the industry’s largest capacity HBM with a 12-layer stack (Courtesy of Samsung Electronics)

Samsung Electronics Co., the world’s No. 1 memory chipmaker, has set up a high bandwidth memory (HBM) team within its memory chip division to improve production yields as it develops HBM4, a sixth-generation AI memory, and the Mach-1 AI accelerator.

The new team will oversee HBM development and sales within the memory chip division, which handles DRAM and NAND flash memory, according to industry sources on March 29.

Hwang Sang-joon, corporate executive vice president and head of DRAM Product and Technology at Samsung, will lead the new team. It has not yet been decided how many employees will work for the division.

It is Samsung’s second HBM-dedicated organization, following the HBM task force the company launched in January this year with about 100 staff drawn from its device solutions division.

Samsung is ramping up efforts to overtake its local rival SK Hynix Inc., the dominant player in the advanced HBM segment. In 2019, Samsung disbanded its then-HBM team after concluding that the HBM market would not grow significantly, a painful mistake the company now regrets.

Hwang Sang-joon, corporate executive vice president and head of DRAM Product and Technology at Samsung

TWO-TRACK STRATEGY

To grab the lead in the AI chip market, Samsung will pursue a “two-track” strategy of simultaneously developing two types of cutting-edge memory chips: HBM and Mach-1.

It plans to mass-produce HBM3E in the second half of this year and produce its follow-up model HBM4 in 2025.

Currently, HBM3E is the best-performing DRAM for AI applications. It is a fifth-generation HBM, succeeding the previous generations: HBM, HBM2, HBM2E and HBM3.

“Customers who want to develop customized HBM4 will work with us,” Kyung Kye-hyun, head of Samsung's semiconductor business, said in a note posted on a social media platform on Friday.

“HBM leadership is coming to us thanks to the dedicated team’s efforts,” he added.

At Memcon 2024, a gathering of global chipmakers, held in San Jose, California on Tuesday, Samsung's Hwang said he expects the company to increase its HBM chip production volume by 2.9 times this year, compared to last year’s output.

Samsung Electronics' annual general meeting on March 20, 2024


HBM is a high-performance memory chip that stacks multiple DRAM dies vertically. It is an essential component of AI chips, which must process huge volumes of data.

According to Yole Group, a French IT research firm, the HBM market is forecast to expand to $19.9 billion in 2025 and $37.7 billion in 2029, from an estimated $14.1 billion in 2024.

MACH-1

Last week, Kyung said at the company's annual general meeting that the Mach-1 AI chip is under development and that Samsung plans to produce a prototype by year-end.

Mach-1 is in the form of a system-on-chip (SoC) that reduces the bottleneck between the graphics processing unit (GPU) and HBM chips.

Samsung is also preparing to develop Mach-2, the next-generation model of its inference-focused AI accelerator Mach-1.

“We need to accelerate the development of Mach-2, for which clients are showing strong interest,” Kyung said in the note on Friday.

Write to Jeong-Soo Hwang at hjs@hankyung.com
Yeonhee Kim edited this article. 