
Samsung Elec to launch HBM4 in 2025 to win war in AI sector

The Korean tech giant is developing the sixth-gen HBM model while planning to supply fifth-gen HBM3E samples

By Ik-Hwan Kim, Oct 10, 2023 (GMT+09:00)

2 min read

Hwang Sang-joon, executive vice president of DRAM product & technology at Samsung (Courtesy of Samsung)

Samsung Electronics Co., the world’s top memory chipmaker, aims to introduce its sixth-generation High Bandwidth Memory (HBM4) DRAM chips in 2025 to win the intensifying battle for dominance in the fast-growing artificial intelligence chip segment, a company executive said on Tuesday.

Hwang Sang-joon, executive vice president of DRAM product & technology at Samsung, said the company is developing the product while planning to supply samples of the fifth-generation HBM3E to customers.

HBM is a high-capacity, high-performance memory chip whose demand is soaring as it is used to power generative AI services such as ChatGPT, high-performance data centers and machine learning platforms.

Samsung has been in cut-throat competition against its smaller domestic rival SK Hynix Inc. for dominance in the sector.

“Samsung commercialized HBM for high-performance computing (HPC) in 2016 for the first time in the world,” Hwang said in a contribution to Samsung Newsroom, the company’s public relations website. “We pioneered the AI memory chip market while mass-producing second- to fourth-generation HBM products.”

The South Korean tech giant aims to lead the AI chip market with its semiconductor turnkey service, which includes foundry, memory chip supplies, advanced packaging and tests.

“We are providing customized turnkey services such as the cutting-edge 2.5D and 3D packaging along with HBM,” Hwang said. “We will offer the best service for the AI and HPC era.”

FUTURE MEMORY CHIP BUSINESS

Hwang unveiled a blueprint for the future memory chip business, including Compute Express Link (CXL) DRAM and processing-in-memory (PIM).

CXL is a next-generation interface that adds efficiency to accelerators, DRAM and storage devices used with central processing units (CPUs) in high-performance server systems. PIM is a DRAM that processes data on its own, like a CPU.

“Memory bottlenecks are fatal for devices that handle massive data, such as ChatGPT,” Hwang said. “The HBM-PIM we recently developed eased data bandwidth bottlenecks while improving performance by 12 times.”
(Captured from Samsung website)

The company is also working on building a PIM structure on CXL DRAM, he added.

Hwang also has high hopes for the low-power compression attached memory module (LPCAMM) unveiled last month. The LPCAMM, the industry’s first of its kind, is a more advanced memory form factor built on a package of multiple low-power double data rate (LPDDR) memory chips.

“The LPCAMM will help make laptops and other devices thinner as it needs up to 60% narrower space for installation than the existing products,” he said. “Its performance and power efficiency improved by as much as 50% and 70%, respectively. So, it will be used not only for laptops but also for data centers.”

Samsung plans to use process technologies below 10 nanometers, the inflection point of the global DRAM market, in the future, he said.

“We will provide memory products of ultra-high performance, ultra-high capacity and ultra-low power consumption that the world wants in the AI era.”

Write to Ik-Hwan Kim at lovepen@hankyung.com
 

Jongwoo Cheon edited this article.