Korean chipmakers

Samsung, SK Hynix: key beneficiaries of ChatGPT fever

High-performance DRAMs produced by the South Korean tech giants are in the spotlight as global tech firms eye ChatGPT-like services

Feb 13, 2023 (GMT+09:00)


A keyboard is seen reflected on a computer screen displaying the website of ChatGPT, an AI chatbot from OpenAI, in this picture taken Feb. 8, 2023 (Courtesy of Reuters/Yonhap)


Samsung Electronics Co. and SK Hynix Inc., the world’s two largest memory chipmakers, stand to benefit from the global popularity of ChatGPT. The chatbot — developed by Microsoft Corp.-backed OpenAI based on generative artificial intelligence technology — requires high-performance DRAMs.

The rush by tech companies worldwide to launch similar services is expected to bolster the global memory chip industry, which has faced weakening demand due to a global economic slowdown.

Samsung and SK Hynix are enjoying surging orders for high bandwidth memory (HBM) DRAMs, which substantially increase data processing speed compared with conventional models by stacking several DRAM dies vertically.

HBM products, which work with central processing units (CPUs) and graphics processing units (GPUs) to improve the learning and computational performance of servers, are installed on AI accelerators, supercomputers and high-performance servers.
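As a rough illustration of why the wide, stacked interface matters, here is a back-of-envelope comparison of peak memory bandwidth. The figures come from public JEDEC specifications, not from this article, and the calculation is a simplified sketch.

```python
# Back-of-envelope peak-bandwidth comparison.
# Figures are from public JEDEC specs (illustrative, not from the article):
#  - one HBM3 stack: 1024-bit interface at up to 6.4 Gb/s per pin
#  - one DDR5-6400 module: 64-bit interface at 6.4 Gb/s per pin
def peak_bandwidth_gbps(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * pin_rate_gbps / 8

hbm3_stack = peak_bandwidth_gbps(1024, 6.4)  # ~819 GB/s
ddr5_module = peak_bandwidth_gbps(64, 6.4)   # ~51 GB/s

print(f"HBM3 stack : {hbm3_stack:.1f} GB/s")
print(f"DDR5 module: {ddr5_module:.1f} GB/s")
print(f"Ratio      : {hbm3_stack / ddr5_module:.0f}x")
```

The wide interface, made possible by stacking dies over a silicon interposer, is what lets a single HBM stack feed a GPU an order of magnitude more data per second than a conventional memory module at the same per-pin speed.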

SK Hynix supplied its third-generation HBM DRAM to Nvidia Corp. for the graphics chipmaker’s A100 GPU used for ChatGPT, according to industry sources in Seoul Monday. Nvidia also installed SK Hynix’s fourth-generation HBM product on its H100, which the US company began supplying for ChatGPT servers.

Nvidia is known to have repeatedly asked SK Hynix to supply its latest HBM3, while the world’s top CPU maker, Intel Corp., reportedly focused on selling products equipped with HBM3.

To expand the market, Samsung developed a processing-in-memory (PIM)-enabled HBM DRAM that can not only store data but also perform calculations on it. In October, the company supplied the HBM-PIM for the AI accelerators of Advanced Micro Devices, Inc.

“Demand for HBM DRAM is expected to keep growing in line with the expansion of advanced AI services such as ChatGPT,” said Park Jae-Gun, a distinguished professor at Hanyang University’s Advanced Semiconductor Material/Device Development Center in Seoul.

Reflecting such optimism, Samsung and SK Hynix's share prices have risen 13.7% and 20.8%, respectively, so far this year on the Korean bourse, outperforming an average 9.7% gain on the main Kospi.

Samsung's HBM-PIM (Courtesy of Samsung)

LANDSCAPE CHANGE IN MEMORY CHIP SECTOR CAUSED BY CHATGPT

Until recently, HBM DRAMs received little attention as global tech companies focused on memory chips for PCs and smartphones. Due to complex production processes and advanced technology, the average price of the high-performance products is more than triple that of conventional DRAMs.

Demand for HBM DRAMs has surged recently as the latest AI services such as ChatGPT require speedy data processing and major tech firms line up to buy the high-performance products.

“The prices of HBM3 have soared to as much as five times that of the best-performing DRAMs,” said an industry source in Seoul. “The speed of the market's growth is more than double what Samsung and SK Hynix initially predicted.”

The memory chip market is likely to be reshaped in a few years around products specialized for AI, industry sources said.

“The era of competition over microprocessing technology development among memory chipmakers is over,” said another industry source. “It will become important to develop AI semiconductor technology that efficiently processes data and has computational processing capabilities.”

R&D RACE

Samsung and SK Hynix are racing to develop new products to attract customers to the market, which is still at an early stage.

SK Hynix, the global HBM leader, developed and mass-produced the world’s first HBM in 2013 with AMD. The Korean company has since released next-generation products such as HBM2, HBM2E and HBM3 to dominate the market with an estimated 60%-70% share.
SK Hynix's HBM3 (Courtesy of SK Hynix)

In September 2022, the memory chipmaker also unveiled a product solution incorporating PIM.

The high-value-added DRAM is expected to revive the global memory chip industry, which has been suffering from weak demand as consumers tighten spending on electronics.

Last month, Samsung said it had no plans to cut semiconductor production, as it expected demand for DRAM to recover thanks to the growing AI sector.

“Interactive AI services such as ChatGPT based on natural language processing technology are expected to have a positive impact on memory chip demand in the future,” said Kim Jae-joon, Samsung’s vice president for the memory chip business.

“A combination of a high-performance process capable of mass computation and a high-capacity memory is essential for learning and inference of models based on AI technology.”

Write to Jeong-Soo Hwang and Ji-Eun Jeong at hjs@hankyung.com
Jongwoo Cheon edited this article.