Tesla asks Samsung, SK Hynix to supply HBM4 chip samples
The US EV giant is expected to replace old HBM chips used in Dojo with the newer model under development
By Nov 19, 2024 (GMT+09:00)
Samsung Electronics Co. and SK Hynix Inc. are said to each be developing a sixth-generation high-bandwidth memory (HBM4) chip prototype for Tesla Inc., which has joined its US Big Tech peers in a race to develop its own artificial intelligence chips, according to semiconductor industry sources on Tuesday.
A slew of next-generation HBM chip orders received by the crosstown South Korean memory rivals suggests the AI-driven HBM boom will continue through next year.
Industry sources said that the US EV giant has asked the Korean chip duo to supply HBM4 chips for general use, and it is expected to choose one of the two companies as its HBM4 supplier after testing their samples.
The Korean chipmakers have been developing customized HBM4 chips for US Big Tech companies such as Google LLC, Meta Platforms Inc. and Microsoft Corp., seeking to lower their reliance on Nvidia Corp.'s AI chips.
Joining those Big Tech companies, Tesla is expected to use the next-generation HBM chip to enhance its AI chip capability.

Tesla operates Dojo, its custom-built supercomputer designed to train its “Full Self-Driving” neural networks. This is also expected to be the cornerstone of Tesla’s AI ambitions beyond self-driving.
HBM chips are one of the key parts in running the supercomputer to train AI models with massive datasets, and Tesla is expected to use the sixth-generation HBM chip in Dojo, also powered by its own AI chip D1.
The HBM4 chip could also be used in Tesla’s AI data centers under development and its self-driving cars, which are currently fitted with HBM2E chips for pilot programs.
WHY HBM4?
More advanced HBM chips can improve efficiency in processing massive data and AI model training.
The sixth-generation HBM chip is expected to significantly outperform its predecessors, which were built around a conventional base die that simply connects the bottom layer of an HBM stack to the graphics processing unit (GPU).

HBM4 instead places a logic die at the base of the die stack, making it a core component of the chip.
According to SK Hynix, the HBM4 chip delivers a bandwidth that is 1.4 times faster than that of the fifth-generation HBM3E and consumes about 30% less power.
Since HBM3E delivers a bandwidth of 1.18 terabytes (TB) per second, the HBM4’s bandwidth is expected to top 1.65 TB/s. The newer model’s drain power voltage (VDD) is also set to drop to 0.8 V from 1.1 V.
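The projected bandwidth follows directly from SK Hynix's claimed 1.4x speedup over HBM3E; a quick sanity check of the article's numbers (the final HBM4 specification may differ):

```python
# Back-of-the-envelope check of the HBM4 bandwidth figure cited above.
# Inputs are the article's numbers, not a finalized spec.
hbm3e_bandwidth_tbps = 1.18   # HBM3E bandwidth, TB/s
claimed_speedup = 1.4         # SK Hynix's stated HBM4 gain

hbm4_bandwidth_tbps = hbm3e_bandwidth_tbps * claimed_speedup
print(f"Projected HBM4 bandwidth: {hbm4_bandwidth_tbps:.2f} TB/s")
# Projected HBM4 bandwidth: 1.65 TB/s
```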
FIERCE HBM4 BATTLE
Samsung and SK Hynix, the world’s two biggest memory chipmakers, are going all-out to take the lead in the HBM4 market, poised to bloom later next year.
The HBM market is forecast to grow to $33 billion in 2027 from $4 billion in 2023, according to Morgan Stanley.
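That forecast implies a steep compound annual growth rate; a quick calculation, assuming the Morgan Stanley endpoints cited above:

```python
# Implied compound annual growth rate (CAGR) of the HBM market,
# using the Morgan Stanley forecast cited above ($4bn in 2023 -> $33bn in 2027).
start_usd_bn, end_usd_bn = 4, 33
years = 2027 - 2023

cagr = (end_usd_bn / start_usd_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.0%}")
# Implied CAGR: 69%
```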
The HBM market is currently led by SK Hynix, a major HBM chip supplier for the global AI chip giant Nvidia, which controls more than 90% of the global AI chip market.

To catch up to SK Hynix, its bigger memory rival Samsung Electronics has even formed a partnership with foundry archrival Taiwan Semiconductor Manufacturing Company Ltd. (TSMC), under which TSMC will manufacture base dies for Samsung's HBM4 chips at the request of Samsung's customers.
Samsung Electronics currently promotes a turnkey service for HBM chips, covering everything from memory architecture design to foundry production.
In July, the world’s top memory chipmaker said it will use its cutting-edge 4-nanometer (nm) foundry process to mass-produce the HBM4 chip.
Bagging an HBM4 order from Tesla after quality tests would allow Samsung to turn the tide in the global HBM market.
But SK Hynix is also expected to accelerate its HBM4 development to win orders from Tesla, which harbors lofty AI ambitions, a move that would cement its market leadership.
SK Hynix has also been actively developing automotive HBM chips, considered a key next-generation memory product.
Write to Chae-Yeon Kim at why29@hankyung.com
Sookyung Seo edited this article.