SK Hynix provides samples of best-performing HBM3E chip to Nvidia
Boasting the industry’s fastest data processing speed, the chip is suitable for various AI applications
Aug 21, 2023 (GMT+09:00)
SK Hynix Inc., the world’s second-largest memory chipmaker after Samsung Electronics Co., said on Monday it has developed HBM3E, the industry’s best-performing DRAM chip for artificial intelligence applications, and has provided samples to its client Nvidia Corp. for performance evaluation.
HBM3E, the extended version of HBM3, or high bandwidth memory 3, is the fifth generation of HBM DRAM, succeeding the previous generations – HBM, HBM2, HBM2E and HBM3.
HBM is high-value, high-performance memory that vertically interconnects multiple DRAM chips, dramatically increasing data processing speed compared with earlier DRAM products.
SK Hynix said it plans to mass-produce the latest DRAM chip in the first half of next year to solidify its “unrivaled leadership in the AI memory market.”
Industry sources said Nvidia will likely use SK Hynix's HBM3E in its next-generation AI accelerator, the GH200, due out later next year.

The South Korean chipmaker said the latest product not only meets the industry’s highest standards of speed – the key specification for AI memory products – but also offers better performance than rival products in terms of capacity, heat dissipation and user-friendliness.
The HBM3E chip can process data at up to 1.15 terabytes (TB) per second, equivalent to processing more than 230 full-HD movies of 5 gigabytes (GB) each in a single second.
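As a rough check of that equivalence, treating 1 TB as 1,000 GB: 1.15 TB per second is 1,150 GB per second, and 1,150 GB divided by 5 GB per movie works out to 230 movies per second.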
SK Hynix said the product comes with a 10% improvement in heat dissipation by adopting a technology called advanced mass reflow molded underfill (MR-MUF).
The latest chip also provides backward compatibility, allowing it to be adopted on systems designed for HBM3 chips without design or structural modification.
“We have a long history of working with SK Hynix on high bandwidth memory for leading-edge accelerated computing solutions,” said Ian Buck, vice president of Hyperscale and HPC Computing at Nvidia. “We look forward to continuing our collaboration with HBM3E to deliver the next generation of AI computing.”

SK HYNIX COMPETES WITH SAMSUNG FOR HBM3 CHIPS
The HBM series of DRAM is in growing demand as the chips power generative AI devices that operate on high-performance computing systems.
Such chips are used in high-performance data centers as well as machine learning platforms that boost AI and supercomputing performance.
“By increasing the supply share of the high-value HBM products, SK Hynix will also seek a fast business turnaround,” said Ryu Sung-soo, head of DRAM Product Planning at SK Hynix.
SK Hynix was the first memory vendor to mass-produce HBM3, starting in June 2022.

The company said at the time it would supply the HBM3 DRAM to US graphics chipmaker Nvidia Corp. for its H100 graphics processing unit.
Earlier this month, Samsung, the world’s top memory maker, said it is providing its HBM3 chip samples and chip packaging service to Nvidia for the US company’s graphics processing units.
Samsung is said to be unveiling its fifth-generation HBM3P, under the product name Snowbolt, by the end of this year, followed by its sixth-generation HBM product next year.
According to market research firm TrendForce, the global HBM market is forecast to grow to $8.9 billion by 2027 from $3.9 billion this year. HBM chips are at least five times more expensive than commoditized DRAM chips.
(Updated with a source comment on the possibility of Nvidia using the HBM3E chip in its next-generation AI accelerator and HBM market growth forecasts)
Write to Jeong-Soo Hwang at hjs@hankyung.com
In-Soo Nam edited this article.