
SK Hynix provides samples of best-performing HBM3E chip to Nvidia

Boasting the industry’s fastest data processing speed, the chip is suitable for various AI applications

By Jeong-Soo Hwang
Aug 21, 2023 (GMT+09:00)


SK Hynix's HBM3E DRAM chip

SK Hynix Inc., the world’s second-largest memory chipmaker after Samsung Electronics Co., said on Monday it has developed HBM3E, the industry’s best-performing DRAM chip for artificial intelligence applications, and has provided samples to its client Nvidia Corp. for performance evaluation.

HBM3E, the extended version of HBM3, or high bandwidth memory 3, is a fifth-generation HBM product, succeeding the previous generations – HBM, HBM2, HBM2E and HBM3.

HBM is high-value, high-performance memory that vertically interconnects multiple DRAM chips, dramatically increasing data processing speed compared with earlier DRAM products.

SK Hynix said it plans to mass-produce the latest DRAM chip in the first half of next year to solidify its “unrivaled leadership in the AI memory market.”

Industry sources said Nvidia will likely use SK Hynix's HBM3E in its next-generation AI accelerator, the GH200, due out later next year.

Nvidia’s advanced graphics chips

The South Korean chipmaker said the latest product not only meets the industry’s highest standards of speed – the key specification for AI memory products – but also offers better performance than rival products in terms of capacity, heat dissipation and user-friendliness.

The HBM3E chip can process data at up to 1.15 terabytes (TB) a second, equivalent to processing more than 230 full-HD movies of 5 gigabytes (GB) each in a single second.
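As a quick sanity check on that claim, the arithmetic works out as follows (a minimal sketch in Python; the 1.15 TB/s bandwidth and 5 GB movie size are the article's own figures, while the use of decimal units, 1 TB = 1,000 GB, is an assumption):

```python
# Sanity check of the HBM3E throughput claim: 1.15 TB/s vs. 5 GB movies.
# Both figures come from the article; decimal units (1 TB = 1,000 GB)
# are assumed here.
bandwidth_tb_per_s = 1.15
movie_size_gb = 5

movies_per_second = bandwidth_tb_per_s * 1_000 / movie_size_gb
print(f"{movies_per_second:.0f} full-HD movies per second")  # -> 230
```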

SK Hynix said the product improves heat dissipation by 10% by adopting a technology called advanced mass reflow molded underfill (MR-MUF).

The latest chip is also backward compatible, allowing it to be adopted in systems built for HBM3 chips without any design or structural modifications.

“We have a long history of working with SK Hynix on high bandwidth memory for leading-edge accelerated computing solutions,” said Ian Buck, vice president of Hyperscale and HPC Computing at Nvidia. “We look forward to continuing our collaboration with HBM3E to deliver the next generation of AI computing.”

SK Hynix's HBM3E DRAM chip

SK HYNIX COMPETES WITH SAMSUNG FOR HBM3 CHIPS

The HBM series of DRAM is in growing demand as the chips power generative AI devices that operate on high-performance computing systems.

Such chips are used in high-performance data centers as well as in machine learning platforms that boost AI and supercomputing performance.

“By increasing the supply share of the high-value HBM products, SK Hynix will also seek a fast business turnaround,” said Ryu Sung-soo, head of DRAM Product Planning at SK Hynix.

SK Hynix was the first memory vendor to mass-produce HBM3 chips, beginning in June 2022.

SK Hynix developed the industry's first HBM3 DRAM chip in 2022

The company said at the time it would supply the HBM3 DRAM to US graphics chipmaker Nvidia Corp. for its H100 graphics processing unit.

Earlier this month, Samsung, the world’s top memory maker, said it is providing its HBM3 chip samples and chip packaging service to Nvidia for the US company’s graphics processing units.

Samsung is said to be preparing to unveil its fifth-generation HBM3P, under the product name Snowbolt, by the end of this year, followed by its sixth-generation HBM product next year.

According to market research firm TrendForce, the global HBM market is forecast to grow to $8.9 billion by 2027 from $3.9 billion this year. HBM chips are at least five times more expensive than commoditized DRAM chips.
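Those figures imply roughly a 23% compound annual growth rate; a quick check (assuming "this year" means 2023, as the article implies, giving a four-year horizon):

```python
# Implied compound annual growth rate (CAGR) of the global HBM market,
# from the TrendForce figures cited in the article.
# Assumption: the base year is 2023, so the horizon to 2027 is 4 years.
market_2023_bn = 3.9   # market size this year, USD billions
market_2027_bn = 8.9   # forecast for 2027, USD billions
years = 2027 - 2023

cagr = (market_2027_bn / market_2023_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # -> about 22.9%
```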

(Updated with a source comment on the possibility of Nvidia using the HBM3 chip in its next-generation AI accelerator and HBM market growth forecasts)

Write to Jeong-Soo Hwang at hjs@hankyung.com

In-Soo Nam edited this article.