SK hynix Confirms HBM4 High-Bandwidth Memory Development Begins In 2024

SK hynix has confirmed in a blog post that the development of its next-generation HBM4 high-bandwidth memory commences in 2024.

SK hynix To Begin Development of Next-Gen HBM4 High-Bandwidth Memory In 2024, To Power The Next Era of Data Centers & AI

So far, Micron and Samsung have both listed their next-generation HBM4 memory products and confirmed that development is underway, highlighting a launch timeframe of around 2025-2026. With this latest confirmation, SK hynix has also revealed that it plans to begin development of its next generation of high-bandwidth memory in 2024.

Talking about its HBM products, Senior Manager Kim Wang-soo highlighted that the company will mass-produce its HBM3E solution, an enhanced variant of the existing HBM3 memory, in 2024. The new memory will offer increased speeds and capacities. In the same year, SK hynix also plans to begin development of its HBM4 memory, which will mark a major step in the continued evolution of the HBM product stack.

The competitive advantage will continue next year. GSM team leader Kim Wang-soo said, “With mass production and sales of HBM3E planned for next year, our market dominance will be maximized once again.” He added, “As development of HBM4, the follow-up product, is also scheduled to begin in earnest, SK hynix’s HBM will enter a new phase next year. It will be a year where we celebrate.”

via SK hynix

Now that development is planned for 2024, we can expect actual products utilizing such memory dies to be available by the end of 2025 or in 2026. A recent roadmap shared by Trendforce suggests that the first HBM4 samples will feature up to 36 GB of capacity per stack, with the full specification expected from JEDEC in the 2H 2024-2025 timeframe. The first customer sampling and availability are expected in 2026, so there is still a lot of time before we get to see the new high-bandwidth memory solutions in action.

Image Source: Trendforce

With 36 GB stacks, an eight-stack configuration can offer up to 288 GB of capacity, and even higher capacities are planned. HBM3E memory already tops out at 9.8 Gbps per pin, so we can expect HBM4 to be the first to break the double-digit 10 Gbps barrier. As for products, NVIDIA's Blackwell is expected to utilize HBM3E memory modules, so the first GPU to adopt HBM4 will likely be either Blackwell's successor (possibly codenamed Rubin, after Vera Rubin) or a mid-cycle upgrade of Blackwell, much like the Hopper H200 refresh that moved to HBM3E.
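The capacity and bandwidth figures above can be sanity-checked with some back-of-the-envelope math. The sketch below uses the numbers from the article (36 GB per stack, 9.8 Gbps per pin for HBM3E); the eight-stack package and the 1024-bit-per-stack interface of HBM3/HBM3E are assumptions used for illustration, not figures from the roadmap.

```python
# Quick HBM capacity/bandwidth math using the article's figures.
# Assumptions (illustrative): 8 stacks per package, and the 1024-bit
# per-stack interface used by HBM3/HBM3E-class memory.

def total_capacity_gb(gb_per_stack: float, stacks: int) -> float:
    """Total package capacity across all HBM stacks, in GB."""
    return gb_per_stack * stacks

def stack_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Per-stack bandwidth in GB/s: pin speed (Gb/s) * bus width / 8 bits per byte."""
    return pin_speed_gbps * bus_width_bits / 8

# 36 GB HBM4 stacks, eight of them, as noted above
print(total_capacity_gb(36, 8))         # 288 GB

# HBM3E at 9.8 Gbps per pin over an assumed 1024-bit interface
print(stack_bandwidth_gbs(9.8, 1024))   # ~1254.4 GB/s per stack
```

This makes it easy to see why the 10 Gbps-per-pin barrier matters: at the same bus width, every extra Gb/s per pin adds 128 GB/s of bandwidth per stack.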

NVIDIA Data Center / AI GPU Roadmap

GPU Codename | X      | Rubin | Blackwell | Hopper           | Ampere    | Volta | Pascal
GPU Family   | GX200  | GR100 | GB200     | GH200/GH100      | GA100     | GV100 | GP100
GPU SKU      | X100   | R100  | B100      | H100/H200        | A100      | V100  | P100
Memory       | HBM4e? | HBM4? | HBM3e     | HBM2e/HBM3/HBM3e | HBM2e     | HBM2  | HBM2
Launch       | 202X   | 2025  | 2024      | 2022-2024        | 2020-2022 | 2018  | 2016
