HBM Industry Set To Double In Revenue By 2025, Courtesy of Next-Gen AI Chips By NVIDIA & Others

The HBM market, often called the backbone of AI computing, is set to double its revenue by 2025 as the industry moves toward next-gen chips.

AI Has Given the HBM Market New Hope, Catalyzing Immense Revenue Growth

High-bandwidth memory is a segment that has seen a tremendous rise in demand in recent times, especially with the advent of the AI hype, which drew companies such as Samsung and SK hynix deep into the business. Looking at the current landscape, Samsung, SK hynix, and Micron are the three "big" players in the HBM market, and they are expected to dominate in the future as well, courtesy of their ongoing development work, specifically on next-gen standards such as HBM4, which has received immense interest from the industry.

Seeing this, market researcher Gartner reports that the HBM market will reach a whopping US$4.976 billion by 2025, almost double the figure achieved in 2023. The estimate is based solely on current and anticipated industry demand, and no surprises here, since the key area where HBM sells the most is its application in AI GPUs. As reported multiple times in the past, the sudden rise in AI GPU demand created an HBM shortage in the markets, since HBM is a primary component of an AI accelerator.

There is also a huge sense of optimism about the future of HBM since, according to previous coverage, the industry is indeed shifting toward newer standards, with the likes of HBM3e and HBM4 set to see widespread adoption by manufacturers.

NVIDIA has a lot planned for its customers in 2024: the company has already announced the H200 Hopper GPU, which is expected to see mass adoption next year, followed by the introduction of the B100 "Blackwell" AI GPUs, both of which are based on HBM3e memory. The situation is similar in AMD's camp, with its next-gen AMD Instinct GPUs debuting with the newer HBM type.

HBM Memory Specifications Comparison

| | HBM1 | HBM2 | HBM2e | HBM3 | HBM3e | HBM4 |
|---|---|---|---|---|---|---|
| I/O (Bus Interface) | 1024 | 1024 | 1024 | 1024 | 1024-2048 | 1024-2048 |
| Prefetch (I/O) | 2 | 2 | 2 | 2 | 2 | 2 |
| Maximum Bandwidth | 128 GB/s | 256 GB/s | 460.8 GB/s | 819.2 GB/s | 1.2 TB/s | 1.5-2.0 TB/s |
| DRAM ICs Per Stack | 4 | 8 | 8 | 12 | 8-12 | 8-12 |
| Maximum Capacity | 4 GB | 8 GB | 16 GB | 24 GB | 24-36 GB | 36-64 GB |
| tRC | 48 ns | 45 ns | 45 ns | TBA | TBA | TBA |
| tCCD | 2 ns (=1 tCK) | 2 ns (=1 tCK) | 2 ns (=1 tCK) | TBA | TBA | TBA |
| VPP | External VPP | External VPP | External VPP | External VPP | External VPP | TBA |
| VDD | 1.2 V | 1.2 V | 1.2 V | TBA | TBA | TBA |
| Command Input | Dual Command | Dual Command | Dual Command | Dual Command | Dual Command | Dual Command |
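The maximum-bandwidth figures above follow directly from bus width times per-pin data rate. As a minimal sketch, assuming the commonly quoted per-pin rates for each generation (1.0 Gbps for HBM1, 2.0 for HBM2, 3.6 for HBM2e, 6.4 for HBM3, which are not listed in the table itself):

```python
# Per-stack HBM bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
# The pin rates below are assumed per-generation figures; the table above
# lists only the resulting bandwidth.

def hbm_stack_bandwidth(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Return peak bandwidth of one HBM stack in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

generations = {
    "HBM1":  (1024, 1.0),  # -> 128.0 GB/s
    "HBM2":  (1024, 2.0),  # -> 256.0 GB/s
    "HBM2e": (1024, 3.6),  # -> 460.8 GB/s
    "HBM3":  (1024, 6.4),  # -> 819.2 GB/s
}

for name, (width, rate) in generations.items():
    print(f"{name}: {hbm_stack_bandwidth(width, rate):.1f} GB/s")
```

The same arithmetic explains why HBM3e reaches roughly 1.2 TB/s: a 1024-bit stack at around 9.6 Gbps per pin yields 1228.8 GB/s.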

The memory industry has indeed seen a revival with the advent of AI in the markets, and if the indicators are anything to go by, things won't cool down anytime soon.

News Source: Ctee
