AMD Instinct MI300A APUs Power French “Adastra” Supercomputer, MI300 Expected To Ship 400,000 Units In 2024

Ahead of AMD’s Instinct MI300 launch at the “Advancing AI” event, the AI accelerators have already been adopted by the French supercomputer Adastra.

AMD Powers French Adastra Supercomputer With Instinct MI300A APUs Ahead of MI300 AI Accelerator Launch Event Today

The AMD Instinct MI300A APUs combine the CDNA 3 GPU architecture with a TCO-optimized memory capacity and Zen 4 CPU cores on a single package. This massively powerful chiplet design has allowed AMD to fulfill its promise of delivering an Exascale APU.

It is now reported that the Adastra supercomputer, located at CINES in Montpellier, France, is equipped with the latest HPE Cray EX4000 cabinet, which holds 14 HPE Cray EX255a blades comprising 28 nodes that house the AMD Instinct MI300A APUs. The whole system makes use of the HPE Slingshot 11 interconnect, offering 200 Gbps of connectivity per NIC.
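As a quick sanity check on the topology above, the blade-to-node math works out as follows. Note that the four-APUs-per-node figure is an assumption based on HPE's published EX255a blade design and is not stated in this article:

```python
# Adastra MI300A partition topology, using the figures reported above.
blades = 14            # HPE Cray EX255a blades in the EX4000 cabinet
nodes_per_blade = 2    # each EX255a blade carries two nodes
apus_per_node = 4      # assumption: HPE's EX255a spec lists 4x MI300A per node

nodes = blades * nodes_per_blade   # 28 nodes, matching the article
apus = nodes * apus_per_node       # 112 MI300A APUs under that assumption
```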

The Adastra supercomputer took 17th place on the Top500 list, courtesy of the computing power provided by Team Red’s Instinct APUs and 3rd Generation EPYC CPUs. The system posted an impressive 46.1 petaflops of peak performance in the November rankings.

In addition to the Adastra supercomputer, AMD is expected to become one of the major suppliers of AI chips in the coming year, as DigiTimes reports that the company will ship an estimated 300,000 to 400,000 units of its Instinct MI300 accelerators in 2024. The company also announced that Microsoft Azure will be the first cloud service to offer the MI300X AI accelerators, starting in Q1 2024.

AMD Instinct MI300A – Densely Packaged Exascale APUs Now A Reality

We have waited years for AMD to finally deliver on the promise of an Exascale-class APU, and that day has arrived with the launch of the Instinct MI300A. The packaging on the MI300A is very similar to the MI300X’s, except that it swaps in TCO-optimized memory capacities and Zen 4 cores.

AMD Instinct MI300A Accelerator.

One of the active dies has two CDNA 3 GCDs cut out and replaced with three Zen 4 CCDs, each of which brings its own pool of cache and core IP. With 8 cores and 16 threads per CCD, that makes a total of 24 cores and 48 threads on the active die. There is 24 MB of L2 cache (1 MB per core), plus a separate 32 MB pool of cache per CCD. It should be remembered that the CDNA 3 GCDs keep their L2 cache separate as well.
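The core and cache totals implied by that layout can be tallied directly (the 32 MB-per-CCD pool is AMD's usual Zen 4 L3 arrangement; its total across three CCDs is computed here, not quoted from the article):

```python
# MI300A CPU-side totals implied by the die layout described above.
ccds = 3                # three Zen 4 CCDs replace two CDNA 3 GCDs
cores_per_ccd = 8       # 8 cores / 16 threads per CCD
threads_per_core = 2    # SMT
l2_per_core_mb = 1      # 1 MB L2 per core
l3_per_ccd_mb = 32      # separate 32 MB cache pool per CCD

cores = ccds * cores_per_ccd           # 24 cores on the active die
threads = cores * threads_per_core     # 48 threads
l2_total_mb = cores * l2_per_core_mb   # 24 MB of L2, matching the article
l3_total_mb = ccds * l3_per_ccd_mb     # 96 MB across the three CCD pools
```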

Rounding up some of the highlighted features of the AMD Instinct MI300 Accelerators, we have:

  • First Integrated CPU+GPU Package
  • Aimed at the Exascale Supercomputer Market
  • AMD MI300A (Integrated CPU + GPU)
  • AMD MI300X (GPU Only)
  • 153 Billion Transistors
  • Up To 24 Zen 4 Cores
  • CDNA 3 GPU Architecture
  • Up To 192 GB HBM3 Memory
  • Up To 8 Chiplets + 8 Memory Stacks (5nm + 6nm process)

As far as industry applications go, the Adastra supercomputer is used mainly for R&D in domains such as climate research, astrophysics, materials science and chemistry, new energies including fusion, and biology and health. Operated under the French agency GENCI, Adastra has become a leading HPC system in Europe, and the inclusion of AMD’s new Instinct APUs is certainly a notable upgrade.

News Source: Datacenter Dynamics
