AMD’s Instinct MI300X AI GPU Is Causing “Headaches” For Competitors As It Receives Massive Interest

AMD’s recently released Instinct MI300X accelerators have taken the AI market by storm, causing “headaches” for competitors thanks to their appeal across the industry.

AMD’s Instinct MI300X AI GPU Wipes Out NVIDIA’s Counterpart In Performance & Potentially In Market Adoption As Well

AMD’s Instinct MI300X AI GPU launched almost a month ago, and in extensive benchmarking done by Team Red itself, it comfortably outperformed its competitors. The Instinct MI300X accelerator is built on the CDNA 3 architecture and packs a lot under the hood: it combines a mix of 5nm and 6nm chiplets to deliver up to 153 billion transistors. Memory is another area that has seen a huge upgrade, with the MI300X boasting 192 GB of HBM3, 50% more capacity than its predecessor, the MI250X (128 GB). Here is how it compares to NVIDIA’s H100:

  • 2.4X Higher Memory Capacity
  • 1.6X Higher Memory Bandwidth
  • 1.3X FP8 TFLOPS
  • 1.3X FP16 TFLOPS
  • Up To 20% Faster Vs H100 (Llama 2 70B) In 1v1 Comparison
  • Up To 20% Faster Vs H100 (FlashAttention 2) in 1v1 Comparison
  • Up To 40% Faster Vs H100 (Llama 2 70B) in 8v8 Server
  • Up To 60% Faster Vs H100 (Bloom 176B) In 8v8 Server

Taiwan Economic Daily reports that AMD’s Instinct MI300X AI GPU has begun shipping to customers, with the likes of LaminiAI, an AMD-backed AI startup, confirming that it has integrated the new accelerators into its compute portfolio.

Sources within the supply chain say that AMD’s new Instinct GPU has managed to grab the spotlight, drawing interest away from competitors. Not only are the MI300X’s performance gains substantial, but AMD has also timed its release well: NVIDIA is currently weighed down by order backlogs, which has hindered its ability to take on new clients.

The AI market is expected to grow rapidly over the coming years, with AMD CEO Lisa Su calling it the next big thing in the tech industry. AMD believes the AI accelerator segment has the potential to reach a whopping $400 billion by 2027, and the firm has positioned itself to capitalize on this enormous growth. Similarly, NVIDIA CEO Jensen Huang labeled 2023 the “first year of AI,” claiming there is much more to come as AI computing expands into new domains such as mobile and the automotive industry.

The future is indeed interesting, not just for the companies involved in the AI race but also for the client and consumer markets growing alongside it. With time, however, the competition will only get fiercer, and 2024 won’t give NVIDIA an “easy pass” to the AI gold mine, since this time, Team Red has come back stronger.

News Source: Taiwan Economic Daily
