AMD’s Instinct Is To Compete With NVIDIA’s AI Ascendancy; Meet The MI300
A Silicon Interposer, Eight HBM3 Stacks And Four Sets Of GPU Or CPU Tiles
ServeTheHome has done their best to capture the live event featuring Dr. Lisa Su revealing AMD’s new Instinct MI300 AI chip. As the title says, the design is unique for AMD, packing 192GB of HBM3 onto the chunky 153B-transistor chip. The HBM3 is paired with CDNA3 GPU IP blocks or AMD Zen 4 CPU IP blocks, providing 5.2TB/s of memory bandwidth alongside an 896GB/s Infinity Fabric connection. That huge amount of memory and bandwidth should allow AMD to use fewer GPUs to accomplish the same tasks as NVIDIA’s H100, which is the product the MI300 is looking to compete with.
The next generation of Instinct will feature a model with three GPU tiles and one CPU tile, making it more flexible than the pure GPU or CPU chips of this generation; of course, multiple Instinct MI300s can be strung together to offer both today. AMD also announced that the new Instinct MI300 supports CXL Type-3 devices, something they have not done previously.
Just to be clear, there are only a few AI companies that have a realistic chance of putting a dent in NVIDIA’s AI market share in 2023. NVIDIA is facing very long lead times for its H100, and now its A100 GPUs as well. As a result, if you want NVIDIA hardware for AI and do not have an order in today, we would not expect deployment before 2024.
More Tech News From Around The Web
- Microsoft’s Azure mishap betrays an industry blind to a big problem @ The Register
- Hackers can steal cryptographic keys by video-recording power LEDs 60 feet away @ Ars Technica
- WordPress Stripe payment plugin bug leaks customer order details @ Bleeping Computer
- RDP honeypot targeted 3.5 million times in brute-force attacks @ Bleeping Computer
- Apple II – Now With ChatGPT @ Hackaday
- Clippy designer was too embarrassed to include him in his portfolio @ The Register
- Sihoo M57 Ergonomic Mesh Chair Review @ NikKTech