AMD outlined its AI roadmap on its recent earnings call, saying that the Instinct AI GPU lineup is set to generate “tens of billions” of dollars over the coming years.
AMD Plans To Generate “Tens of Billions” In Revenue Through Its Datacenter Business; Instinct AI Lineup To Scale Up Massively
Team Red’s journey in the AI business hasn’t been a particularly impressive one: the firm not only failed to capitalize on the hype early on, but its Instinct AI accelerator lineup also hasn’t seen sales traction comparable to that of its competitor NVIDIA. However, the future does look bright for the company, and on the recent earnings call, the firm gave an update on its Instinct lineup, revealing progress on the Instinct MI325X and MI350 AI accelerators, along with a glimpse of the future MI400 series.
Starting with the MI300 series, AMD’s MI300X has seen strong market adoption, with Team Red claiming deployments by the likes of Microsoft, IBM, and Meta, while the MI300 family also powers large-scale AI and HPC installations such as the El Capitan supercomputer. Interestingly, AMD has decided to pull volume production of its next-gen MI350 AI accelerator forward to mid-2025, from an initial H2 2025 schedule. This move comes in light of Intel’s cancellation of Falcon Shores and AMD’s optimism about the AI market.
The customer feedback on MI350 series has been strong, driving deeper and broader customer engagements with both existing and net new hyperscale customers in preparation for at-scale MI350 deployments.
Based on early silicon progress and the strong customer interest in the MI350 series, we now plan to sample lead customers this quarter and are on track to accelerate production shipments to mid-year.
– AMD
This is indeed interesting, given that AMD’s MI350 will be built on a 3nm process node, offer up to 288 GB of HBM3E memory, and compete directly with NVIDIA’s Blackwell lineup. To top it all off, the MI350 is said to feature AMD’s next-gen “CDNA 4” architecture, which should show how much generational progress AMD has made between its AI architectures and give a clearer picture of the company’s future plans.

Another interesting mention concerns AMD’s next-gen MI400 AI lineup, which is said to employ the “CDNA-next” architecture. Team Red has confirmed that the lineup is slated to launch in 2026 and is expected to offer competitive performance, likely pairing the new architecture with HBM4 memory and more. Overall, AMD is bullish on its AI business, with the firm expecting revenue to grow massively over the coming years.
We believe this places AMD on a steep long-term growth trajectory, led by the rapid scaling of our data center AI franchise from more than $5 billion in revenue in 2024 to tens of billions of dollars of annual revenue over the coming years.
– AMD
It will be interesting to see how the future pans out for AMD, since the company appears determined to establish a strong foothold in the AI market.