AMD sold more than $1 billion worth of its MI300X AI chips during the second quarter of 2024.
Speaking to analysts on an earnings call after AMD posted its financial results for the quarter ending June 29, CEO Lisa Su said that sales of the company’s Nvidia-alternative chips had been “higher than expected.”
As a result of the stronger-than-anticipated sales, Su said the company now expected to see data center GPU revenue exceed $4.5bn in 2024, up from the $4bn she predicted when AMD posted its Q1 results in April.
Launched in early December 2023, the MI300-series is designed to train and run large language models. AMD claims the chips are the highest-performance accelerators in the world for generative AI.
In June, Microsoft announced it would be making AMD’s MI300X accelerators available to customers through the company’s Azure cloud computing service, with Scott Guthrie, EVP of Microsoft’s Cloud and AI group, describing the MI300X as the “most cost-effective GPU out there right now for Azure OpenAI.”
Data center revenue reached a “record” $2.8bn, up 115 percent year-on-year and accounting for almost half of AMD’s total sales for the quarter, which rose nine percent year-on-year to $5.8bn.
Su told analysts that the “significantly higher sales of our data center and client processors more than offset declines in gaming and embedded product sales,” with the gaming and embedded segments down 59 percent and 41 percent respectively during the quarter.
Looking ahead, Su reiterated her announcement at Computex earlier this year that AMD is accelerating its Instinct product roadmap to an annual cadence, and said the company would launch its MI325X chip later this year.
“MI325X leverages the same infrastructure as MI300 and extends our generative AI performance leadership by offering twice the memory capacity and 1.3 times more peak compute performance than competitive offerings,” Su said on the earnings call.
She added that AMD plans to follow the MI325X with the MI350 series. Due to be launched in 2025, it will be based on the new CDNA 4 architecture, which is on track to deliver a 35x increase in performance compared to CDNA 3.