AMD posted a robust fourth quarter of 2024 that included record quarterly revenue, with record-high sales in both the data center and client computing businesses. Yet all those records did little to dispel concerns about future AI market uncertainty and AMD's status as a challenger to Nvidia, as several analysts responded to the quarter by lowering their price targets on AMD stock.
The company’s overall revenue for Q4 ‘24 came in at about $7.7 billion, representing 24% year-over-year growth and 12% growth from the previous quarter. About $3.86 billion of that came from the data center market, where segment revenue grew 69% year-over-year and 9% sequentially. Client segment revenue hit $2.3 billion in Q4, up 58% year-over-year and 23% sequentially. Of AMD’s two smaller business segments, gaming was the downer of the quarter, with revenue slipping 59% year-over-year to $563 million, though it was up about 22% sequentially. Meanwhile, embedded segment revenue was down 13% year-over-year to $923 million.
For the full year of 2024, AMD revenue reached $25.8 billion, about 14% higher than in 2023, with the data center segment accounting for roughly half of that total. Much of that growth came from new AI chips, as AMD President and CEO Lisa Su noted on the Q4 earnings call that the company “delivered greater than $5 billion of data center AI revenue for the year.”
Remarking on the progress of AMD’s Instinct AI chip family, she added, “Looking at the fourth quarter, MI300X production deployments expanded with our largest cloud partners. Meta exclusively used MI300X to serve its Llama 405B frontier model on Meta.AI, and added our Instinct GPUs to its OCP-compliant Grand Teton platform, which is designed for deep learning recommendation models and large-scale AI inferencing workloads. Microsoft is using MI300X to power multiple GPT 4-based Copilot services and launched flagship instances that scale up to thousands of GPUs for AI training and inference and HPC workloads.”
Su also noted that IBM said late in 2024 that it would make the MI300X available on its watsonx AI platform this year. “Instinct platforms are currently being deployed across more than a dozen CSPs globally, and we expect this number to grow in 2025,” she said.
The Instinct MI325X is in volume production, and looking further out, she added, “Based on early silicon progress and the strong customer interest in the MI350 series, we now plan to sample to our lead customers this quarter, and we are on track to accelerate production shipments to mid-year. As we look forward to our multi-year road map, I'm excited to share that MI400 series development is also progressing very well.”
The timelines Su mentioned are slightly accelerated from previous goals, and she added, regarding future revenue projections, “We see this business growing to tens of billions [of dollars in revenue], as we go through the next couple of years.”
AMD CFO Jean Hu said on the earnings call that after the strong fourth quarter of 2024, the first quarter of 2025 may not compare favorably. “We expect revenue [for Q1 2025] to be approximately $7.1 billion, plus or minus $300 million, up 30% year over year, driven by strong growth in our data center and the client business.” The company’s gaming business is still expected to see a “significant decline” in Q1, while the embedded business is expected to decline more modestly, she said, adding that overall, “We expect revenue to be down sequentially approximately 7%, driven primarily by seasonality across our businesses.”
Analysts are not thrilled with AMD's progress amid broader unease around AI
Despite the strong Q4 and positive data center vibes, analysts continue to be unenthused by AMD’s progress and plans. To take just one of several examples, Citi lowered its AMD stock price target from $175 to $110, below the level at which the stock closed on Wednesday, just before AMD reported its earnings. By midday Thursday, the price had plunged below that level to about $107 per share.
Part of the miasma around AMD has to do with its ongoing status as a distant AI runner-up to Nvidia, but there has also been a growing sense of unease around the AI sector in general and the prospect of slowing growth. Those concerns were recently amplified by the emergence of China’s DeepSeek and the methods it used to reduce the number of GPUs and the amount of power needed for AI processing.
DeepSeek’s potential effect has been widely downplayed in recent days, and major players like AMD and Nvidia have quickly moved to show how well their products run DeepSeek’s open-source R1 model. Su cast the DeepSeek news in a positive light for AMD, saying on the earnings call, “We think that innovation on the models and the algorithms is good for AI adoption. The fact that there are new ways to bring about training and inference capabilities with less infrastructure actually is a good thing because it allows us to continue to deploy AI compute and broader application space and more adoption. I think from our standpoint, we also like very much the fact that... we are big believers in open source.”