AMD Q2 2024 Financial Results Highlight Strong Momentum in Data Center and PC Segments

The News: AMD (NASDAQ:AMD) announced revenue for the second quarter (Q2) of 2024 of $5.8 billion, with a gross margin of 49%, operating income of $269 million, net income of $265 million, and diluted earnings per share of $0.16. On a non-GAAP basis, gross margin was 53%, operating income was $1.3 billion, net income was $1.1 billion, and diluted earnings per share was $0.69. “AMD executed well in the second quarter, with revenue above the midpoint of our guidance driven by strong growth in the Data Center and Client segments,” said AMD EVP, CFO and Treasurer Jean Hu, who also highlighted gross margin expansion alongside increased strategic AI investments, laying a broad foundation for growth. Read the full press release here.

Analyst Take: As expected, this was a strong quarter for AMD, which was quick to understand the opportunity that AI disruption was likely to deliver and took the initiative early enough in the cycle to take full advantage of it. The company beat consensus estimates for Q2, with record sales of data center processors and a strong sales forecast that includes AI-enabled solutions for the client (PC) segment delivering significant uplift.

“We delivered strong revenue and earnings growth in the second quarter driven by record Data Center segment revenue,” explained AMD Chair and CEO Dr. Lisa Su. “Our AI business continued accelerating and we are well positioned to deliver strong revenue growth in the second half of the year led by demand for Instinct, EPYC and Ryzen processors.”

Su also added that the velocity of advances in generative AI continues to drive demand for additional compute capacity and capabilities across virtually all core segments that depend on semiconductors, which AMD naturally treats as a massive opportunity and revenue funnel as it continues to deliver the very AI solutions that address this demand.

Overview of AMD’s Q2 2024 Numbers by Segment

  • Overall: $5.8 billion in revenue (up 9% YoY), gross margin of 49%, operating income of $269 million, net income of $265 million and diluted earnings per share of $0.16. (Non-GAAP numbers: gross margin at 53%, operating income at $1.3 billion, net income at $1.1 billion and diluted earnings per share at $0.69.)
  • Data Center: Record segment revenue of $2.8 billion (up 115% YoY), primarily on account of a marked acceleration of AMD Instinct GPU shipments and strong growth in 4th Gen AMD EPYC CPU sales. Revenue also increased 21% sequentially, driven primarily by the strong ramp of AMD Instinct GPU shipments.
  • Client: Segment revenue was $1.5 billion (up 49% YoY and 9% QoQ), mostly attributable to strong sales of AMD Ryzen processors. We attribute this change to strong demand for AI PCs, particularly following the announcement of powerful new Windows Copilot+ PCs entering the market this year.
  • Gaming: Segment revenue was $648 million (down 59% YoY and 30% QoQ), which reflects a decline in semi-custom revenue.
  • Embedded: Segment revenue landed at a healthy $861 million (down 41% YoY), but this was expected as customer inventories continue to normalize. Despite the YoY drop, revenue increased 2% QoQ. (The quick sketch after this list shows how these segment figures roll up into the overall revenue mix.)
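For readers who want to sanity-check how these segment figures roll up, here is a minimal Python sketch using only the rounded revenue numbers quoted above (AMD’s exact reported figures differ slightly), computing each segment’s share of the quarter’s total:

```python
# Quick sanity check of AMD's Q2 2024 segment mix, using the rounded
# revenue figures quoted in this article (in billions of USD).
# Exact reported figures differ slightly, so treat the output as approximate.

segments = {
    "Data Center": 2.800,
    "Client": 1.500,
    "Gaming": 0.648,
    "Embedded": 0.861,
}

total = sum(segments.values())  # ~$5.8B, consistent with the reported top line
print(f"Total (sum of segments): ${total:.2f}B")

for name, revenue in segments.items():
    share = revenue / total * 100
    print(f"{name:>12}: ${revenue:.3f}B ({share:.1f}% of revenue)")
```

Run as written, the sketch shows Data Center contributing roughly 48% of the quarter’s revenue, which is why that segment dominates the rest of this analysis.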

Additional Context on AMD’s Momentum

Two quick observations: The first is AMD’s velocity – its ability to address market needs and respond to competitive threats (and opportunities) quickly, as evidenced by the speed with which some of its best-performing IP in critical segments is already ramping up. The second is AMD’s momentum, or rather its speed to momentum, a point I feel may not be given enough attention. Here is a bit more context about why I feel it is important:

1) At Computex 2024 in June, AMD unveiled its expanded Instinct accelerator roadmap, which clarified the company’s annual cadence of key AI solution releases. It highlighted the new AMD Instinct MI325X accelerator’s expected availability in Q4 2024 (with impressive memory capacity and compute performance) and its next-gen CDNA 4 architecture (expected in 2025), which could bring up to a 35x increase in AI inference performance compared to AMD Instinct accelerators based on CDNA 3. Lots of excitement around this.

2) AMD also announced its new Ryzen AI 300 Series processors (the company’s third-generation processors for AI PCs) with industry-leading 50 TOPS of AI processing power for Windows Copilot+ PCs. PC OEMs including Acer, ASUS, HP, Lenovo and MSI unveiled impressive new devices powered by the Ryzen AI 300 Series, which will be folded into the Copilot+ ecosystem (likely sometime this fall/winter). I cannot stress enough how important this class of processor will be to AMD’s opportunity in the PC segment as it enters its next supercycle this year, particularly against Intel, which has struggled to demonstrate the same speed to market thus far. (More on that in a moment.)

3) AMD’s Radeon PRO W7900 Dual Slot GPU launch also caught my eye: This is a GPU for high-performance AI workstations, with AMD’s expanded ROCm 6.1.3 software support “to enhance AI development and deployment with select AMD Radeon desktop GPUs.” AMD also launched new client and graphics offerings: the new Ryzen 9000 Series processors based on its Zen 5 architecture (for gaming, productivity and content creation), and the Ryzen PRO 8040 Series and 8000 Series enterprise mobile and desktop processors. This consistent positioning against NVIDIA in the GPU segment, which could catch a tailwind if AMD’s push into the AI PC space is as successful over the next 12-18 months as I believe it might be, also appears to be contributing to the company’s overall momentum.

4) Cloud providers also showcased offerings powered by AMD Instinct MI300X accelerators, with Microsoft announcing the general availability of new Azure ND MI300X V5 instances. This is, you guessed it, a play against NVIDIA for generative AI (read GPT) workloads, across both performance and pricing vectors. This is particularly important to highlight as demand for AI chips continues to outstrip production capacity. The theory of the case here is that AMD ought to be capable of taking market share from NVIDIA based on any (or all) of three elements: price, performance, and availability/speed of delivery. Microsoft’s Azure investment in AMD certainly helps make that case for the remainder of the market, especially given AMD’s Q2 numbers in the data center segment.

Additionally, AMD previewed its 5th Gen AMD EPYC processors, codenamed “Turin,” powered by the new “Zen 5” core architecture (planned availability: H2 2024). AMD also announced its EPYC 4004 Series processors, a new cost-optimized offering with enterprise-class features for small and medium businesses. Meanwhile, Oracle highlighted its new HeatWave GenAI solution, powered by AMD EPYC CPUs, which is designed to enable customers to integrate generative AI into their enterprise data without any AI expertise. Also worthy of note: Oak Ridge National Lab’s Frontier supercomputer (considered the fastest supercomputer in the world) continues to be powered by AMD EPYC CPUs and AMD Instinct GPUs, and as new systems powered by AMD’s Instinct MI300A APU come online, they help confirm the company’s leadership in the HPC solutions space.

5) Lastly, AMD announced its Ultra Accelerator Link promoter group with industry partners, which plans to leverage AMD Infinity Fabric technology “to advance open standards-based AI networking infrastructure systems.” This is important because building apps, IP, and infrastructure solutions for AI is going to be critical to drive adoption and scale. This effort is designed to help accelerate that motion across AMD’s partner ecosystem, which also plays into the company’s overall momentum going into Q3 and Q4.

Additional considerations

Could the incoming PC refresh supercycle deliver a significant uplift to AMD’s Client business? I may sound like a broken record at this point, but keep a close eye on the PC segment. Disruption from on-device (on-PC) AI capabilities is expected to reset the refresh cycle for the entire segment starting in H2 and accelerating into 2025 and beyond. Microsoft’s first wave of Windows Copilot+ PC releases (in partnership with every major PC OEM) this past quarter marks the opening round of this refresh. And while this first wave was limited to AI PCs powered by Qualcomm’s new Arm-based Snapdragon X platform, it set the stage for the inevitable addition of AMD and Intel powered Copilot+ PCs into that ecosystem.

AMD and its PC OEM partners have already begun releasing AMD-powered AI PCs, and while AMD waits for Windows to formally welcome these devices into the Copilot+ family, they are already shipping with impressive specs and the telltale pre-installed Copilot key (a handy identifying feature for these new PCs). In fact, AMD and HP just announced an AI PC capable of delivering up to 55 TOPS on the NPU (a surprising 5 TOPS higher than the 50 TOPS AMD announced for the Ryzen AI 300 Series), signaling that AMD is not only serious about leveraging this refresh cycle opportunity to expand its PC market share, but that it brings a significant performance and differentiation story to the segment. To be continued, but I suspect that AMD’s Client business won’t just see a significant uplift from this opportunity over the next few quarters; its swift entry into the segment may be enough to capture market share from Intel, whose Lunar Lake AI PC chips are slated to ship towards the end of this year (or early next year).

Can AMD dilute NVIDIA’s data center AI accelerator market share? The answer is yes, I believe that AMD can and will. The real questions are when, how much, and how fast. (We are tracking this and expect to have forecasts available soon.)

For starters, AMD’s data center revenue jumped 115% YoY to reach $2.83 billion for the quarter, driven by demand for Instinct GPUs and EPYC server processors, which confirms that AMD is taking full advantage of the generative AI opportunity and continued demand. Second, zooming in on AMD’s MI300 series chips, the company sold over $1 billion worth during the quarter, showing encouraging momentum and stickiness for the platform against NVIDIA. Third, as demand for AI chips continues to stress supply chains, AMD is well positioned to take advantage of anticipated delivery delays and other production-capacity friction to land business that might otherwise have favored NVIDIA. Fourth, some AMD chips may be better suited for certain types of AI-enabling use cases. (AMD IP has its own performance advantages.) And fifth, AMD’s pricing can be an additional advantage against NVIDIA’s.

Per Lisa Su: “Our AI business continued accelerating and we are well positioned to deliver strong revenue growth in the second half of the year led by demand for Instinct, EPYC and Ryzen processors. The rapid advances in generative AI are driving demand for more compute in every market, creating significant growth opportunities as we deliver leadership AI solutions across our business.”

But beware the impact of chip manufacturers’ price increases: Something to keep an eye on that slipped into our SWOT’s threat quadrant for the AI chip segment is the potential impact of price increases teased by Taiwanese fabs this spring. TSMC was among the companies soft-launching these price hikes, citing the extremely high sustained demand for AI chips and the at-capacity status of chip manufacturing facilities. We don’t expect to see this situation change anytime soon (even with construction of new fabs and packaging facilities in the US, Europe and Asia moving forward), so the economics of supply and demand support the theory that chip manufacturing prices may jump anywhere from 5% to 15% between now and 2025.

This wave of price hikes, particularly from TSMC, should impact AMD, NVIDIA, Qualcomm, Apple, Arm, Broadcom, Marvell, and MediaTek, among others. While we didn’t detect any signs of this change in AMD’s earnings, we are watching for the first hints of it turning up in other TSMC customers’ numbers going into H2. Because this should manifest fairly evenly across the entire segment, and because margins are comfortable enough to absorb the shock (AI solutions are expected to command some measure of premium pricing), I don’t feel that this changes much about our outlook for the industry. It could, however, put a bit of unexpected downward pressure on earnings for companies relying heavily on revenue from AI chips, particularly those relying on 3nm and 5nm processes, for the next quarter or two, until the segment adjusts to the change. Not a big deal, but worth being aware of.
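To put a rough number on that margin cushion, here is a hypothetical sensitivity sketch in Python. The 53% non-GAAP gross margin and the 5% to 15% price-increase range come from this article; the wafer share of cost of goods sold (wafer_share_of_cogs) is purely an illustrative assumption of mine, not a figure AMD discloses, and the sketch assumes the increase is fully absorbed rather than passed through:

```python
# Hypothetical sensitivity sketch: how much could a foundry price increase
# compress gross margin if it were fully absorbed (no price pass-through)?
# Assumptions (illustrative only):
#   - 53% non-GAAP gross margin, per the quarter discussed in this article.
#   - wafer_share_of_cogs is a made-up illustrative figure; AMD does not
#     disclose its wafer cost mix.

gross_margin = 0.53          # non-GAAP gross margin from the quarter
wafer_share_of_cogs = 0.40   # ASSUMPTION: fraction of COGS tied to foundry wafers

for price_increase in (0.05, 0.10, 0.15):   # the 5%-15% range discussed above
    cogs_multiplier = 1 + wafer_share_of_cogs * price_increase
    new_margin = 1 - (1 - gross_margin) * cogs_multiplier
    compression_pts = (gross_margin - new_margin) * 100
    print(f"{price_increase:.0%} wafer price increase -> "
          f"gross margin ~{new_margin:.1%} ({compression_pts:.1f} pts compression)")
```

Even at the top of the range, and under this made-up cost mix, the compression works out to roughly one to three points, which is why I characterize it as temporary downward pressure on earnings rather than a change in outlook.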

Looking Ahead

For Q3 2024, AMD expects revenue to be approximately $6.7 billion (plus or minus $300 million). At the midpoint of the revenue range, this represents YoY growth of roughly 16% and a QoQ uplift of about 15%. Non-GAAP gross margin is expected to be approximately 53.5%. Given AMD’s momentum and speed of execution in both the data center and client segments, these are rightly where AMD should continue to invest the majority of its energy, particularly as AI continues to provide the company with on-ramps to revenue growth and market share expansion.
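As a quick check on that guidance math, the sketch below backs the implied growth rates out of the midpoint. The Q2 2024 figure is the $5.8 billion discussed above; the prior-year (Q3 2023) revenue of roughly $5.8 billion is my recollection of AMD’s previously reported figure and should be treated as an assumption here:

```python
# Back-of-the-envelope check on AMD's Q3 2024 guidance, using the midpoint
# of the guided range and the (rounded) revenue figures from this article.
# q3_2023_revenue is an ASSUMPTION (roughly $5.8B, as previously reported by AMD).

guidance_midpoint = 6.7      # $B, midpoint of the $6.7B +/- $0.3B range
q2_2024_revenue = 5.8        # $B, reported this quarter (rounded)
q3_2023_revenue = 5.8        # $B, ASSUMPTION: prior-year quarter (rounded)

yoy_growth = guidance_midpoint / q3_2023_revenue - 1
qoq_growth = guidance_midpoint / q2_2024_revenue - 1

print(f"Implied YoY growth at the midpoint: {yoy_growth:.1%}")
print(f"Implied QoQ growth at the midpoint: {qoq_growth:.1%}")
```

With these rounded inputs, both figures come out around 15.5%, consistent with the roughly 16% YoY and ~15% sequential growth quoted above once AMD’s exact (unrounded) quarterly revenue is used.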

Daniel Newman and his co-host of The Six Five Webcast, Patrick Moorhead of Moor Insights & Strategy, discuss AMD’s earnings in their latest episode. Check it out here and be sure to subscribe to The Six Five Webcast so you never miss an episode.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author is a former employee of Infleqtion and holds an equity position in the company. The author does not hold an equity position in any other company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

AMD Q1 2024 Earnings Showcase Its Data Center and PC Solutions

The PC Segment’s AI Inflection Point: Your 2024-2025 Copilot+ PC Cheat Sheet

The Copilot+ PC Disruption Is Here: What Happens Now?

Author Information

Olivier Blanchard

Research Director Olivier Blanchard covers edge semiconductors and intelligent AI-capable devices for Futurum. In addition to having co-authored several books about digital transformation and AI with Futurum Group CEO Daniel Newman, Blanchard brings considerable experience demystifying new and emerging technologies, advising clients on how best to future-proof their organizations, and helping maximize the positive impacts of technology disruption while mitigating their potentially negative effects. Follow his extended analysis on X and LinkedIn.
