AMD Q4 2023 Earnings Highlight a Strong Finish

The News: AMD announced revenue for the fourth quarter (Q4) of 2023 of $6.2 billion, gross margin of 47%, operating income of $342 million, net income of $667 million, and diluted earnings per share (EPS) of $0.41. On a non-Generally Accepted Accounting Principles (non-GAAP) basis, gross margin was 51%, operating income was $1.4 billion, net income was $1.2 billion, and diluted EPS was $0.77.

For the full year 2023, the company reported revenue of $22.7 billion, gross margin of 46%, operating income of $401 million, net income of $854 million, and diluted EPS of $0.53. On a non-GAAP basis, gross margin was 50%, operating income was $4.9 billion, net income was $4.3 billion, and diluted EPS was $2.65. You can read more on the AMD website.

Analyst Take: “AMD executed well in 2023 despite a mixed demand environment,” said AMD EVP, CFO, and Treasurer Jean Hu. “We drove year-over-year revenue growth in our Data Center and Embedded segments and successfully launched our AMD Instinct MI300 GPUs positioning us for a strong product ramp in 2024.” That assessment holds up. Between AMD’s swift pivot toward AI and the sluggish but unmistakable start of a rebound in the PC market, AMD finished the year not only stronger than it started it, but with clear onramps to accelerating sales into 2024 and beyond.

AMD looks well-positioned for both revenue and margin growth in 2024, driven initially by strong demand for AI-focused data center products such as its Instinct GPUs and EPYC CPUs, and then by the impending disruption of the PC market by AI PCs starting in H2. At last year’s Microsoft Ignite event, Microsoft and AMD made the case for how Instinct MI300X accelerators, EPYC CPUs, and Ryzen CPUs with AI engines would, together, deliver an entirely new ecosystem of compute capabilities and services spanning cloud and PCs starting in 2024. That trajectory is bearing out, and a valid business case can be made for the model, particularly in the enterprise. We feel good about this development, and AMD looks to be on the right track when it comes to capitalizing on the AI opportunity both in the data center and at the edge. Let’s take a look at Q4 2023.

Data Center

The Data Center segment revenue for Q4 2023 was $2.3 billion, up 38% year-over-year (YoY) and 43% sequentially. These numbers were driven by strong growth in 4th Gen EPYC CPUs and Instinct GPUs. For the year, Data Center segment revenue totaled $6.5 billion, a 7% increase YoY, likewise driven by strong growth in 4th Gen EPYC CPUs and Instinct GPUs.

It is becoming clear that AMD’s MI300 accelerators—a competitive alternative to NVIDIA’s supply-challenged H100—are providing AMD with a strong onramp for growth in the AI market. AMD’s latest, the Instinct MI300X accelerators (with their boosted memory bandwidth performance for generative AI workloads) and AMD Instinct MI300A APUs (for HPC and AI workloads), are already powering cloud and enterprise AI infrastructure, with the MI300X reportedly being used by Microsoft, Meta, Oracle, Dell Technologies, Hewlett Packard Enterprise (HPE), Lenovo, Supermicro, Arista, Broadcom, and Cisco.

The bigger story here is that while NVIDIA enjoys an enviably larger share of the market, AMD provides the most viable alternative to the H100 with the MI300. This becomes all the more relevant for OEMs looking to de-risk their supply chains (or simply address chronic supply constraints). Nor is this stopgap/alternative play limited to the H100.

AMD’s MI300 accelerators have been shown to outperform the H100, rather than merely provide an acceptable alternative. One of the accelerator’s strengths is its CDNA 3 architecture, which reportedly makes training AI workloads as much as 5x more energy efficient and packs nearly double the transistors of the H100. Among other advantages, the higher transistor count helps make MI300 accelerators a better option for inference workloads than NVIDIA’s H100. The MI300 also delivers more than 2x the memory of NVIDIA’s H100, with 1.6x more bandwidth.

It is no surprise, then, that the Instinct MI300 accelerator, which AMD started shipping to commercial OEMs in Q4, was likely a key contributor to the company’s $2.3 billion in Q4 Data Center sales, or that it looks like a credible growth catalyst for AMD in 2024, even if growth may not take off until H2 2024. Meta, Microsoft, and Oracle already look to be on board with the MI300, especially given the importance and growth of inferencing workloads, some of which the MI300 reportedly performs up to 1.2x faster than the H100.

Despite the inevitability of NVIDIA’s upcoming H200, which will put new competitive strain on the MI300, AMD seems to have a credible onramp to capturing a significant share of an AI hardware market estimated to reach $400 billion by 2027.

Client

The Client segment revenue was $1.5 billion, up 62% YoY, driven primarily by strong Ryzen 7000 Series CPU sales. For the year, Client segment revenue was $4.7 billion, down 25% YoY, owing to a persistent slump in the PC market that dragged numbers down in H1.

AMD’s client segment looks well-positioned to continue taking advantage of the PC market’s recovery in 2024. Its Ryzen AI portfolio is off to a good start with the Ryzen Pro 7040 Series CPUs and their integrated AI engine for x86-based Windows workstations. At CES 2024 just a few weeks ago, AMD also launched the next-generation Ryzen 8000G-Series CPUs with a dedicated AI NPU, which will start shipping to PC OEMs in Q2 2024. So far, Ryzen AI has secured at least 50 laptop design wins with PC OEMs such as Acer, ASUS, Lenovo, HP, and Razer. The software ecosystem behind the chips also looks off to a good start, with a growing number of optimized apps and developer tools helping build the business case not only for the AI PC category but for Ryzen AI specifically. Given current timelines and the likelihood that next-gen AI PCs powered by AMD chips will initially scale in the enterprise, we do not expect a significant ramp-up until at least H2 2024.

Gaming

The Gaming segment revenue was $1.4 billion, down 17% YoY and 9% sequentially, due to a decrease in semi-custom revenue, though this was partially offset by growth in Radeon GPU sales. For the year, Gaming segment revenue was $6.2 billion, down 9% YoY, again because of lower semi-custom sales.

Embedded

The Embedded segment revenue was $1.1 billion, down 24% YoY and 15% sequentially, primarily due to customers reducing their inventory levels. For the year, Embedded segment revenue was $5.3 billion, up 17% YoY, though the primary cause of the uplift was the addition of a full year of revenue tied to the acquisition of Xilinx, which was completed in February 2022.

Looking Forward

“We finished 2023 strong,” said AMD Chair and CEO Dr. Lisa Su, “with sequential and year-over-year revenue and earnings growth driven by record quarterly AMD Instinct GPU and EPYC CPU sales and higher AMD Ryzen processor sales. Demand for our high-performance data center product portfolio continues to accelerate, positioning us well to deliver strong annual growth in what is an incredibly exciting time as AI re-shapes virtually every part of the computing market.”

Daniel Newman provides insights into the latest AMD earnings on X.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

AMD Q3 2023 Earnings Signal PC Segment Recovery, Broadening AI Reach

AMD Datacenter & AI Conference Recap: All Eyes on AI and Cost Optimization

AMD and Hugging Face Team Up to Democratize AI Compute – Shrewd Alliance Could Lead to AI Compute Competition, Lower AI Costs

Author Information

Olivier Blanchard

Olivier Blanchard has extensive experience managing product innovation, technology adoption, digital integration, and change management for industry leaders in the B2B, B2C, B2G sectors, and the IT channel. His passion is helping decision-makers and their organizations understand the many risks and opportunities of technology-driven disruption, and leverage innovation to build stronger, better, more competitive companies.
