Arm Q3 FY24 Earnings

The Six Five team discusses Arm Q3 FY24 earnings.

If you are interested in watching the full episode, you can check it out here.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: So Arm, you and I both got to talk to Rene Haas on earnings day. I don’t think I could have expected what came next.

Patrick Moorhead: It was just an absolute explosion. Let me give you the data. We talk about somebody doing a beat, beat for the quarter, meaning a beat on the top line and a beat on the bottom line. They did a beat, beat, raise, raise. They raised for the quarter and they raised for the year. They absolutely, categorically crushed it. After hours, they were up 20-some percent, which was like, “Oh, my gosh, this is amazing,” and then yesterday it ran up 50%. Just crazy.

A little bit of background. Listen, we’re not equities analysts, we’re industry analysts, but I feel like we have a much better view of a company’s long-term prospects, its market fit, and what it can do to go in and grab profitable market share. So I was very bullish on this company’s growth opportunities. When they came out, people were like, “There’s no way they can sustain this,” and the stock went down. I feel vindicated at this point for the amount of growth that I knew they had in them.

It’s a really simple story. You have all of the markets that Arm is in getting an overall lift from AI. So the waterline is going up across data center, across networking, across PCs, across smartphones. Then don’t forget, I called the PC super cycle in the fourth quarter, in January. Then you have segment share gain. They’re gaining market share in infrastructure: you’ve got AWS, you’ve got Azure doing their custom Cobalt, you’ve got some folks overseas doing that. And they’re gaining market share in automotive. It’s split between Intel and Arm, but right now the installed base is some custom chips that were MIPS-based architectures. It’s weird, but work with me on this.

Then you’ve got content gains in areas like smartphones. Well, what do you mean by that? Well, you have small cores and you have big cores, and then you’ve got v8 cores and v9 cores. Bigger cores: look at MediaTek, they’re putting these giant big cores in there. You pay more for the big cores than you do for the small cores. The other thing is that Arm isn’t just CPUs, it’s GPUs, it’s memory controllers, it’s interfaces, it’s buses, it’s all that great stuff.

Then you add on top of that this turnkey white glove offering that they do that says, “Hey, we’re not just going to throw some IP over the wall. We’re going to offer you IP that has been taken all the way down and qualified in a specific fab on a specific process, and we’re going to do some of the software testing and validation on this for you.” It was very clear that what Microsoft Azure was doing with Cobalt was exactly that. I will bet that AWS Graviton has a certain element of turnkey as well.

So here we are. I don’t know where they opened today. Pretty dramatic. Julie asked me on Yahoo Finance, “Hey, is this real?” I said, “Listen, it’s rich, okay? Investors have to be pricing in products that don’t exist yet, maybe some new markets that I am unaware of, to maintain that.” I only say that when I look at the market cap of, let’s say, a company like Qualcomm or a company like AMD. And I’ll end with this. They asked me, “Hey, Pat, what are some of the next AI chip plays?” and my comment was, “If you want to make it simple, look directly at the companies who are Arm’s customers, and Qualcomm is Arm’s biggest customer. So I would look there for potential growth.”

Daniel Newman: You called it out, I called it out. It was really, really interesting. I’m not a revisionist history person, but I think Arm, when it first came out, was maybe trying a little too hard to have an AI story. This was right around the IPO, and of course it was appropriate to try, given what was going on in the market, but I think they had it up their sleeve; they knew this was going to happen. So ’23 was a really GPU-centric year.

The thing about the GPU-centric year was that there were still some plays for the CPU, and they were substantial because, again, the GPU is not running the application. You’ve still got to have those big CPU cores. That’s the whole Grace Hopper story. If you wanted a proof point, NVIDIA gave you a proof point: they chose an Arm-based architecture for the CPU side of the high-performance computing offerings they were developing.

Having said that though, the obvious question is: what’s the killer app? We’ll get to the Vision Pro, and we could talk about this some more. God, it’s dumb. All right. Sorry. But the killer app for AI is not training. Training is the enabler of killer apps for AI. So you need all this horsepower in the data center so you can train all these models, and we’re going to see the continuation of these large models. God, did you see the news this morning, that Sam Altman wants to go get $5 to $7 trillion? So he apparently does want to build fabs, because there’s no way you need $7 trillion for chip design, but anyways, God, I love going on these tangents.

Patrick Moorhead: Our audience expects us to, as long as we’re not droning on and being stupid.

Daniel Newman: Well, I can’t promise all of that, but some of it. The point is, you see where all of this is heading. The killer app of inference is basically heavily run on the CPU. It’s going to be heavily delivered on ASICs, and it’s going to be heavily delivered on the CPU. By the way, this isn’t new. Intel has been talking about DL Boost running AI on Xeon for years. You’ve seen all the ASIC development from AWS, you’ve seen it coming out of Microsoft. There are some training chips, but a lot of these are inference chips. A lot of these are focused on running applications, Pat. So Arm has a really big role to play in all the compute horsepower that’s going to run side by side. So you’ve got this really, really positive growth engine sitting behind it.

You got content, Pat. You got content in every device. You got content in every vehicle that’s growing. By the way, that’s all more Arm. You’re talking billions and billions, of course, of Arm-based product content going into all these different devices. Pat, you called the AIPC surge, but I don’t think people even realize how big this is going to be in terms of expanding the amount of spend with Arm.

So they’ve got this multifaceted tailwind: more content per device, higher pricing on royalties because they’re going to be doing more of the service delivery, and on top of that, what you talked about with the additional subsystems, the CSS stuff that they’re doing, which is another revenue stream. So of course Rene was lit up and glowing when we talked to him, because he’s like, “We’re just getting to the beginning of this tailwind.” But not to confuse people, this isn’t an Arm-instead-of-NVIDIA thing. This is like Arm is the end play.

By the way, this tailwind is holistic, because it’s going to be good for x86 too. I know people want it to be one or the other, but when this AIPC boom goes, it’s going to go on all fronts. x86 PCs are going to be AIPCs with an NPU and run these same apps, and so are the Arm-based versions. You mentioned Qualcomm, one of the biggest winners. Apple, one of the biggest winners. I know, I know, it’s hard to say it.

By the way, all these vendors that are creating different systems, and the cloud providers, Pat: AWS, Microsoft, Ampere and Google, Ampere and Oracle, they’re all going to be winners too. By the way, I’ve got one prediction, a final note on this topic. The one reason people loved this was because, for the very first time, these analysts could actually put something in their spreadsheet and start to figure out how much money Arm is going to make from AI. There’s no other reason a stock goes up 50% or more in a day, except for the fact that people can now say, “Oh, my gosh, let’s see how they’re going to make money.” That was the question mark when Arm came out: they don’t make a lot on their licensing and their royalties. Well, they’re changing their business model.

By the way, every company that comes out and IPOs, you can be sure, probably has about six to eight quarters that they can see in their forward-looking mirror where they know they can hit targets. It’s not always the case, but you can be sure SoftBank and Arm had a good idea of how they were going to make this run, and they picked a good time to come out.

Patrick Moorhead: I had to throw a truth bomb on Yahoo Finance, basically backing up what you said. Actually, most inference is run on a CPU, and I don’t mean on an SoC that has an NPU. That’s what nobody talks about, but the problem is nobody can measure that. Intel can’t say X percent of this is run on this, because a CPU is the most diverse piece of silicon you can have and you can’t measure it. Anyways.

Daniel Newman: Maybe we will. Measures of merit are important. We should ask Ryan Shrout, President of Signal65.

Patrick Moorhead: No, I actually think this is more in the line of your data business, to be honest.

Daniel Newman: Well, I just mean we should have some fun testing and competing all these different AIPCs and data center workloads, but we got something for everybody. You want the data? We got that too.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, his most recent book being “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
