
Debating The Size of The AI GPU & ASIC Market

The Six Five team discusses the size of the AI GPU & ASIC market.

If you are interested in watching the full episode, you can check it out here.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: So this week, Futurum Group and our intelligence team did a build of the AI GPU market and the ASIC market. We looked at a few specific things when it came to that. We really did a teardown of 2022-’23 history, every SKU, everything we could pull from public data, and went through the same process for cloud instances and XPU sales. That’s sell-in, so those are numbers that we actually know shipped in, and it’s stuff we could find publicly available. Very interesting exercise.

There are two things I want to talk about here that came out of these numbers. One is trying to reconcile the XPU market from what we know from Broadcom versus what I’m seeing from shipments that are going into these cloud providers. Then the second thing that was really interesting, Pat, is just how far apart the market sizing can be depending on the exercise. Our team put a 30% CAGR on the GPU market, Pat. It was about $36 billion this year, rising to about $138 billion on GPUs by 2028.
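For context, a quick compounding check on those figures, assuming a base of roughly $36 billion this year and five years of growth to 2028, lines up with the rate cited:

\[
\mathrm{CAGR} = \left(\frac{138}{36}\right)^{1/5} - 1 \approx 0.31,
\qquad
36 \times 1.30^{5} \approx 134
\]

In other words, growing about $36 billion at roughly 30% a year for five years lands in the mid-$130 billion range, consistent with the roughly $138 billion 2028 estimate mentioned above.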

We’re hearing from Lisa Su and others talking about a $400 billion AI ASIC and GPU market. Now, again, how are they describing that in terms of networking and systems? Is it all in, is it the whole thing, or are they literally just talking chips? Because if that’s the case, there’s a pretty massive reconciliation between where we landed and where they landed. I will say upfront I think we were conservative on our number; I think 30% is a very conservative growth rate. I think the challenge, Pat, and you heard me talking about Nvidia, is the pressure on the CapEx side: at what point does the market put pressure back on all those making massive CapEx investments now to start realizing revenue? Then how much does that pressure and that annual cycle create unsustainable growth if we don’t find a way to start selling all this stuff out into applications and workloads? When does all of that happen?

So the take was, Pat, we had 92% of the GPUs in ’23 sitting with Nvidia. I’ve seen data center numbers anywhere from about 88%. I think I saw a Petty research number come out. I’ve heard numbers as high as 95 and 96, depending on exactly where everything lands. But rather than making this a big readout, because I’ll share the link to what we’ve published, Pat, I’m interested in your take. I feel like we’re conservative. I feel like Lisa and the $400 billion have been very aggressive. Where does your head land on where this market’s going?

Patrick Moorhead: Yeah. So, Dan, it’s interesting. I play both sides of the fence here. I was a vendor for many years, and I would work with companies like yours to get data forecasts. Then what we would do is create what I like to call a fusion model, which is we’re putting … And typically insiders at a company actually know this better than the data providers, but they also have the ability to do what I like to call bending the curve. It’s one thing to say, okay, based on all these data points, this is the size of the market. But if you’re one of the two or three leaders who is actually making this stuff happen, you know what your roadmap is. You have an idea of what your competitors are doing and, therefore, you can extrapolate that out in terms of the size of the market, but also the market share that you want to take.

I’d like to invoke something Michael Dell said years ago in a conversation we were having publicly about the size of the IoT market over the next 10 years. Once you start getting into the hundreds of billions and trillions, his response was, yeah, it’s a big market, we all agree, and the numbers are large. So, therefore, going after that market long term is a smart thing. I think the numbers will get more interesting as Nvidia gets more competition. I mean, I don’t know, Red Bull winning everything all the time just got to be a complete snoozer. You’ve got McLaren coming in. Mercedes won a few races. Now everybody’s like, competition is back.

Long term, you never have one player just dominating everything. I think the best case today of total domination from a profit standpoint is probably Apple. Even though Apple only has about a third of the overall global smartphone market, they’re taking most of the profit pool. Then Samsung and the other Android providers are picking up the scraps for profit. Now, the great news about companies like Samsung is that they’re vertically integrated and they make most of the content inside their devices. So they’re making margin on that, too.

But what I’m really excited about is the future of ASICs versus GPUs. That’s what I am most excited about. I’m hoping, Dan, in some of your future reports, you can tease out companies like Groq. You can tease out Cerebras. You can tease out Untether AI and the long list of companies who do AI chips. Another interesting one that I hope you do is going to be on networking, whether it’s scale-up or scale-out networking technologies. The network, and the chips that go with it, is arguably as important today, because it’s a key factor throttling training time and inference latency.

Daniel Newman: Yeah. To give you a little tease there, I had some estimates on 2023 revenue. Cerebras had about $170 million.

Patrick Moorhead: Okay.

Daniel Newman: Graphcore at $152 million, Groq at $60 million. These were our estimates. It’s all in the data, the report-

Patrick Moorhead: Oh, it’s in there. That’s great.

Daniel Newman: Yeah. It is in there.

Patrick Moorhead: I don’t have a license to the data, Dan, so I haven’t had the ability to go out there. I was just seeing it in the multiple CNBC appearances that featured your data.

Daniel Newman: Yeah. One of the big challenges I’m having with this, though, is the public data on what’s selling in. There’s a lot of mystery, too, like what Meta’s buying. There’s no way to really … And they’re one of the biggest consumers, for instance, of XPUs. So we’re looking at cloud instances and what we know from … We’re looking at the build-out of shipments from Intel Gaudi and others, the smaller names you mentioned, and that stuff I can really track. What I’m having a harder time tracking is how many XPUs went into Meta, how many XPUs … Sorry, TPUs, for instance, did Google buy for its own use? I can track the cloud instances that it’s selling out to customers, and they have the lion’s share of instance use right now.

So XPUs are being used by cloud providers; they have the vast majority. Then, of course, AWS came in second right now with Inferentia. On the other end of things, though, the stuff that’s being used inside, like Broadcom’s shipments, is pretty masked in terms of how it looks. So to your point, I think the market for XPUs is much bigger than what we can size. That’s where I think a lot of the growth comes in, because there’s a big volume being shipped into these hyperscalers that don’t resell. So we’re talking about the ones that are using it for their own use.

Patrick Moorhead: Yeah.

Daniel Newman: Then, of course, it’s still early in how AI is defined, meaning there is still a lot of debate on what … Like, we were trying to track CPU instances that are dedicated to AI use, but we know a ton of CPUs are used for multiple purposes. They’re general purpose and they run some AI, and so those aren’t being counted. But even in just the short period of time that Grace Hopper was shipping in ’23, because it was the first version of it, I mean the most dedicated compute cores, even though you have all the Intel parts going with the H series as the head node, you’re still seeing the Grace cores being the highest core count.

So it’s really interesting to see how this goes. But like I said, the way CPUs and GPUs work together, there is a lot of CPU not in the count because it’s being used for other purposes, which means Intel’s role in AI is bigger than maybe it’s portrayed right now. And so is AMD’s, by the way, with EPYC, because EPYC had a big part too in a number of different configurations with Nvidia GPUs and others. So all of that, Pat, is super interesting stuff. I’ll drop the link. This has driven a ton of interest, very exciting times, and we will … Pat, I think your point about networking, it’d be very interesting to do a sort of … What did you call it? Like open Ultra Ethernet versus-

Patrick Moorhead: Yeah, Ethernet, open … Yeah, NVLink. I mean, there’s so many elements in networking today.

Daniel Newman: In the rack, out of the rack.

Patrick Moorhead: Yeah.

Daniel Newman: In the rack, out of the rack, and what are people using? I think there’s a big opportunity to really dig deeper on that particular thing.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people, and tech that are required for companies to benefit most from their technology investments. Daniel is a top-five globally ranked industry analyst, and his ideas are regularly cited and shared in television appearances on CNBC and Bloomberg, in The Wall Street Journal, and across hundreds of other outlets around the world.

A seven-time best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
