
Talking Synopsys, NVIDIA, Broadcom, AI GPU & ASIC Market, Microsoft, Downward Job Revision


On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:

  1. Synopsys Q3FY24 Earnings
  2. NVIDIA Earnings Predictions
  3. Broadcom Wins 1st & 2nd Generation AI ASIC Programs From OpenAI?
  4. Debating The Size of The AI GPU & ASIC Market
  5. Windows Recall Gets A Date
  6. Massive Downward Job Revision

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.


Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: Hey, everybody. Welcome back. It’s Friday. It’s another Six Five Podcast.

Patrick Moorhead: Yeah, it is.

Daniel Newman: We’re on schedule this week, which is great. We had a Monday pod, Pat. I don’t know if it feels like it just happened, but it actually has been four days since we’ve podded, and it’s Friday, and it’s been a good week. It’s been the calm before the storm. You and I did a little traveling to a top secret location to meet with some really important people that we can’t share. But you’ll see it soon enough. Pat, happy Friday morning. Looking good, my friend.

Patrick Moorhead: Happy Friday, bestie. Yeah, it was a great day yesterday. I think we were wheels up at 6:30, wheels down I think around 9:00, crushing it. But, yeah, we met with some senior executives at a company I think you know, at a place you know, but we just can’t talk about it yet. It’ll be out soon enough.

Daniel Newman: Yeah, it was in an undisclosed-

Patrick Moorhead: Exactly, undisclosed location. No, but it’s Friday. I’m pretty jazzed. It’s funny, the past couple of weeks have not been a ton of travel, not zero travel, but with our clients and the industry not making too many big announcements out there, getting their kids off to school and stuff like that. But, hey, baby, I don’t know about you, but I think I’m home maybe one week in totality between now and November.

Daniel Newman: It’s freaking brutal. It’s brutal.

Patrick Moorhead: Yeah.

Daniel Newman: My schedule is just crazy. But it’s good, man. I mean, look, it’s good to be in demand. It’s good that everybody wants to-

Patrick Moorhead: It’s good to be in demand. Yes.

Daniel Newman: It’s good to-

Patrick Moorhead: Yes.

Daniel Newman: We’ll talk about the craziness of the economy, and it’s not actually that good out there despite rumors. It all depends on what you consume. We’re going to hit on that a little bit. But in the tech industry, deflationary, AI, growth, productivity, efficiencies. There’s a heck of a lot happening, Pat. But we’ve got a great show. It’s a little different than normal. It’s not so much news. I mean it’s a little bit of news, but we’re going to have the chance to opine on some bigger things that are going on. We all know that next week is the mega earnings moment that’s going to set the entire quarter up with Nvidia. Everybody wants to know what is going on there.

But we also had Synopsys this week, a company you and I have talked about, very important, very essential to a lot of what is going on in this AI build and other parts of the chip build. We’re going to talk about some rumors because, Pat, I’ve finally gotten him to the other side to be willing to come and talk rumors with me. But there was a post … It was from a good source, this wasn’t a nonsense source, that suggested a pretty big win for Broadcom on the accelerator side. Then we’re going to talk more about the GPU and ASIC market with this week’s big ZT announcement with AMD. It created a bit of opinion about what’s going to go on with how OEMs and cloud scale and all this stuff, rack scale, happens. Then, of course, Futurum Group put out a massive AI chipset intelligence report that was featured around the world.

Patrick Moorhead: Yeah, they did.

Daniel Newman: Pat, you and I are going to get back to an AI PC story on Windows and Recall. This was the mega feature that didn’t actually end up launching immediately with the device launches. But it’s back. It’s on the calendar. Microsoft’s leader on that particular part of the business came out and gave some updates there. Then we’re going to talk about … Pat, this is a little different for us. We’re going to tie it back to tech. We’ve heard over the last few weeks Dell and Cisco have been doing some substantial layoffs, and, of course, other tech companies have just quietly been laying people off. The jobs in the economy are … There was the biggest downward revision in something like 15 to 20 years. It was interesting timing. We’re in the middle of the Democratic National Convention. We’re heading into a global … It’s a national election, but it impacts the globe.

So, anyways, Pat, lots to cover. This show is for entertainment and informational purposes only. Don’t make any investments based on our advice. Pat, let’s kick off with the one central earnings topic that was big on chips this week. Let’s talk Synopsys. You ready to jump in?

Patrick Moorhead: Yeah, let’s jump in. Thanks for calling my number on that. So, yeah, Synopsys had earnings. I had the opportunity to chat with Sassine Ghazi, the CEO, so I appreciated getting that double-click. I mean net-net they had a great quarter. They had a beat and a beat. They had record revenue. Then their forecast looked in line on revenue with an upward revision on EPS. There was a slight uptick after-hours, and their stock actually declined the day after, but the NASDAQ and everything else was down, too. It’s interesting: as opposed to digging into the specifics of the quarter, I really wanted to dive into what was said on the call. Synopsys is one of those companies where, in regards to earnings, you literally have to go on the call to get any insights.

The amount of insights and how it relates to the rest of the industry, I just thought was fascinating. On the call, Sassine … First of all, he talked about Ansys, which Synopsys still expects to close in the first half of 2025. But then there were these … They didn’t use names. Sassine used characterizations of their customers and how they were doing this. So debug is a major time and resource investment for any chip, and he said that a large US-based GPU company adopted the Verdi platform, and a large US mobile SOC company reduced failure debug from days to minutes. So how do you say Nvidia and Qualcomm without saying Nvidia and Qualcomm? That’s the way you describe it. Verification is also another key part of the design and test process for any IP or SOC vendor. A marquee US GPU company deployed VSO.ai across multiple IPs with time improvements of 2X to 7X and coverage improvements up to 33%. So 33% more of the stuff that you test, at 2X to 7X the speed.

Now I’ll throw one final one out here, which is … Actually, I’ll do two. ZeBu, which is part of the EP product line for emulation and prototyping, hence the E and the P, had a significant hardware expansion at a large US hyperscaler in a direct win versus the competition, a competitive takeout of Cadence. I guess it’s just one of the three hyperscalers. My final one here also gives you an insight into tape-outs and foundries. He said they continue to win new designs on ICV, with 20 tape-outs in Q3. Four of these tape-outs were on TSMC N3 and one on IFS 18A, where engagements are increasing rapidly.

What’s interesting is that we had a little taste of IFS versus TSMC there. What’s also interesting is I thought Intel had said that there won’t be any tape-outs on 18A until ’25. Anyways, maybe I misheard what Intel said, or maybe they’re talking about Intel’s own design side using Synopsys to do a tape-out on 18A. I don’t know, we’ll see. But the insights … And I think what this did, just to make a long story longer, is show the value that Synopsys has in EDA.

Daniel Newman: Yeah, Pat, you hit it on the head. I mean this IP and EDA company continues to be a quiet but really critical star of this AI transformation that we have going on, whether it comes down to generative AI tools to speed up design, to being a key partner to … You kind of named without naming. It’s pretty much pervasive across the players. They’re expanding. They’re investing and divesting. They’ve been adding the things that I think are essential for their growth. They spun off some things, I think, that were not focal points for the business. They saw record revenue. They were able to confirm guidance on the earnings. They were able to get to the revenue midpoint.

I think they’re able to be somewhat conservative but optimistic because of the overall situation and the amount of demand. I mean, look, as we keep hearing about these next … We’re going to talk about this throughout the show today. We’re hearing about these design wins, whether it’s OpenAI building an XPU. I mean Synopsys just has a part to play in so much of the scale of this AI movement. Of course, it’s not just AI chips. Every time you hear about AI, there’s more storage, there’s more networking, there’s more traditional general purpose compute, more memory. This stuff all happens in tandem, and Synopsys has a role to play in so many of these parts.

So it’s an exciting company. I’ve had a number of conversations with Sassine. I mean since he’s taken the helm, I feel like he’s doing a really good job steering the ship. I’m glad you had the chance to sit down with him. It sounds to me like there’s a lot of reason for optimism over there. It’s going to be one of those … I don’t know if I’d use the words unsung hero, because they’re definitely not unknown, but I think when people talk about chips, they like to talk about AMD and Nvidia, Intel, Qualcomm. A lot of times there’s this whole supply chain, this whole … There’s the material stuff … We talked about Coherent last week … laser beams, there’s machinery, there are wafers. What I’m saying is there’s just so much more to producing these final products, and it’s not always as cool, but, gosh, try to do this stuff without these companies. It’s just … You can’t. I mean-

Patrick Moorhead: Yeah, it’s interesting. It’s very similar to the way nobody had really ever heard of Arm, right?

Daniel Newman: Yeah.

Patrick Moorhead: Then, boom, the market gets some education. I think EDA players are going to be the power brokers. As you get more people making SOCs and doing IPs, chiplet-based architectures enable small garage shops to do this stuff. Add on the system part of it with Ansys and now you’ve got end-to-end.

Daniel Newman: Absolutely, Pat. So let’s talk about the earnings that … I tweeted something. I talked about our friend, maybe he’ll actually be back next week, that loves nothing more than a good quip or some good hyperbole, we could call it, who said you could hear a pin drop next week as we wait for … That’s Dan Ives. He’s been on our show before. Maybe we’ll see him back next week, in fact. But Nvidia, Pat. I mean we’ve been through the earnings wave. Nvidia has this interesting benefit of coming at the tail end of tech earnings. I tend to agree, everybody’s waiting to see what’s happened. We’ve had this really crazy ebb and flow. The economy was going to crash. The Japanese … The yen carry trade was unwinding. We were going to go into a massive recession. The world was ending.

Then two days later, everyone forgot about it. Now we’re back to how many GPUs can somebody sell this quarter? Oh, by the way, there were delays and then there’s not delays, and then there’s a demand issue and then there’s not a demand issue, and then there’s … There’s just so much. The regulatory environment’s creeping. Pat, here’s my take. I did a breakdown. I tweeted this out. I hope-

Patrick Moorhead: It was so good. It was so good. I think you and I spent 25 minutes on the flight of me asking you questions. I’m like, “Where did this come from?” It was so good. Let’s go hit it.

Daniel Newman: Yeah, let’s start with where it came from. I mean, look, I saw this note that came out from Unusual Whales. It’s a great account on Twitter, by the way, one that I like to follow. It came out saying that the implied volatility, which basically reflects how much the market expects the stock to move on its earnings announcement, is the largest it’s been in 10 quarters for Nvidia. So Nvidia’s always been a bit of a firebrand in that way, where its earnings, even back in its gaming days, were just volatile. If you look at the years … Like I looked at 20 years. It’s not a company that went up and down 10%. It was a company that was up 200%, then down 70%, then up 300%. There were just crazy numbers over the years. So the movement on this company has been pretty crazy.
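[Editor’s note: the “implied move” being referenced here is commonly eyeballed with a simple rule of thumb, the price of the at-the-money straddle divided by the share price. A minimal sketch, with purely hypothetical option prices rather than actual NVIDIA quotes:]

```python
# Rule of thumb: the market's implied move into an earnings print is roughly
# the price of the at-the-money straddle (ATM call + ATM put) divided by the
# share price. All prices here are hypothetical illustrations.

def implied_earnings_move(call_price: float, put_price: float, spot: float) -> float:
    """Approximate expected post-earnings move implied by an ATM straddle."""
    return (call_price + put_price) / spot

# A $4.20 call plus a $4.10 put on a $100 stock implies roughly an 8.3% swing.
move = implied_earnings_move(4.20, 4.10, 100.0)
print(f"Implied earnings move: {move:.1%}")
```

[Note this estimates only the expected size of the swing, in either direction; it says nothing about which way the stock will move.]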

AI has totally changed it now. It went from a gaming-dominant company to … Data center’s like 10 times the size, if not more, of its gaming business now, and it happened really quickly. So there’s this huge volatility, which basically means there’s this bearish and bullish thought that it’s either going to blow it out and it’s going to go parabolic up or it’s going to have a surprise. By the way, it may still blow it out and have a surprise just in something it says about demand or something it’s going to say about issues in manufacturing supply. We’ve heard rumblings, but it seems like they’ve got most of their stuff in order. They’ve got this annual cadence right now.

But here’s the long and short, Pat. The market’s waiting and it’s on pins and needles because this is the “is AI moving forward?” trade. All the BS from the bubble bears that I like to talk about, the whole debate about AI and the digestion/ingestion period of when AI starts to find its way into applications and into industry, no one cares about that right now, because these five companies keep spending hundreds of billions on CapEx that will fuel the near term. So the next two, three, four quarters, I think, are really bullish for Nvidia.

First off, supply can’t meet demand. One of the rumors that came out this week across the internet was that basically OpenAI, and we’ll actually back this further with other rumors we heard this week, is struggling to get enough GPUs right now, and that beyond, obviously, Microsoft and Google and some of the biggest buyers on the CapEx side, Meta, xAI, Tesla, and others have basically caused OpenAI some indigestion as it relates to getting enough GPUs. The second, Pat, is you and I can read the tea leaves. TSMC’s numbers pretty much gave us the all clear. You’ve got more demand for three-nanometer, more demand for CoWoS packaging, than they can meet, and it was pretty well-understood that most of that demand is going to Nvidia. And so, they’re taking most of that supply, which means they’re basically going to pump out every single GPU they can make in this period of time.

Third is that, Pat, you and I, we’ve talked to our channel partners, channel checks across the board. All of our partners are basically … They’re all lusting for more supply. They’re all trying to get their hands … Basically saying they can sell everything they can get their hands on. Now there’s still a lot of consternation in the channels about their ability to make money, but this is what the customers want. So the pull-through has been undeniably powerful. The upgrade cycle, Pat, has gone to one year. We’re seeing AMD doing the same thing. The one-year cycle is real, and every company is going to be forced into doing this to some extent because otherwise they won’t be able to keep up. So if Google goes all in on Grace Blackwell and on the B series, Amazon can’t be like, “No, we’re not going to offer the best chip.” They’re going to have to … So it’s going to create this pull.

Then, finally, Pat, I think I shared, but we’re starting to see end customer use cases come in, which is even more of a bullish trigger, like Walmart’s really strong earnings. Now, again, retail, we’re not going to dig into too many broad, macro topics, but retail, Pat, outperformed this week, and I think it outperformed because pricing and inflation numbers are high, which kept revenue numbers high. But these companies are coming out talking about AI. So Walmart talked about massive efficiency gains with AI. Of course, if there are massive efficiency gains with AI and companies like Walmart are investing big, you can bet those big investments are going through Nvidia.

Patrick Moorhead: Great breakdown, Dan. Super impressive. Like I said, it was fun actually talking about your one tweet for 25 minutes.

Daniel Newman: I don’t do that often, but thank you.

Patrick Moorhead: No, it was good stuff. Listen, you did comprehensive. I’m going to do simple. There is nothing that’s going to be stopping this train short term, absent a complete and total economic collapse. The reason is FOMO. As an IaaS provider, you cannot even blink on the amount of CapEx you’re going to be spending in the next 12 and maybe 18 months. The only thing that could stop that is going to be investor pressure. I don’t think they’ll get any investor pressure unless they have super declines in their core business, like Google Ads, like AWS, like Microsoft, and, therefore, they’re going to keep cranking it out because the stakes are high. I mean we could look at 10, 20, 30 years of dominance if one of these companies pulls away from the pack.

And your Walmart example was really good as a downstream one. Yes, over time, you have to have the ServiceNows, the SAPs, the app stacks at Microsoft, the app stacks at Google, even Google finding ways to better target their advertisers via generative AI. Until then, it’s the internet build-out, baby. Nvidia’s the new Cisco. What I don’t see is pet food companies and hardware tool companies with no business model selling stuff. A lot of the downstream beneficiaries are very large companies.

Daniel Newman: Yeah, no doubt about that. I know I stole a lot of that oxygen on that one, so thanks for letting me ramble on. But I just-

Patrick Moorhead: You deserved it, man. That was good. I’ll give you all-day oxygen, dude. That was epic.

Daniel Newman: I mean I’m super stoked. Aren’t you on TV today talking about this?

Patrick Moorhead: Yeah, I’m going to be on CNBC right … Not right after the bell, but around 3:45, Central, talking about this.

Daniel Newman: Love it. I love it. So, all right, let’s keep this momentum talking about more chip wins, Pat. I shared a rumor from a good source, I think, but OpenAI, are they going to Broadcom? Are they going to go to Broadcom for some help on a chip?

Patrick Moorhead: Yeah. So this was a JP Morgan report that came out that said Broadcom has “recently won OpenAI’s first and second generation AI ASIC programs, positioning it as OpenAI’s fourth AI ASIC partner”. So this brings up more questions than answers. But, first of all, if you look at a lot of what’s driving Azure right now as an IaaS service … And, yes, Microsoft has an Azure OpenAI PaaS service out there, but it’s OpenAI. That’s what’s driving a lot of this super growth. In fact, there’s so much growth that Microsoft has a partnership with OCI for GPUs. What it does is it begs the question of, hey, is OpenAI going to be going more outside of Microsoft and Azure for its IaaS services? I think that’s interesting, where they might direct it to these new GPU houses that are out there, like the ones that Chaitin from AWS went to, CoreWeave, right?

Daniel Newman: Yup.

Patrick Moorhead: Then the second thing is, wait a second, one of four ASIC providers? What? If you remember the big cameo picture shot with Jensen from Nvidia was him dropping off the first Blackwell GPU system, or at least that’s what the tweet said or the photos said. It was with Jensen and Sam Altman. But then it’s the fourth ASIC provider. Like fourth, okay. I know that Broadcom is the clear leader in this. I think Charlie had said $10 billion for XPU this year, moving to 11, and the rest, they’re picking up on networking. I think the overall was 50 billion for AI. By the way, that 10 billion number to 11 is just going to absolutely catapult.

So you have Broadcom, and then the other player in here could be Marvell. Marvell is absolutely in the hunt. Broadcom did say it had another consumer play at their AI investor day, but I’ve got to tell you, I had teed that up as Apple. They called it a consumer play, and maybe that’s just Broadcom’s way of putting it, or Meta. I was thinking Apple or Meta, not OpenAI. So you’ve got Broadcom, you’ve got Marvell. Could this possibly be Intel’s Gaudi, or is this going to be something like Groq? And that’s Groq with a Q.

So very provocative, very juicy. A JP Morgan note is not like a rumor that you pick up from some nameless, faceless thing out there on the Twitter. So it’s an amazing opportunity, and it puts an exclamation point on the idea that if you want to do something more efficiently, you do it with an ASIC, whether it’s training or inference.

Daniel Newman: Pat, one of the things that’s really interesting too is that these TPUs and XPUs … They’re not flexible like GPUs, but you’re seeing the logic cores and the combinations of head nodes and logic cores being created where they’re not so narrow that you can only … Like we’ve heard about Gemini being trained up entirely on a … People did not think that was really plausible-

Patrick Moorhead: Exactly.

Daniel Newman: … and now you’re seeing mega builds happening on XPUs. So this raises a huge opportunity. So when you heard Sam Altman running around talking about trillions of dollars raised to do the future of infrastructure … And, again, this was not just the silicon. That’s where a lot of the metrics and numbers are being derived: there’s all the silicon, then there’s the systems, then there’s the actual infrastructure, then there’s the cooling and the thermals, then there’s the actual racks and the cabling, and then you go out to the fricking materials.

I mean there’s a lot that goes in. This is what we’ve been talking about throughout the … This is not just … Because a lot of people are like, “How big is the market?” Well, we’ll talk a little bit about the market itself here soon, but what we’re really talking about is the chip as part of this bigger system and the rack scale, building up the racks top to bottom. What are all the components in the … Nvidia’s got a lot of parts in that now. But, anyways, my point in all this is that companies, the hyperscale cloud providers, want to vertically integrate. Look, none of them want to say that. In fact, I’m pretty sure some of them are banned from using that word in anything that they talk about, but we can talk about it. They want to vertically integrate because they make more money when they do that.

Also, they want to own their own silicon and silicon design. It’s a differentiator, Pat. It’s kind of like the data in generative AI. Having their own silicon and their own design is differentiating. I mean part of Google’s prowess, and why it’s been able to power up so much into the AI era, is that it was doing this for a long time. It was doing this long before it was a thing. Before the other cloud providers were really thinking about it, it had its own silicon for its own workloads; it had designed an ASIC for itself for the TPU. It wasn’t planning to sell it in the cloud. It just so turned out that it was usable when it got to that point and that people wanted to-

Patrick Moorhead: Yeah.

Daniel Newman: So I mean, Pat, you mentioned this. Broadcom is the 800-pound gorilla here. Marvell is the next up right now. They’re fighting for a handful of key designs. But, look, I mean the OpenAI opportunity, the Apple opportunity is a big one. Everybody’s going to be thinking about … I think every major hyperscaler is going to build their own. I think you’re going to even see, with what we’re looking at with mega enterprises and with what Synopsys can do and what … You’re going to see mega enterprises starting to build, I think, some of their own … When they understand their AI needs and workloads closely enough. But right now, Pat, this is a really fast-growing market and it’s a really interesting thing.

But one other thing I think you said that is really important is there is real competition, the Gaudis, the off-the-shelf TPUs and off-the-shelf Inferentias and such. OpenAI would be crazy not to look at that. There’s a lot of R&D and work that’s gone into them. So do they want to build their own? My guess, like Meta, Pat, is they’ll end up somewhere in between. They’ll want to have some that’s going to be very specific for their need. They’re going to find some that’s off-the-shelf. They’re going to keep buying NVIDIA for things that need that level of flexibility. But I think they’re going to take more control into their own hands.

So, all right, let’s bounce to topic four, Pat. I’ll keep this high level. So this week, Futurum Group and our intelligence team, we did a build of the AI GPU market and the ASIC market. We looked at a few specific things when it came to that. We really did a teardown of 2022-’23 history, every SKU, everything we could pull from public data. We went through the same process for cloud instances and XPU sales, and that’s sell-in. So those are numbers that we actually know shipped, and that’s stuff that we could find publicly available. Very interesting exercise, Pat.

There’s two things I want to talk about here that came out of these numbers. One is trying to reconcile the XPU market from what we know from Broadcom versus what I’m seeing from shipments that are going into these cloud providers. Then the second thing that was really interesting, Pat, is just how far apart the market sizing can be depending on the exercise. Our team, we put a 30% CAGR on the GPU market, Pat. It was about $36 billion this year, rising to about $138 billion in GPUs by 2028.
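[Editor’s note: the implied compound growth rate in those figures is easy to sanity-check. The base year isn’t stated on the show, and it changes the answer; a quick sketch using the approximate dollar amounts from the discussion:]

```python
# Compound annual growth rate (CAGR) implied by a start value, an end value,
# and a number of years. The figures are the approximate ones discussed:
# roughly $36B in AI GPUs growing to roughly $138B by 2028.

def implied_cagr(start: float, end: float, years: int) -> float:
    """CAGR such that start * (1 + cagr) ** years == end."""
    return (end / start) ** (1.0 / years) - 1.0

# Over five years (a 2023 base), $36B -> $138B lands right around the quoted ~30%:
print(f"5-year CAGR: {implied_cagr(36, 138, 5):.1%}")   # about 30.8%
# Over four years (a 2024 base), the same endpoints imply closer to 40%:
print(f"4-year CAGR: {implied_cagr(36, 138, 4):.1%}")   # about 39.9%
```

[So the quoted ~30% CAGR is consistent with these endpoints over a five-year horizon.]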

We’re hearing from Lisa Su and others. They’re talking about a $400 billion AI ASIC and GPU market. Now, again, how are they describing that in terms of networking and systems? Is it all in, is it the whole thing, or are they literally just talking chips? Because if that’s the case, there’s a pretty massive reconciliation between where we landed and where they landed. I will say upfront I think we were conservative on our number. I think 30% is a very conservative growth rate. I think the challenge, Pat, as you heard me talking about Nvidia, is this sales pressure on the CapEx side versus the question of at what point the market puts pressure back on all those making massive CapEx investments now to start realizing revenue. Then how much does that annual cycle, that pressure and annual cadence, create unsustainable growth if we don’t find a way to start selling all this stuff into applications and workloads? When does all that happen?

So the take was, Pat, we had 92% of the GPUs in ’23 sitting with Nvidia. I’ve seen data center numbers anywhere from about 88%. I think I saw a Petty research number come out. I’ve heard numbers as high as 95 and 96 depending on exactly where everything lands. But I guess rather than making this a big readout, because I’ll share the link to what we’ve published, Pat, I’m interested in your take. I feel like we’re conservative. I feel like Lisa and the $400 billion number have been very aggressive. Where does your head land on where this market’s going?

Patrick Moorhead: Yeah. So, Dan, it’s interesting. I play both sides of the fence here. I was a vendor for many years and I would work with companies like yours to get data forecasts. Then what we would do is create what I like to call a fusion model, which is we’re putting … And typically insiders at a company actually know this better than the data providers, but they also have the ability to do what I like to call bending the curve, which is it’s one thing to say, okay, based on all these data points, this is the size of the market. But if you’re one of the two or three leaders who is actually making this stuff happen, you know what your roadmap is. You have an idea of what your competitors are and, therefore, you can extrapolate that out in terms of the size of the market, but also the market share that you want to take.

I’d like to invoke something that Michael Dell had said years ago in a conversation that we were having publicly with the size of the IoT market in the next 10 years. Once you start getting into the hundreds of billions and trillions, his response was, yeah, it’s a big market, we all agree, and the numbers are large. So, therefore, going after that market long term is a smart thing. I think that the numbers will get more interesting as Nvidia gets more competition. I mean I don’t know, Red Bull winning everything all the time just got to be a complete snoozer. You’ve got McLaren coming in. I mean Mercedes won a few races. Now everybody’s like competition is back.

Long term, you never have one player just dominating everything. I think the best case today of total domination from a profit standpoint is probably Apple. Even though Apple only has about a third of the overall global market for smartphones, they’re taking most of the profit pool. Then Samsung and the other Android providers are picking up the scraps for profit. Now the great news about companies like Samsung is that they’re vertically integrated and they make almost all of the content inside of their devices. So they are making margin on that, too.

But what I’m really excited about is the future of ASICs versus GPUs. That’s what I am most excited about. I’m hoping, Dan, in some of your future reports, you can tease out companies like Groq. You can tease out Cerebras. You can tease out Untether AI and the long list of companies who do AI chips. Another interesting one that I hope you do is going to be on networking, whether it’s scale-up or scale-out networking technologies. The network, and the chips tied to it, is arguably as important today, because it’s a key element in throttling training time and inference latency.

Daniel Newman: Yeah. To give you a little tease there, I had some estimates on 2023 revenue. Cerebras had about 170 million.

Patrick Moorhead: Okay.

Daniel Newman: Graphcore at about 152 million, Groq at 60 million. These were our estimates. It’s all in the data, the report-

Patrick Moorhead: Oh, it’s in there. That’s great.

Daniel Newman: Yeah. It is in there.

Patrick Moorhead: I don’t have a license to the data, Dan, so I haven’t had the ability to go out there. I was just reading it on the multiple CNBC appearances that your data had.

Daniel Newman: Yeah. One of the big challenges I’m having with this, though, is the public data of what’s selling in. There’s a lot of mystery too, like what Meta’s buying. There’s no way to really … And they’re one of the biggest consumers, for instance, of XPUs. So we’re looking at cloud instances and what we know from … We’re looking at the build-out of shipments from Intel Gaudi and others, the smaller names you mentioned, and that stuff I can really track. What I’m having a harder time tracking is how many XPUs went into Meta, how many XPUs … Sorry, TPUs, for instance, did Google buy for its own use? I can track the cloud instances that it’s selling out to customers, and they have the lion’s share of instance use right now.

So XPUs are being used by cloud providers. They have the vast majority. Then, of course, AWS came in second right now with Inferentia. On the other end of things, though, the stuff that’s being used inside, like Broadcom’s pretty masked in terms of how those shipments look. So what you talked about, I think the market for XPU is much bigger than what we can size. So that’s where I think a lot of growth comes in is because there is a big amount being shipped into these hyperscalers that don’t resell. So we’re talking about the ones that are using it for their own use.

Patrick Moorhead: Yeah.

Daniel Newman: Then, of course, it is just still early how AI is defined, meaning there is still a lot of debate on what … Like we were trying to track CPU instances that are dedicated for AI use, but we know a ton of CPUs are used for multipurpose. They’re general purpose, and they use some AI. And so, those aren’t being counted. But even in just the short period of time that Grace Hopper was shipping in ’23, because it was the first version of it, I mean among the most dedicated compute cores, even though you have all the Intel going in with the H series as the head node, you’re still seeing the Grace cores being the highest core count.

So it’s really interesting to see how this goes. But like I said, the way CPUs and GPUs work together, there is a lot of CPU not in the count because it’s being used for other purposes, which means Intel’s role is bigger than maybe it’s portrayed in AI right now. And so is AMD’s, by the way, from EPYC, because EPYC had a big part too in a number of different configurations with NVIDIA GPUs and others. So all that, Pat, it’s super interesting stuff. I’ll drop the link. This has driven a ton of interest, very exciting times, and we will … Pat, I think your point about networking, it’d be very interesting to do a sort of … What did you call it? Like open Ultra Ethernet versus-

Patrick Moorhead: Yeah, ethernet open … Yeah, NVLink. I mean there’s so many elements in networking today.

Daniel Newman: In the rack, out of the rack.

Patrick Moorhead: Yeah.

Daniel Newman: In the rack, out of the rack, and what are people using? I think there’s a big op to really dig deeper on that particular thing. And so, all right, we got two more topics. We got 15 minutes. We’re going to make this, Pat. We’re going to make this show. We’re going to get through it. Let’s dive into Windows, Pat. Okay, this is yours, but Recall. You and I were both super excited about AI PC, Copilot+, and then we lost Recall. Are we getting Recall back?

Patrick Moorhead: Yeah, so a little backstory here. Recall was the premier feature for the generative AI-enabled version of Windows 11. It essentially allowed you to have an indefinite memory. Actually not indefinite, it scales with the amount of storage you have on the device. But essentially it would take snapshots of every application that is open on your desktop at various times. It would store that and then it would essentially allow you to query all of that data.

So here’s a for instance. We met with the CEO and the number two of a company yesterday. It would’ve been nice if I could go in there and do a search on their names and see all the different touch points that I had with the company, let’s say, over the last six months. It would scan what I’ve done on the web. It would scan WhatsApp. It would scan Word. Basically, it takes snapshots of every application to be able to do that, and it would give me all of these different touch points that I had with those senior executives and the companies. By the way, even on Windows, they have an app called Phone Link where I pull in text messages. So text messages would be in there as well.
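The snapshot-store-query loop described here can be sketched in a few lines. This is an illustrative toy only, assuming nothing about Microsoft’s actual implementation; the class and method names (`LocalSnapshotIndex`, `capture`, `query`) are invented for the example:

```python
import sqlite3
import time

class LocalSnapshotIndex:
    """Toy sketch of a Recall-style on-device memory: capture text from open
    apps, store it locally, and let the user search it later by keyword."""

    def __init__(self):
        # Everything stays on-device: an in-memory SQLite database here,
        # standing in for a local store on disk.
        self.db = sqlite3.connect(":memory:")
        self.db.execute(
            "CREATE TABLE snapshots (app TEXT, captured_at TEXT, content TEXT)"
        )

    def capture(self, app: str, content: str) -> None:
        # In the real feature this would be OCR'd text from a screenshot of
        # the open window; here we store the text directly, with a timestamp.
        self.db.execute(
            "INSERT INTO snapshots VALUES (?, ?, ?)",
            (app, time.strftime("%Y-%m-%d %H:%M:%S"), content),
        )

    def query(self, term: str):
        # Case-insensitive keyword search across every stored snapshot.
        cur = self.db.execute(
            "SELECT app, content FROM snapshots "
            "WHERE lower(content) LIKE '%' || lower(?) || '%'",
            (term,),
        )
        return cur.fetchall()

index = LocalSnapshotIndex()
index.capture("Word", "Meeting notes with the CEO of Acme about the roadmap")
index.capture("WhatsApp", "Dinner plans with the family on Saturday")
hits = index.query("ceo")  # matches the Word snapshot, not the WhatsApp one
```

Because the store never leaves the machine, per-application opt-out is just a matter of skipping `capture` for excluded apps, which matches the control Patrick describes wanting.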

So a lot of benefit. A lot of people really screamed, and I’m empathetic on this security nightmare and privacy nightmare. On the security side, there’s been increased scrutiny on Microsoft’s security, and their president was testifying in front of Congress on security, and the day after, Microsoft pulled Recall, in effect saying, “We are not going to be shipping this yet.” It was first opt-out, and then it became opt-in. Then it got moved to, “We’re putting this in the Insider edition,” which think of that like a beta version that you have to sign up for. Oh, and you can only access it via Windows Hello. So a lot of changes at the same time. Most in the industry, I feel, were pretty sad, particularly those people who have a stake in selling in and selling out AI PCs. So to make a long story even longer, the date-

Daniel Newman: You’re doing that today.

Patrick Moorhead: Yeah, I know. The date is not June. It’s no longer June, opt-out, on every version of AI PC. It’s now going to be October: Windows Insider, opt-in, Windows Hello. So big changes. I’ve got to tell you, I am going to use this. I’m going to turn this thing on and use it for everything because, Dan, I’m Gen X. I’m older and I need help with memory. I will turn it off on certain applications that I don’t feel comfortable doing that on. I view it like history in a browser, where you’ve pretty much got history turned on. You can go in and see everything you’ve done for however long you’ve had it without clearing it. If you don’t want to be tracked then … Or you could just delete all your history or turn it off if you’re trying to cover your tracks for something. That’s the way I see this.

Now it is aggregating a lot of information. It is resident only on the PC. It is not transmitting this to the cloud. In fact, there are over 40 different models that come with AI PCs that will enable this capability on the device itself. That’s where the 40 TOPS from Qualcomm, and soon-to-be AMD and Intel with Lunar Lake, come in, and those will come out and have operating system support for this. 15 years ago, if you would’ve asked me, “Hey, there’s going to be this service where I’m going to take pictures, I’m going to tell people exactly where I am in real time,” you’d be like, “Man, that’s creepy. That’s just like … ” Privacy is a boiling frog. We get used to it. As long as we see corresponding benefit from the privacy intrusions, we’re willing to do it. I mean Instagram, they are mining everything on us. We don’t turn Instagram off. Anyways, yeah, I really made a one-line news release pretty long.

Daniel Newman: You really did. You really did. That’s okay, Pat. This was the mega moment for the AI PC. And so, I tend to think this was a very important update. The market was kind of like … This was the feature that was like why jump in head first? I mean, of course, these things are more powerful. They’ve got great battery life. There’s a lot of cool things going on with these next-gen devices. But this ability for old people like you to be able … I’m just kidding, everybody, there. I don’t need Recall because I remember everything. But for most people, they do. The truth is, Pat, I like that you have the opportunity to opt in and out depending on the application. There are some really interesting questions, I think, that we’re all going to need to reflect on about like … So we’ve really not thought much about … Like most of our … My company runs a lot of Google apps, Pat. I think you have a Google app shop for some of your stuff, too.

Patrick Moorhead: Backend, yeah.

Daniel Newman: My point, though, is like there is … In the MSA, there are some privacy agreements, but we all know that there are also these certain alarms with companies that index the internet, like a Google or Microsoft, that when you give them … Like you hope that the … And, by the way, I genuinely believe the intent is there, but at the same time we remember the Slack thing where they told everybody what a company’s favorite emojis were. Remember that thing?

Patrick Moorhead: Yeah.

Daniel Newman: Where is my data truly my data? Beyond you and I standing up our own private data center and building out a DCF instance or something fully Linux-based running as the mainframe, we’re depending on third parties to basically … So what I’m saying, though, is there’s stuff you would not think twice about with your email, but are you going to, for instance, use Recall to index your email in certain ways? I don’t know. Or different applications, social apps. Because, again, it’s already all going out there, so what are you going to limit it to? What are the things that you’re really worried about?

I think it becomes pervasive. I really do. I think it becomes pervasive. I think the security of it all, Pat, comes down to saying, I want to run it in this app, but I want all the data to stay here. And so, that’s the big opportunity with these AI PCs: they’ve got this powerful NPU on device. It can do all this cool generative stuff without needing to always process in the cloud. So exciting times, Pat. Good moment. Congratulations on getting it out. It was a little-

Patrick Moorhead: Not out yet, buddy.

Daniel Newman: Sorry, the delay on getting a commitment to get it out.

Patrick Moorhead: Yeah.

Daniel Newman: Remember, absence makes the heart grow fonder.

Patrick Moorhead: It does. It really does. It makes me want it more. I hope while they were in there on the security machinery, they did some optimizations as well.

Daniel Newman: Is that a Frenchie? I don’t know.

Patrick Moorhead: That is an absolute Frenchie standing in front of the Eiffel Tower. It’s got a nice little cute little beret on.

Daniel Newman: It’s got a little woof. I’m like-

Patrick Moorhead: Don’t give me any ideas about what I might need to dress up my Frenchies in.

Daniel Newman: Oh, your little Frenchies. Anyways, tell them not to hop off the bed. They’ll break their legs. All right. All right, last topic. We’ve got about five, six minutes. So, Pat, you and I, we are not economists, but we sometimes like to play one. We are tech analysts, and the two things are symbiotic at times. Pat, this week, the Bureau of Labor Statistics … This is not really what I would call a partisan organization, this is a data-driven organization … revised down the jobs numbers by 818,000. Massive number. Largest, I think, since somewhere around ’08 or ’09, if I’m remembering this right. Obviously there was a reason the revisions at that time were complicated. That was the great financial crisis, in the wake of that.

But, Pat, we’ve been hearing all year: soft landing, the economy is strong. We’ve had decent job reports. The market’s been stable and steady. There have been delays in lowering interest rates. High interest rates have slowed down home buying. They’ve slowed down auto loans. They’re creating a bit of a credit crunch within businesses. It’s hard to get credit if you’re a small business right now. People are spending beyond their savings now. Defaults are rising quickly. A lot going on, Pat, and now we’re finding out the job market ain’t that good either.

By the way, this is something you and I have been seeing and feeling. We talked about the Cisco layoffs, we talked about the Dell layoffs. The entire Silicon Valley market is down to 2019 levels or below. So these big companies are laying people off in droves. So basically the data has come into the market that validates what we have been seeing and feeling and hearing, but our eyes continued to be deceived by the data that was being presented to us. Pat, this was crazy. I mean, I said it’s very disconcerting to have this kind of data come out and completely pivot the narrative. We’ve been told the economy’s great, everything’s fine. Turns out to be bogus.

Now we’re heading into an election and we’re being told that parties are going to fix the economy. Both parties are going to fix the economy, which I don’t understand how that’s possible, because one party has created this economy, and now it’s like creating a security-vulnerable product and then coming out with a security product to fix it. It doesn’t make sense to me. So it’s wild what’s going on. My quick tie to tech before I pass it over to you is, Pat, we are actually, in my opinion, just getting the validation of what we all knew was true. I think every rational person has been watching the market, seeing how many fewer jobs were being posted. Companies, our clients, are telling us about tighter budgets. They’re trying to do more with less. They’re cutting back their teams. If tech wasn’t growing, why in the world would we think anyone else is?

So it’s crazy, Pat, but I think this is exactly what has been signaled for some time. But it’s kind of like when a GPS tells me to turn right, but I can see the building I’m going to on the left. But then I get stuck in this: do I turn right? Every so often I just turn right, and it feels like most of the world keeps turning right, even though the building they are going to is right over there.

Patrick Moorhead: Yeah, Dan, there are so many things to talk about in relation to this. First of all, is this a political thing or not? Interesting, the timing of this: it dropped right in front of the DNC Convention. If you say out of one side of your mouth, “Hey, the good numbers were political,” well, this dropped right in the middle of the convention. So maybe you can say, well, it’s not political. If it’s not political, maybe it’s based on the data collection that … And we debate the means and methods on CPI a lot and how this can be this far off.

By the way, based on what Yahoo Finance said, at least, this is the biggest downward revision. They only went back to 2018, but in six freaking years, and it’s a massive one. And so, there’s that element of it. Is it just a bad methodology? When I think broader in terms of what is happening and why our economy is not doing well, it’s super clear to me. We’re spending too much money. The government is way, way, way spending more money than it can bring in, and not putting as much investment into true longer-term economy-growing things. You can short-term spike growth by increasing government spend on government stuff. In fact, in the last 10 years, we’ve seen a larger increase in government workers and government spend than in the complete history before that.

Then you’re thinking, okay, well, why don’t we vote for the party who’s going to fix this? I am absolutely confident that no party is going to attack this. The reason is because it will not get you voted in. I’m hearing, “Hey, we’re going to grow our way out of this,” on the Conservative side. On the Democratic side, it seems to be, “Hey, we’re going to tax our way out of this by soaking the rich.” By the way, and Steven Sinofsky had a great analysis on it, you could take all the money, literally all the money, not just the taxes, from the top hundred richest people out there, and it wouldn’t even make a dent in the deficit, not even a mark. So you can take all of the money away, kind of like the Russians did in the 1917 Bolshevik Revolution, where they came in and they took all the gold and they took all the money from all the rich people and they nationalized everything. That’s not going to help us.

So I’m not that optimistic about that. My final comment on this is that there’s also a school of thought that says, hey, as long as we’re not the worst of the G-7 or the G-10 based on valuation and currency, we’re going to be fine. Here’s the problem. At some point, we all default on our loans and we can’t pay it.

Daniel Newman: Yeah.

Patrick Moorhead: Then it’s like, “Well, hey, if we all default, then … ” It’s like, no, that’s not going to solve this long term. Yeah, I’m not optimistic, Dan. I’m really not.

Daniel Newman: Yeah, and just a stat from the Tax Foundation: half of taxpayers pay 97.7% of the income tax. And so, this idea of people … And, again, I agree that people should pay their fair share, but I think it’s always taken out of context, because people are paying their fair share, and then we use it to create this populist movement against the people who pay. It creates a lot of resentment from job creators, from entrepreneurs, from business leaders who, by the way, are taxed from 50 different angles on every dollar they make. I’ve got to run, but we’ve also got this whole thing now where they want to tax money you haven’t even made yet. That’s a topic that’s come up in … I put my post out on that, Pat. Absolutely one of the worst ideas ever. I can’t believe it’s even being taken seriously right now. I don’t know. I guess I’ll join the-

Patrick Moorhead: We have two case studies, Soviet Union and China. Once you disincent entrepreneurialism and moving up the chain, you lose innovation. People lose their motivation to work. Even those two countries, even though I know it’s CCP, they very much have changed their stance. Soviet Union is no longer. It’s Russia. I mean really there’s only two communist countries, North Korea and I think maybe Angola. Sorry. And it’s like that’s it. It does not work.

Daniel Newman: I don’t know. Things are going pretty well in Venezuela.

Patrick Moorhead: Yeah, I know. They actually are. Just cut out the fat and stuff that we don’t really need, and then-

Daniel Newman: That’s Argentina. Venezuela’s a mess.

Patrick Moorhead: Oh. Okay.

Daniel Newman: No, I know where you’re going with that. I know where you’re going with that. But, yeah, listen, dude, we could jump in and dive on this all the time. We won’t do it all the time because there’s so much tech stuff to cover. But, look, a strong economy starts with a strong tech market, and it is what it is. Tech is the driver. It’s deflationary, but it’s also productivity growth out the yin-yang. That’s an official term, out the yin-yang. Listen, we’ve got to run, Pat. Great talk, great conversation. Always love my Friday. Way to start the day. Good job this week at our secret meetings in undisclosed locations. I’ll see you at VMware Explore on Monday, Pat. Have a great weekend.

Patrick Moorhead: Take care, man.

Daniel Newman: See you soon, buddy.

Patrick Moorhead: See you, bud.

Daniel Newman: Bye, everybody.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x Best-Selling Author including his most recent book “Human/Machine.” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and Former Graduate Adjunct Faculty, Daniel is an Austin Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
