We are Live! Talking Google, Intel, Pure Storage, HP Inc., HPE, and Salesforce

On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:

  1. Google Cloud Next 2023
  2. Intel Habana Gaudi 2 Leads on Hugging Face Visual Model?
  3. Pure Storage Q2 2024 Earnings
  4. HP Inc Q3 2023 Earnings
  5. HPE Q3 2023 Earnings
  6. Salesforce Q2 2024 Earnings

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: Hi, this is Pat Moorhead and we are here for a Thursday Six Five podcast. Dan and I are just coming back from Google Cloud Next in San Francisco. Dan and I are going to be busy, I don’t know, maybe racing cars tomorrow morning when we normally do the podcast. So here we are. We’re going to do this. Daniel, how are you doing, my friend?

Daniel Newman: Hey, buddy. Happy Thursday.

Patrick Moorhead: I know.

Daniel Newman: You should be just as excited to have the honor to be with me here on a Thursday as you are on a Friday, and we also saw each other on a Wednesday, Tuesday and a Monday.

Patrick Moorhead: I know. I mean, I pretty much see you more than my entire family, so I guess you are the extended family, the work family.

Daniel Newman: I am the reason you get out of bed in the morning.

Patrick Moorhead: Very much so.

Daniel Newman: So it’s great to be here. It’s great to fit it in. And yeah, listen, there’s a chance. It’s not high for those of you that might be rooting for it, but there is a chance that tomorrow I crash and burn. Because Pat, you and I are going to be taking the Six Five live on the road, literally on the track. We’re going to go fast baby. And in the words of the great Ricky Bobby, “If you’re not first, you’re last.”

Patrick Moorhead: I know. No, it’s going to be good. We’ve been talking about getting out to the Circuit of the Americas and racing on that track forever, and now we’re actually going to do it. But hey, let’s jump in. If it’s your first time at the Six Five, we try to cover six topics, five minutes each, usually 10 because we like to talk and listen to ourselves talk. We do talk about publicly traded companies, but don’t take anything we say as investment advice. This is for educational and hopefully entertainment purposes only.

So with that, I mean we have a great lineup. We’re talking about Google Cloud Next 2023. We’re talking about a report that Hugging Face brought out saying Habana Gaudi 2 leads on a visual model. I think that’s interesting. A lot of people have counted Intel out, and we’re going to go down the line on earnings, Pure Storage, HP, HPE and Salesforce. Dan, good job on the chyrons there.

Daniel Newman: Hey, listen, there wasn’t an overall chyron, is it a chyron?

Patrick Moorhead: I don’t know.

Daniel Newman: It’s an overall-something-ron. And hi, Ron, by the way, if you’re watching. Ron Westfall, one of my great analysts over at Futurum. Anyway, but yeah, there wasn’t one. It was a busy week, Pat. And by the way, doing this on Thursday actually helped us be more declarative in our picks because there were so many freaking earnings this week. And after the bell today, what do we get? Broadcom, we’ve got Dell, we’ve got MongoDB.

Patrick Moorhead: MongoDB and Elastic.

Daniel Newman: And Elastic. At the very least, I mean, that’s just what I can think of off the top of my head. And for those of you that were waiting on the edge of your seats for Pat and I to analyze those earnings, you will have to come back next week. I’m sorry.

Patrick Moorhead: You know what? Do it. Absolutely. Let’s jump right in. Google Cloud Next 2023. And Dan, as you know, I like to call my own number-

Daniel Newman: You do. You are selfish.

Patrick Moorhead: … when I’m hosting it, and let’s dive in. So as you would expect, Google Cloud Next 2023, the primary volume knob was set to generative AI. And this has essentially been the norm since November when Microsoft and OpenAI got together: boards issuing edicts on, hey, what is our generative AI offering? What is our story? And everybody has been in pandemonium since then. Analysts love pandemonium because people are looking for our feedback in a big way, and it’s not only the enterprises, but also the technology companies and the markets. Everybody wants to know.

I mean, we’ve seen stocks go up $100 billion, stocks go down $200 billion. But hey, Google Cloud Next. So Google, their cloud business has really picked up steam over the last couple of years ever since TK joined, and he was very clear that, Hey, we’re going to meet customers where they are in their journey as opposed to asking them to go through different hoops or rev their software every year like the consumer side of Google has. But this year did not disappoint on generative AI.

And Dan, I think it’s fair that we gave Microsoft the early lead because they had opened public previews before anybody else, and I have to give Google credit here where Duet AI, which is essentially the generative AI agent, is GA for Google Workspace and I believe for Google Cloud. And I think that is a big deal and what a turnaround. And I think you and I were both in agreement that this is not a sprint, it’s a marathon, and being first with GA or being first with preview or being first with this and that is great for kind of chest thumping, but it doesn’t necessarily determine who in the end is going to win.

Now, Microsoft does have a massive business lead over Google Cloud right now, but what I find fascinating is that at least, again, probably every 10 days I talk to an enterprise and invariably they tell me, “Hey, Google Cloud is not our primary cloud provider, but we’re using them for some sort of analytics or AI capability,” and I think that Google has the ability to pick up market share here if you fundamentally believe that generative AI is the new basis for applications to come out. This is Google Cloud’s big play here, and I was super impressed to see that they went GA.

I still have a little bit of research to do because what I have found, at least on the Workspace side, is that GA doesn’t necessarily mean it’s going to show up immediately. In fact, I waited a year to get Threads into Gchat after it supposedly went GA. Again, I need to get my facts straight on that, but my backend is Workspace. My front end is primarily Microsoft, so my team is chomping at the bit to be able to press one button and convert that document into a presentation and that presentation into a document. And that is a killer app if you ask me. Some of the stuff that went on in the background were new C instances. We saw two new instances. One is AMD fourth gen, which is in preview, and then we saw Ampere with the AmpereOne that is an upcoming preview, I think it said previewing next month. And that adds to Intel’s fourth gen Xeon that is generally available.

One thing that I need to really drill down on is the TPU. I had people, probably some Google people, hitting me up on social media telling me that Google does 90% of their training on the TPU. And by the way, every year I go and ask, “Hey, what percent of workloads are done on TPU?” And it has never been higher than 20%. And I think the difference here is, and again I’ll do the double click on this, Google consumer, the ads business, search, uses the TPU a lot for the types of things they do, primarily machine learning and deep learning. There was a nice chart out of arXiv that was shared, and I’ll put this in the show notes, that lays that out. I don’t get the feeling that the TPU is used a lot in Google Cloud, otherwise I think that Google would be doing a much bigger job of promoting it.

On the other hand, I was told this week that the TPU is sold out, so I’m trying to get my hands around this. I think this matters because, first of all, Nvidia is selling H100s with supposedly 1,000% gross margins. If, like AWS does with Trainium and Inferentia, you can move a lot of your customers to your own custom silicon, you don’t have to make those 1,000% gross margins, right? You can make 100% gross margins and keep a good business, and you’re going to keep a lot more profit for yourself. And you could also, if you want to play the cost angle to get market share, go down that route as well. Microsoft, on the other hand, with Azure doesn’t have a custom GPU or custom silicon for inference or training, and I think that’s going to become an issue for them over time if they don’t figure that out.

Back to the C instances, I cannot wait to see the performance that AmpereOne brings out. Google does not do comparisons to Intel, at least publicly. And it’s funny, it was AMD versus AMD. It was Ampere versus Ampere, and it was TPU versus TPU. So a lot of work to do, a lot of research to do. It was a very professionally run analyst track that they did. It was great to spend time with Thomas Kurian.

As I’ve said before, one of my biggest research methodologies personally is to better understand senior leadership. And if I don’t meet with senior leadership and get a good feeling about what the future looks like, I have to talk about that. I did get a good feeling from Thomas on growth in my conversation with him, and this is not NDA or proprietary, so I feel like I can share this. He was very surprised at how many customers were coming to him with very specific asks and to see if generative AI can help with those. I think it’s a really good sign for the industry. I think it’s a really good sign for Google.

Daniel Newman: Yeah, good analysis there. I’m going to macro this thing, because that’s what I do best. First of all, Google Cloud has found itself in a moment, and this is the moment where we’ve seen a shift away from traditional compute, where other cloud providers have had tremendous strength, to AI-focused compute, where Google, not exactly like an Nvidia, but similarly, has been built for a very long time. Remember, Google is Brain, Google is DeepMind.

Google has been doing recommendations and generative for a long time. By the way, if you’ve been using Workspace for any period of time, there was this little generative feature where it would complete your sentences. I know that until November of last year, when a large language model was deployed and made public and people were using ChatGPT, no one noticed that, but that was a generative capability right there, generative text.

So this wasn’t brand new. The company had been developing Bard and PaLM for a long time. It did it in secret. It didn’t come to market. It was pushed into coming to market. It fell on its face severely in its initial come to market. But I actually think, as I say to my children all the time, it’s not about the mistakes, it’s how you come back from them. And so it made a big mistake. It raced to compete. And you mentioned something when you were talking about the race for GA, and in some cases being GA and being first isn’t as important as you’d think.

It’s interesting now because, kind of seeing how this is trading in market, Microsoft came out with an OpenAI or ChatGPT-powered search, and there was this very bullish sort of sentiment about market share, and it didn’t, at least so far, amount to very much. And by the way, on the other end of things, Google has raced out with its early-to-GA Duet capability, and they have what? About 10 million Workspace users? Which is, it’s palpable.

Patrick Moorhead: Paid.

Daniel Newman: Paid.

Patrick Moorhead: Student, yeah, real paid.

Daniel Newman: Good call. But what I’m saying is that’s a pretty small fraction of the number of paid Office users. And the reason I point that out is we’ll see if putting in the ability to generate your emails, write your documents, create your PowerPoints moves the needle of people off of Microsoft, or if we’re just trading on early GA. But I was impressed by the fact that Google did get this into market first. And yes, we also use Workspace as part of our backend, and I don’t have it yet, so I don’t know what GA means, but it’s not DA, meaning Dan available, at this point yet. So if you all could make that happen, feel me. Feel me, I’d like to play with this a little bit.

Other things that really caught my attention, because you covered a lot of ground, Pat. One was the strength that Google has for the longer tail based upon its winning ways with generative unicorns. Now, you and I had a chance to be part of a small group and some private interactions with Thomas, and the non-NDA part of it was we talked to him, I asked him a question: why are they winning? And I thought he did a very good job of being able to explain why apparently 70% of gen AI unicorns are building on Google. Now, he wasn’t declarative about whether or not they’re using Google exclusively, which was something I wanted to know, but of course I don’t think anyone in his shoes would’ve answered that question, but we can’t help but ask, right? That’s our job.

But what he basically did say is one, the economics of Google are very good, meaning the economics have enabled startups to very efficiently deploy their AI projects. Two, the Vertex environment, powerful software, highly capable. And then the third thing he really focused on was Google’s impressive developer ecosystem. Now, I think other companies could make that same argument as well, but you do need to repeat that number, Pat. 70% of the unicorns, meaning billion dollar valuation plus companies building gen AI tools, technologies, and software are using Google Cloud. So a very impressive number. And I’ll end my thoughts here as we did talk to … Pat, you talked about talking to enterprises.

I won’t say whom we talked to, but we did talk to a large financial data institution and we did get some interesting feedback. At least it was interesting to me because we’re hybrid cloud people. And it was very interesting to talk to a large data company that’s dealt with major security issues in the past and find out that they are all-in on cloud. So there is a story to be told that some big companies have bought, hook, line and sinker, into the idea that their entire enterprise, maybe it’s multicloud, can be run all in the public cloud. And I thought that was kind of an interesting way to end our event, talking to a customer and hearing that, because you and I still struggle, I think, to see how that can be done based upon the complexity of data and infrastructure environments inside of an enterprise.

Patrick Moorhead: Yeah, this company had a burning platform in that they were hacked big time and were looking for answers, but it was interesting. Great analysis, Dan. Just the final boomerang that I wanted to add here was people forget that, albeit Google Cloud is the number three or number four cloud provider, and I think Thomas said, what, it’s a top five software firm if you look at revenue, with a little hint that he might move forward there, they have the largest infrastructure on the planet of any company out there, bigger than Microsoft, bigger than AWS. But they don’t really talk about that, because they don’t want to potentially scare people off given their consumer capabilities.

I think that time’s over. I think they need to come out and start leveraging that and flexing their planet-scale infrastructure out there. Maybe being a little bit more forceful with talking about the TPU. Let’s move to the next topic, and imagine that, it’s an AI topic. Hugging Face actually published some benchmarks that show that Intel Habana Gaudi 2 leads Nvidia’s A100 and H100 on a very specific model. Dan, why don’t you kick this one off?

Daniel Newman: And there you have it. That was it. That was all I had. But no, I mean, look, first of all, you ended where we’re kind of picking up, but in the near term we’ve got this gold rush to train, and this gold rush to train is basically largely a train that only goes to one station, and that’s Jensen Huang’s kitchen. That kitchen is going to be so awesome by next year, I don’t know, but I think there’s a printing press being created in the Nvidia corporate headquarters.

But having said that, in all seriousness, whether it’s talking to Thomas Kurian about the capabilities of the TPU, whether it’s Intel, whether it’s AMD’s MI offerings, whether it’s Groq’s accelerators, whether it’s Lattice Semi’s FPGAs with vision capabilities, there are a lot of semiconductors that can do AI, but right now the market’s impression is that there’s really only one, and that’s because there’s this gold rush to train large models and to build foundational models.

Because right now, the ability to use AI in your business using unique data sets is sort of the next frontier of opportunity for productivity gains and efficiency gains. There’s also this kind of overwhelming impression, Pat, and this is where I think we can kind of have a little bit of a convo/debate, that Nvidia is the only company that can do it, and that basically there’s a reason companies are waiting three to six quarters, depending on who they are, to get their hands on an A100 or an H100. And it has to do with a combination of the fact that Nvidia has really what’s considered to be a full stack of capabilities, the programming developer ecosystem around CUDA, but also just the fact that it’s sort of the universal and most capable, most powerful.

But the truth is there’s a couple of trend lines going on that are important to note. The one is, well, training is the next immediate frontier of opportunity. Longer term, the ability to accelerate workloads, and even do that on traditional general purpose compute, is actually a very large opportunity around inference. The other thing is that when you’re training very specific workloads, there is something to be argued that an ASIC, a semi that’s built very specifically to accelerate a certain type of workload, could end up outperforming, and that’s what Hugging Face…

So this wasn’t an Intel piece, but it’s a partnership and a relationship. Intel and Hugging Face have very publicly been out there that they do have this relationship, but effectively announced that when they were training these vision language models, these very specific kinds of models, the Habana Gaudi 2, which is the ASIC from Intel, actually performed substantially better than both the A100 and the company’s newest and most powerful GPU, its H100.

And so while this, again, Pat, I think is, it’s a little apples and oranges, because obviously when you’re buying Nvidia, you could argue, I don’t know what we’re doing in all cases with this, so we want to have this most powerful general capability to do all things AI. But with many companies building out specific foundational models, specific language models, the idea of being able to train more efficiently, and by the way considerably more price efficiently, becomes very interesting. So this whole BridgeTower on Habana Gaudi 2 result I think brings a really interesting debate, Pat, and it’s kind of two debates for me. One is … and by the way, we had a kind of a similar conversation around Groq with the LPU, they call it, right? An LPU, a language processing unit. But what is the capacity and aptitude for companies to go down the path of using an ASIC or a chip very specific to that?

And two, what are the constraints, meaning what are the reasons, knowing that these are actually available today, they can be utilized right now in instances both in the public cloud and purchased for on-prem, that companies aren’t more thoughtfully considering the utilization of this technology from both an economics and a capability standpoint, Pat? And so to me, like I said, it’s early days, but I think what we’re starting here is a real conversation about the fact that there is a very powerful market position around the Nvidia products, but there are competitive offerings in other forms that in many cases, for specific kinds of workloads, could become very compelling. So rather than droning on, I just want to put that out there and maybe bounce it to you and maybe go back and forth a bit on this one.

Patrick Moorhead: Yeah, I like to get back to the basics. I’ve been around chips for over 30 years, and one thing has always been true. There’s been a continuum of efficiency and programmability. The more efficient, the less programmable, and going from left to right, you have the CPU, the GPU, the FPGA and the ASIC. The challenge with the ASIC, again, like you said, is always how do I program that? And what they do is they put the flexibility in the software to be able to run different workloads, but when you get it there, it’s going to be a heck of a lot more efficient than a GPU or a CPU.

To be clear, Nvidia does have ASIC blocks on its GPU, right? They have transformer engines, they have some … Heck, Intel Xeon has four different ways to accelerate AI. So it’s really this continuum. So it’s not ASIC good, GPU bad, or GPU good, ASIC bad. The GPU has taken advantage of the flexibility that particularly hyperscalers want to be able to go to the next best thing. I mean, heck, a year ago we were still talking about recommendation engines and visual AI and object detection and recognition and self-driving cars. Right now in this generative age, we’re just doing some of the most wackadoodle stuff out there with the GPU. How do you think all of these initial LLMs were trained? They were trained on the A100, not the H100. The H100 is just a beast of a device that cranks out foundational models and is a lot more flexible. And that’s the key, is flexibility.

One thing that interested me in this one as well was that this wasn’t inference, and this wasn’t training, it was fine-tuning, and also I found it interesting that it wasn’t Intel people. These were Hugging Face people. So that gives it a tremendous amount of credibility. But what all the listeners and viewers need to understand is that Habana Gaudi 2 won’t have the same level of advantage over Nvidia or even AMD in all use cases. This is a very specific use case using a very specific model, which was very similar to what we saw with Groq using Llama-2-70b. I also don’t think that this can claim to be a large model, given the size. This is not one of these 70 billion parameter models. It’s almost a billion parameters in total. But anyways, read the notes, read the show notes. Dan, any final comments?

Daniel Newman: No, I mean, look, it’s an interesting inflection. There’s a lot of market concern and question right now about whether or not there are other companies that are set to benefit from this AI gold rush. The disproportionate amount of revenue that’s gone in one direction, the train only goes to one station, does bring up some relevant discussion points about healthy competitive ecosystems, about the need for alternative routes for enterprises, hyperscalers, and small businesses to be able to benefit from AI. And in the longer run, how much do people that are running Salesforce with some sort of attrition risk algorithm care about what hardware that’s being run on? And I think over time, companies are going to look for efficiencies, especially on the inferencing side. So I think it’s an interesting debate and conversation to keep having, and I don’t expect that it’ll be the last time we’re going to have it.

Patrick Moorhead: Yeah, Dan, I mean graphics used to be done on a CPU, and then they put fixed functions in to do 2D graphics. We used to have an accelerator to use with spreadsheets and to crunch numbers, right? It was a plug-in chip, right? It was a math accelerator, and then it got sucked into the processors. So historically these things should calm down, but until they do, GPUs are going to have an operational advantage.

All right, let’s move on to Pure Storage earnings. Another company that, by the way, is very, very integrated into AI. We knew what they were up to with machine learning. They talked a lot about Meta and how they work with a parallelized workflow inside of the company. But Charlie gave a good overview of how his company is supporting, from a storage point of view, what they call the world’s AI projects. They had Meta on there, they had SiriusXM, they had Health 2030 Genome Center, Folding@home, a bunch of pretty big names. Not the big names other than Meta that you might expect, but my guess is a lot of customers don’t really want to talk about that.

Financially, they had a top and a bottom beat. They did talk about winning a big eight-figure generative AI deal, which was good. Storage as a service, it’s funny, you say SaaS to all the infrastructure people and it’s like, “Oh, software as a service.” No, storage as a service doubled year over year. That is phenomenal. ARR up 27% and just the fricking eye-popping 70.7% gross margins. Who says that there’s no margin in hardware?

Oh, by the way, fun part is they actually spend more on software than they do on hardware. So are they a hardware company? Are they a software company? I don’t know. They’re like Apple. They’re fully integrated in how they operate. The cool part about their future here that I find is that, if there’s a bingo card of storage, aside from mainframe and very large systems, they finally play in block storage, file storage, object storage. They’re doing SLC, they’re doing QLC to hit price points and performance. They have a chassis, from a sustainability standpoint, where you’re not throwing away the entire chassis to get better performance and better capabilities. It’s essentially a bladed architecture where you move capabilities in and out, you can pull out and push in systems. They’ve also gotten into the data management space with things like ransomware protection, deduplication, things like that. So the company’s on the move. I didn’t get the chance to talk to Charlie this quarter, but hopefully we’ll get to chat with him soon.

Daniel Newman: Yeah, there you go. I mean, look, I feel like quarter after quarter, this is another company that operates well, delivers consistently. It’s in an area that’s somewhat seen as mature, as storage is, but is trying to really bring innovation to this particular industry. On a macro level, they hit on some very good things. One, their subscription numbers are growing substantially. Mid double-digit growth. Two, they’re seeing their Evergreen product grow even faster. Doubling prescriptions to many pharmaceutical-

Patrick Moorhead: You need a subscription, Dan?

Daniel Newman: I need a subscription. Their customer growth, substantial. 12,000 plus customers now. And then of course, Pat, this is something you and I have talked about a lot. They are a customer experience led organization. They’re very focused on delivering best of class customer experience to the industry. 81.4 NPS score, Pat, they post this.

Patrick Moorhead: Isn’t that crazy?

Daniel Newman: Well, they post it because it’s legitimately twice as good as their next best competitor. They are really a wake and serve organization and I like that. And so I actually will have a chance to talk to Charlie this week. I look forward to being able to interact with him a bit more. But I think the company’s doing a good job of balancing innovation, driving towards ARR, and of course keeping the fact that they’re customer-centric as really what makes their identity unique. You hit on some of their diversification, Pat. I think they’ve done a good job on that. They did slide an AI slide into their deck.

Patrick Moorhead: Which I read.

Daniel Newman: Which did you read, the deck or the slide?

Patrick Moorhead: I read the slide when I started. Yeah.

Daniel Newman: So Pat, we kind of joke sometimes about how you can’t build apps on air. Well, you need somewhere to put all your data as well because that data needs to be accessible. So Pure does legitimately have an opportunity to play in that space. So the guide was, like usual, conservative but positive. Overall, Pat, not a lot to dislike about this quarter’s earnings report.

Patrick Moorhead: Yeah, it’s interesting. I can’t wait to see how VAST Data and Pure collide together, right? Who’s going to be the king of the hill for AI based storage workloads? It’s going to be an interesting one for sure, because every three to five years a new storage upstart kicks off and tries to knock somebody off, but very few of them end up dying, right? I mean, look at NetApp. I mean, NetApp did not have a good quarter and their differentiation seems to be evaporating.

Daniel Newman: The other Kurian, by the way.

Patrick Moorhead: Yeah, exactly. Exactly. And then you’ve got Lenovo really dominating the low-end and the mid-range storage right now. Fun stuff. Hey, let’s go to the next one. HP Inc Q3. What happened there, Dan? Did they catch the same bug that the entire PC industry has?

Daniel Newman: Yeah, so sequential growth; year over year, not as good. I think on the EPS side, they managed. On the revenue side, they fell a little bit short of expectations. And really, we had the chance to talk to their CEO, Enrique Lores. Comments seem to be kind of consistent on a quarter-to-quarter basis, but what’s also been consistent with some of these OEMs and some of these chip companies is, as much as they want to be able to read the tea leaves into the future, the tea leaves are not clear as to when this market for PCs is going to really turn.

And when you’re in the businesses that HP is in, both PCs, peripherals, printers, it does all sort of … unlike some of these other companies that can maybe trade on some diversification into infrastructure, cloud, they’re very dependent on sort of a similar cycle, which would be for all these PC replacement cycles, peripheral replacement cycles. And so for HP, it was a very kind of muted, conservative call. I mean, I think their quarter to quarter growth is indicative that maybe there’s some healing in the economy coming forward.

I think they did have the ability to claim some market share gains, which were positive for the company. Of course, when you dig into market share gains, you also have to dig into some secondary factors like ASPs and what parts of the portfolio did those market share gains come from. And so we’ll have a better idea, Pat, of that when we see Dell. And so where did that gain come from?

The overall sort of sentiment is … this is probably the one thing where I’m not sure what Enrique and his peers are supposed to do, and this is the AI conundrum. Enrique was not particularly outspoken about how AI is going to impact HP’s business. And the truth is that at the edge and on the device, I think we all kind of know it’s going to happen. But what we don’t know yet, and this is the same thing that handset providers are kind of struggling with too, is what are people going to pay for? So meaning, for the ability to run a large language model on a PC or on a device, will people pay more for an iPhone, for an Android device, for a PC to do that? Now, I can imagine in workstations we’ll see some momentum because of the kind of workloads that we’d want to be able to run locally.

And so you could see some spend on a more powerful GPU, CPU combination long-term on workstation. But on PCs themselves, when you’re doing a lot of inferencing in your applications in workspace, Microsoft, Salesforce, how much more powerful does the computer need to be to do that? And is there an AI cycle for what I would say everyday usability for inclusion of AI versus only those specialized workstation requirements? But the guide was very conservative, Pat. There didn’t seem to be a clear understanding of when the turnaround was going to come.

But what I do see is a company that has fixed some of its supply chain challenges, that is working hard to win share, that is continuing to deliver cashflow and profit to the bottom line. But it was kind of like I said, a bit of a muted quarter for the company where I think they’re looking to the future. Two to three quarters out from now is where I sort of feel like that real momentum is going to get picked up, but I think that’s just become the status quo for companies that are focused on devices.

Patrick Moorhead: Yeah, that’s exactly how I was going to start. The entire PC market is muted because there’s so much uncertainty. There’s interest rate uncertainty in Western Europe. A little bit more interest rate stability here in the US. China is a complete unknown. I think the good news is it does look like the channel inventory has been pretty much swept through. I mean, I think there was a lot to be excited about if people did the double click. I mean, profits are up, right? We’re not in a price war, although ASPs did go down and we’re going to have to wait for Dell’s earnings to see what happened there.

They gained unit market share, which I think is a plus. They did have sequential growth, which, by the way, if we went from a calendar Q3 to a Q4, would be less impressive to me. But calendar Q3 is always bigger than Q2 and it has been for 20 years. So I’m not as impressed by the sequential growth, because that’s just the way the market’s timing works. It was nice to see, though it unfortunately got buried, that Poly was up. That was a plus. I think that may have been one of the only businesses that was up aside from some other growth areas. So that was a positive. So I think the whole industry is uncertain here in the second half. And I don’t know if we’re even going to get clarity in the first half.

To your point on AI, Daniel, if the PC industry wants to have its moment with Microsoft, it’s likely going to have to create a new category of PCs, like the AI PC or something, to make it black and white what that means, and they need to show that it increases ASPs, it increases profits, and potentially signals a new wave of PC innovation.

Let’s dive into the next company here. Not to be confused with HP Inc: HPE, the data center-to-edge company, the as-a-service company. Now listen, I had a great conversation with Antonio Neri yesterday, and to be honest, I really wasn’t expecting much given what we saw from Lenovo ISG, that they were off 8%, NetApp was off 10%. We didn’t have Dell yet. We did have Cisco, which crushed it, right? 30% on networking. But HPE is not the same type of networking company. But wait for it. The Edge was up 53%. That is absolutely mind-boggling, and I just go numb when I look at that number.

This Aruba acquisition has got to go down as one of the best acquisitions that I’ve seen in all of tech. And ironically, when HPE and HP Inc were the same company, there was just a string of horrible acquisitions, so nobody really knew. But to Antonio’s credit, it has gone swimmingly well. So HPE is also an as-a-service company; ARR was up 48%. They increased their EPS and free cash flow forecast.

So for the future, I’ll be looking for that HPC and AI number to get really big soon, once the remaining national labs come online. I think there’s three to go, and they’re bigger than the first one that went online. And of course, what also goes into that AI number is the training as a service on some of HPE’s crazy fast hardware out there.

The final thing on the Edge: SD-WAN and 5G haven’t even kicked in yet on the Edge. And that can roll a completely new cycle out there. And it’s weird, as I was going through the numbers, Dan, and I was looking at ARR or that big ARR crawl chart. Is this company starting to look more like Cisco, that has a lot of subscriptions and a lot of software but is grounded in an infrastructure business? I don’t know. We’ll have to look at it. And by the way, the multiples between HPE and Cisco are radically different.

Daniel Newman: Who do you think is higher everybody?

Patrick Moorhead: Well, I hope they know. No, it’s Cisco by leaps and bounds.

Daniel Newman: I was just seeing if our audience is smart.

Patrick Moorhead: Of course they are. We don’t have any dumb listeners.

Daniel Newman: I’ve always been under the impression that we have the smartest listeners on the planet, all 5,941 of them right now. So again, this is a really weird thing and I hate to keep using the AI train, but the AI train would indicate that any company that could sell GPUs should be just crushing it right now. But we all know that that’s not how the market actually works. First of all, those that actually had an understanding of the impending demand were first in line to get their orders in.

Second of all, the hyperscalers are getting the vast majority of the available GPUs. So when you look at a number like this and you see server sales down or compute sales down, you’re kind of blown away. Now the good news is the pricing power is immense and it’s helping these companies show better margins and being able to have better results on less revenue, which, who doesn’t like that?

The revenue growth was conservative, but it was there, and by the way, pretty consistent with what it’s been for some time. We did see strong free cash flow creation. Pat, to answer your question, I mean, look, I wrote the original white paper on everything as a service with GreenLake and why the company was well positioned. I did that five or six years ago. I know, I saw this. And the thing about it is that you can argue where workloads will be placed, but what you can’t argue is the way they want to be consumed. Meaning people want to consume, whether it’s on-prem or in the cloud, in a much more public cloud-like fashion, okay? People want to consume software licensing like SaaS. People just want that model of constant care, service, SLAs, updates, support. And so GreenLake is always-

Patrick Moorhead: CapEx to OpEx too.

Daniel Newman: And wherever it can. And in many cases, depending on the company’s situation, CapEx to OpEx is also seen as a good thing. The growth of ARR, the growth of total contract value, the AI story is in there. The overall business performance was very good. And Pat, I think you mentioned it, and this is just something to point out, but there will be a next frontier of opportunity for not just AI but business at the edge. And HPE has really crushed it in this particular area.

It’s kind of been a quiet crushing because while its overall business has kind of been in that mid single-digit up or down across the different categories, the Intelligent Edge business, which is the Aruba part and of course the rest of its edge portfolio, is growing mid to high double digits and it’s doing that pretty consistently, Pat.

And so when all this data at the edge, and the inferencing that’s required to be done at the edge, continues to proliferate, that puts HPE in a really good position longer term to monetize, A, this as-a-service thing, and B, this whole AI wave, which is going to be a potential opportunity. The other thing, by the way, is I don’t know about the forecasting to be able to meet demand, Pat, but if you look at what Nvidia is forecasting and the ability over three or four quarters to deliver, I would imagine the backlog, and the amount of that backlog that’s tied to AI servers and tied to higher margin, could really prop up the quarterly results over the next few quarters for the company if they’re able to, A, deliver on all that demand and, B, continue with this strong pricing power, and there’s nothing to indicate yet that won’t be the case going forward. So solid results, good times. Let’s go. I think we’re on a pace to finish this thing in only 10 minutes a topic.

Patrick Moorhead: About that. Like I said, if you were paying attention, it’s a Six Five, but a lot of times it’s Six Ten. Let’s jump into Salesforce, not infrastructure, but a whole lot of AI. So listen, Salesforce had a beat, beat, beat. They ran the tables on this one. The growth was spread across most businesses uniformly with an exception. And that was data. That was up 16%. The rest were between 10% and 12%. And I have to think, and I’m just taking a whack at this, what’s the one thing you have to do before you jump into generative AI? You’ve got to get your data in line. That is the absolute number one thing. And Salesforce has a data cloud that melds on-prem and in the public cloud, so it’s very capable of moving that data back and forth.

On the call, Marc Benioff reiterated the four foci of what he’s trying to do. And then he talked about a fifth. So first he talked about restructuring, and second, he talked about reigniting our performance culture by focusing on productivity. I think that’s short for, let’s have some layoffs. Let’s clear the deck at the top and let’s get people back into the office. Third was about core innovations. Fourth was focus on the investor. And the fifth, gosh, guess what the fifth is, Dan. Could it be AI? So a lot of good. By the way, if you go in and hit the transcript, there’s a lot of juicy stuff in there. The thing I want to end on, and by the way I am looking forward to going to Dreamforce and getting the lowdown here, is the expense reductions have been pretty crazy. I mean, within one year you’re looking at over a 12% reduction in expenses. And you know where it primarily was? Sales and marketing.

A little bit of a cost of revenue decrease. I’d love to figure out how they did that, but a big reduction in sales and marketing as a percentage of revenue. In fact, 37% going down to 30%. And that’s not just riding the revenue curve down. It is real declines in whole dollars, which are all the more notable when your revenues are going up. And the revenues did go up right around 10%. The company has a pretty bright future. I’m feeling pretty good about not only the story they’re telling in generative AI, but their capabilities in generative AI. I think their biggest challenge going forward is pulling all their properties together. It’s their overall platform. It’s combining Sales, Service, Slack, Marketing Cloud and Data Cloud together to become a holistic platform as opposed to best-of-breed capabilities. And I hope they hit that at Dreamforce in addition to generative AI.

Daniel Newman: There you go. So I’m freezing up a little bit here. So can you just confirm you can hear me?

Patrick Moorhead: Yes, I can. I can see you too.

Daniel Newman: All right, cool. I don’t know what’s going on, but my laptop’s being funky.

Patrick Moorhead: Lisa playing Xbox upstairs?

Daniel Newman: Sometimes it is, yes. We’re streaming on at least 10 concurrent networks. All right, there it comes back. By the way, that still shouldn’t be a problem. So get after it, Spectrum, or whoever it is whose internet I’m using. Pat, this quarter for me was, one, very indicative of the operating improvements that have been made over the past several quarters. The company made big cuts to expenses.

Now this is always very tough as an analyst, because as an analyst you want companies to be healthy, growing, adding staff, adding people, building careers, helping move the economy forward. But as analysts, we also want companies to run intelligently, hire, right-size their businesses, consistently invest in the right areas, and not get overweight. And Salesforce is a company that I think got a little overweight during the peak growth. And I can’t blame them entirely, because sometimes it’s very hard when the growth is coming so fast; you do need to hire ahead of the growth.

And to be clear, let’s remember even when its earnings were down, it was still making a lot of money. So it wasn’t like having those headcounts was making it not make money, but Wall Street, they’re particularly brutal when it comes to performance. And they don’t just want companies that make money, they want companies that make a lot of money. And Salesforce has always been one of those companies.

It’s also a company that’s sort of found maybe growth maturity, meaning remember, Pat, when it used to be growing 30% and 35% and 25% and it was going to double every four or five years? And now after making a lot of big acquisitions, spending some big money to buy Slack, to buy MuleSoft, to buy other companies, Tableau, it’s growing at low double digits, which by the way is not that different than IBM growth. So you do have to kind of call that out. And so this company is at this inflection where it, A, has to be a little bit more of a mature company in the sense of being focused on growth, but it also has to focus on being profitable. And so this is an inflection for the company.

Now, what I am really excited about for the company is its Data Cloud and its AI growth, and you mentioned these things, so I won’t spend too much time on them, but the company’s AI day was very, very compelling. And the fact that it’s got an approach for security and privacy and AI for its sales and marketing solutions. And it had pricing where you could start to see the incremental impact; different equities folks are looking at it being anywhere from a few percent to double-digit percent growth on a per-user basis. This could be a very exciting incremental growth opportunity for the company. I’m excited about that. I’m watching that. We’re paying attention to that. Longer term, I think that’s going to be something that is very important.

Pat, we’ve got Dreamforce in two weeks, so I’m very excited to hear about what’s going to happen. I think Benny’s got a Time 100 AI award or something. By the way, he’s part of the Time 100 most influential–

Patrick Moorhead: Doesn’t he own Time?

Daniel Newman: I don’t know. Does that matter?

Patrick Moorhead: Well, kind of.

Daniel Newman: Don’t you own AR Insights? I’m just kidding. I’m just kidding.

Patrick Moorhead: I wish. Some people speculate.

Daniel Newman: Anyways, but you only own the top spot. But anyways, and that’s it. So it was a strong but kind of very business as usual quarter. I’m looking for more on AI. Looking forward to hearing more on AI. This is a company that, it’s not AI washing. It’s real. When your applications enable you to understand customer behavior, sell more, sell more profitably, get more activity and productivity out of your teams, that’s the real crux of the opportunity as I see it short term. And I think they’re heading that direction. Now I just want to see it turn to dollars on the financial statements. Pat, I got nothing more for you.

Patrick Moorhead: Nothing more. Well, what’s on the calendars? You talked a little bit about Dreamforce. You and I are both going to be out at that. By the way, they definitely cleaned up at least the four city blocks that we were part of down there. I didn’t feel like my life was in danger ever. So I hope that continues. But yeah, I got-

Daniel Newman: Did not step in poop. No poop.

Patrick Moorhead: Oh, that’s good. That’s good.

Daniel Newman: Sadly, that’s what counts as winning now. There isn’t a lot to do and everything closes early though, which is kind of weird. Now, if people know us, it’s not that weird because I go to bed at like 8:00 PM, but it was weird that I go to bed at 8:00 PM and most things were already closed. The bar in the hotel closes at like 9:30 PM, 10:00 PM. I mean, the city’s just not the city right now.

Patrick Moorhead: I know when you’re me and I go to bed at 7:00 PM, it’s perfect. As long as they have room service in the hotel, everything works well. So yeah, Dreamforce, then Six Five is going to be at Intel Innovation in San Jose. And then we’re flipping over to visit Oracle, and I may or may not be attending a Microsoft event in New York, so we’ll see.

Daniel Newman: Yeah, well I think actually I’m heading to IAA Mobility, an automotive and mobility show in Germany, next week.

Patrick Moorhead: Nice.

Daniel Newman: I’ll be doing the long haul and then I will be with you at Dreamforce. And then, yeah, we’ve got a bunch of stuff. At the end of the month I think we’re heading to Confluent. And then we’ve got what? OpenText. There’s a whole bunch of things, both the Six Five and the Futurum Moor Insights roadshow carries on, so.

Patrick Moorhead: I know. It’s fun stuff. Yeah, I’m actually getting used to this travel stuff. I just need to go to bed at seven every night on the West Coast and it works out swimmingly well. Well hey, we really appreciate all of you tuning in. And again, give me all the compliments and give Dan all the complaints. We will integrate them and probably not listen to anything, but we do love you. Hit that subscribe button. I’m just joking. Come on. Will you laugh, Dan? Or are you beyond humor at this point?

Daniel Newman: I’m about ready to gouge my eyes out.

Patrick Moorhead: All right, thanks again everybody.

Daniel Newman: Listen, I like funny people, you’re just not funny.

Patrick Moorhead: Okay…

Daniel Newman: You’re nice. I like you.

Patrick Moorhead: Thank you. He’s nice. Yeah. Thanks buddy. Take care everybody.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, his most recent book is “Human/Machine.” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
