On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:
- Google Complains to EU About Microsoft
- HP Imagine 2024
- Micron Q4FY24 Earnings
- GSA US Executive Forum
- NetApp Insight 2024
- Intel Lunar Lake Benchmarks
For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.
Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.
Transcript:
Patrick Moorhead: The Six Five is back, and we are live. It’s Friday morning at our typical scheduled time. Dan and I got back pretty late last night. I think I got back around 11:30, in bed by midnight or 12:15. I don’t know, but my Oura ring told me I got five hours of sleep. But you know what? Got up at 6:00 and did a weight workout at 7:00. Tried a pre-workout that was just tingly, tingly, tingly on my skin. I don’t know if I’m going to do that again though, Dan. Dan, good morning. How are you, bestie?
Daniel Newman: Hey man, look, you just got to adapt to it. You got to get that high beta alanine, itchy forehead, you start looking like you’re a tweaker in the gym, but you’re just pulling all kinds of crazy weights. I mean, that’s all part of getting big. You’re getting jacked now. You went from a little bit puffy to-
Patrick Moorhead: Obese.
Daniel Newman: … really thin.
Patrick Moorhead: I was a fat pig.
Daniel Newman: I didn’t want to say it, but you’re allowed to say it. It’s like your own baby, but now you’re lean and you’re getting mean, so I’m proud of you. You got up, worked out all these west coast days, feeling like a loser because you slept till 5:30 but it’s already noon on the east coast. But yeah, doing good, Pat. It was a good week, it was a really good week. We got to meet with some really great people, learned a lot, spent a lot of time on the coast and even made a little jaunt to Vegas.
Patrick Moorhead: Yeah, it was crazy, right? First part of the week was HP Imagine, we’re going to be talking about that. Middle of the week, various advisory meetings, and then wrapping up the week with the Global Semiconductor Alliance conference. But we got a great show for you today. We are super psyched. We’re talking about Google complaining to the EU about Microsoft’s behavior. Microsoft typically isn’t in a lot of these antitrust conversations, but here we are. HP Imagine 2024, HP has two to three signature events a year, and this one was all about innovation. The Six Five was there, hopefully you can check out all of our content there. Micron came out with earnings. Is HBM and AI doing well for them? We’ll see. GSA US Executive Forum, Dan and I were both present. I’m going to break that down. What did we learn? Who did we talk to? Who did we take selfies with? We do a lot of selfie analysis here on The Six Five, so I’ve heard, but.
Daniel Newman: Is there any other type of analysis?
Patrick Moorhead: I don’t think so. Just selfies with CEOs, that’s the extent of the insights. So anyways, NetApp had their signature event, NetApp INSIGHT 2024. Are they getting into the data game like VAST? We will see.
Daniel Newman: Six Five was there.
Patrick Moorhead: Six Five was there, and the first Intel Lunar Lake benchmarks are out, that came out on the 24th. Intel wrote some big checks in Berlin and made some promises. Do they deliver? We’ll see. Let’s jump right in here, bestie-potamus. So Google complained to the EU about Microsoft’s behavior in a couple different areas. We don’t see Microsoft in the news related to antitrust a lot, do we? Compared to Google and Amazon and some of the other Mag Seven.
Daniel Newman: Yeah, that’s an interesting one, Pat. Is that a pot-kettle thing, or what do we get going on there? It’s kind of one of those things that really, it’s a tightrope, right? Just because you’re being accused of certain monopolistic behavior, does that prevent you from accusing others of other types of monopolistic behavior? Should it?
Patrick Moorhead: It’s like the Spider-Man meme, one of my favorites, where they’re all pointing at each other.
Daniel Newman: Yes.
Patrick Moorhead: No, you’re a monopolist. No, you’re a monopolist.
Daniel Newman: We’re all monopolists, but so in Europe it’s an interesting venue to decide to kind of go down the path of pursuing antitrust because, what do we know about Europe? They love regulatory-
Patrick Moorhead: Love it, man.
Daniel Newman: … antitrust, competitive. They love to take down competitive people that abuse competitive powers. Google basically made a complaint this week on Wednesday saying Microsoft uses unfair licensing contracts in its Azure cloud computing business to stifle competition. Now, does this take you back in history? Is this the first time that a company named Microsoft has been accused of bundling software or using its packaging of different software products into licensing to gain advantages? We’ve seen it for almost what, two and a half decades now, this has been a thing? But at the same time, Pat, Microsoft has done a remarkably good job over the last three or four years of avoiding the regulatory spotlight as its peers in the Magnificent Seven have all been under duress.
Basically, the long story short here, Pat, is Google is saying that Microsoft is making it really hard for customers to move workloads to competitors’ clouds. They talk about licensing fees as much as five times higher for companies that want to run their Microsoft workload in a Google environment. And this is something that has been settled. There’s been settlements behind the scenes for somewhat similar behaviors. Of course, it’s all redacted, we don’t have the details of this settlement, but Microsoft is supposed to be making changes based on that July settlement to reduce these sort of behaviors. Pat, I got to be candid here. It’s really hard to break down. I shared a longer sort of Tweet here. The nuance of antitrust is so substantial and it gets to be really difficult to talk about. But one of the things I sort of broke down, and this comes up every time you and I have to go over this, and I use Apple as an example. But we have these two sort of conflicted efforts with antitrust that weren’t historically the case when you go back to like a Ma Bell. When you go back to the Ma Bell era, it was both monopolistic in the way that they stifled competition, and it was punitive to consumers because consumers had no choice and they had to pay more because of the monopoly that these companies had. Microsoft isn’t.
Microsoft wants to keep people on Azure’s cloud, and so they use their diversification of product services and licensing in their bundle to make it more appealing. They also believe in that kind of stacking, so you stack hardware, software, platforms, services together to give people a better experience. And so when you optimize, say, workloads for your particular hardware, they run better. At least that’s the idea of it. And so you have this kind of push and pull, Pat, and it goes on because Microsoft could kind of look at it like, we optimize everything to be the best and of course it’s cheaper because it’s all running in a stack that’s being scaled, optimized for great customer experience. But at the same time, that’s a very fine line from where you start to bundle.
But all of a sudden the question is, well, if the consumer is getting a good experience but at the same time you’re making it hard for people to compete, are you going to make the experience worse in the name of making competition better? And that’s the question mark that I have. And I’m not saying that running on Google is worse, I’m just saying that that’s kind of this fine argument that I’ve had for a long time. And the easy thing for everyone out there, Pat, is like in the app store. It’s like, nobody wants to sideload an app. It’s not convenient to have to go to an outside site and go get Spotify and figure out how to get it on your device and utilize it as an application. You want to run it through the app store. So this is the simple thing. Having said that, because Apple knows that, they charge a fortune to every app developer to be able to have that convenience. And then it comes back to marketplaces, convenience, innovation. Who pays for companies that have great ideas and build great moats?
So this is kind of like I said, a little bit of a push and pull, and it’s between two companies that both know all too well how to use their power dynamics and their strong market positions to stifle and keep other companies out of their markets. Great capitalism, monopolistic behavior. It’s a very, very blurry line, Pat, between those two things. But it’s an interesting case in a venue that is most likely to have some sort of profound decision, because Europe doesn’t like these kinds of behaviors.
Patrick Moorhead: Yeah, I mean, Europe is a huge innovator in regulation.
Daniel Newman: What? I was like, where are you going with this?
Patrick Moorhead: I was waiting to bring that one out. No, it’s funny. History matters here, particularly in legal because it’s based on precedents, but you do have kind of the trends, administrations like Lina Khan and what happens over there in the EU. Historically, by the way, Microsoft, there was a verdict that came in in 2000 that they were to be broken up as a company. And guess what it was for? It was essentially giving away Internet Explorer for free with Windows, and they were ordered to split the company up. So there is a lot of precedent here. And you might also recall Teams, I’ve forgotten if Slack is still suing Microsoft for basically giving away Teams with an E5 or E3 license, or if that’s antitrust, but very similar. Now, you just can’t give stuff away, you cannot give stuff away at a loss. Okay? That’s one of the keys here. That’s super hard to prove, right? Which is, and by the way, Microsoft isn’t giving these licenses away for free when it’s on Azure, but they’re likely putting in pricing to incent their customers. I have heard challenges from all of the hyperscalers about Microsoft, particularly on Windows licenses and how much it costs. But yeah, this is a tricky one. This one is definitely gray to me.
On one side it’s VPA, volume purchase agreements. The more you buy, the more you save. I’m channeling my Jensen Huang here, versus giving stuff away for free, predatory. Meaning drive people out of the market and make it so difficult for them to compete. So, we’re going to keep our eyes on this. This was what Microsoft was broken up for in 2000. Of course they appealed, they won on appeal in 2001, but my gosh, that set back Microsoft, I don’t know, 15 years, in terms of how they operated. Okay, let’s move into the next topic here. Dan, you and I and The Six Five attended HP Imagine 2024. We kicked off the week actually at the HP garage, which is the birthplace of Silicon Valley. That’s not something I made up, it’s actually on the plaque in front of the HP founders’ home and their lab. And we actually did recordings with the leadership team and the extended leadership team inside of Bill and Dave’s garage. So, it was really good.
So thematically, some of the stuff that stood out for me, first and foremost, they’re all-in on work. HP does a lot of consumer and a lot of work, but they’re doubling down on work and that was Alex Cho’s statement. A couple things there. So first and foremost, there are more profits, profit dollars. So from a business perspective, this makes sense. And then when I look at the portfolio, including things like Polycom and commercial-grade printers, this strategically makes sense to me. And then you add onto that the services that go along with that, that customers are willing to pay for. I get it, and they want to drive share. Right? So on AI, of course we talked about AI and moving that forward, and this was in the context of the PC, workstations, printers and more. A couple standouts for me. HP really leaned into an AMD commercial product, calling it the highest performance AI product that they have with 50 NPU TOPS, and they’re swinging for the fences with AMD. HP has a lot of Intel and they also announced a Lunar Lake consumer platform, but there just wasn’t a whole lot of discussion about that. They were all-in on commercial, and this was AMD. We’ll have to see how AMD sells. AMD can sell in small business, but I don’t feel like it sells even close to Intel.
On the workstation side, what I really appreciate is what the company is doing there. It is funny, we throw around this word solution, but what they’ve done is they’ve put together this end-to-end workflow for data scientists and developers, front end and back end developers, and it’s pretty cool. It even interacts with Jupyter files, and think of all of the LLMs and SLMs. Not that they’re training 400 billion parameter LLMs, but when it comes to customization, when it comes to data management, you can very much do that on-premises. Right? When you see a developer cranking out programs and manipulating data, these are the types of platforms for that.
One of the biggest challenges right now is getting enough GPU performance. And they introduced a new feature called Boost, and Boost is essentially the ability for anybody in the work group to be able to share a GPU, even though they might not have it on their system. I tried to get underneath the technology to figure out what’s actually making this work. Like, is it virtualization like we see in the data center? Is it something different? Do you have to pin? Does the other user take 100% of the other workstation? I don’t know these things, but I’m going to give HP the benefit of the doubt right now that this thing works and it works well. So, pretty cool. Dan, you and I talk about GPU sharing in the data center at enterprise and hyperscaler, that’s not easy, by the way. Right? I mean, new technologies to figure out how to virtualize it, because it was typically on or off, 100% or not. And then let me talk about print. We don’t talk a lot about print on the show. Print is very much a large portion of HP’s profit dollars and they brought out a feature. It’s so funny, I don’t ever want to use social media play and interaction or views as the only way to gauge interest, but I had a lot of people commenting on print. Well, what could HP have done to generate this interest?
So I don’t know, for as long as I’ve printed, LaserJet or inkjet, going back beyond dot matrix, you try to print a one-page spreadsheet and you get 17 sheets, and row N is the final 10 pages. Or you’re just trying to print a web page, maybe a map to give to somebody, or a recipe, or a school project off a website, and you end up getting the footer and the header and the right rail, and they’re all separate pages and it’s ugly. Well, HP Print AI essentially takes a snapshot of what you think you want to print and it asks you, “Are you sure you want to do this? Are you sure you want to do this?” Also, on HTML pages, you can parse out ads, which is great stuff. Finally, just to make a long story longer, Dave Shull got up there and talked about solutions and services, and this is more, think managed services, where you let them do the driving. It does include service, but managed services is another big revenue opportunity. A couple things: below-the-BIOS types of services for remediation, online remediation, and finally, managed room solutions. So think of Polycom coming in, and instead of the hit rate being 50% when you walk into a conference room, imagine that being more like 75% or 90%. I’m making up these numbers, we haven’t done the research or done an economic analysis behind this, but that’s the thesis of what they’re doing.
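To put a little technical color on the web-page example: what Pat is describing, stripping the headers, footers, right rail, and ads out of a page before it goes to the printer, can be sketched in a few lines of Python with BeautifulSoup. This is a purely illustrative toy, not HP Print AI’s actual implementation, and the list of elements to drop is an assumption about where page chrome typically lives.

```python
# Toy sketch of "print-friendly" clean-up -- NOT HP Print AI's implementation.
# Assumes page chrome and ads live in common semantic containers.
from bs4 import BeautifulSoup

NOISE_TAGS = ["nav", "header", "footer", "aside", "script", "iframe"]

def make_print_friendly(html: str) -> str:
    """Return HTML with navigation, headers, footers, and ad frames removed."""
    soup = BeautifulSoup(html, "html.parser")
    for selector in NOISE_TAGS:
        for tag in soup.select(selector):
            tag.decompose()  # drop the element and everything inside it
    return str(soup)

if __name__ == "__main__":
    page = ("<html><body><nav>menu</nav>"
            "<article>The recipe you actually want</article>"
            "<footer>copyright, links, right rail</footer></body></html>")
    # Only the <article> content survives, so you print one page, not 17.
    print(make_print_friendly(page))
```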
Oh, you’re muted, bestie.
Daniel Newman: Someone’s car alarm’s going off so I had to go do a little triage real quick.
Patrick Moorhead: Bee-oh.
Daniel Newman: Closed the door, so I hope you saw my little note. But yeah, you covered a lot of ground there and it was a really interesting event, Pat. And we did have a lot of fun, we hung out in the HP garage, oscillators and vacuum tubes. It was a wooden shed, it was probably the hottest day in the valley in a long time so we were glistening. If you see Pat, little beads of sweat on his head as we talk to these executives, just know it was like 110 degrees inside that shed. But what was even hotter was the announcements, Pat. You and I, we’ve gone back and forth a bit about kind of how this next wave of AI devices plays out. I think we’re starting to see in the real world how companies are adopting.
We did hear a fairly prescriptive comment from Enrique Lores, CEO, who we sat down with on the pod this week and talked to about commercial leading the way. So that was one of the interesting things here. We’ve been through about two decades of what I call consumerization, where consumer technology has really led commercial, but we are right now seeing the first iteration where commercial leads. And why would that be, Pat? Well, it’s efficiency, productivity. So companies are excited about this. 18-hour battery lives, more sustainable devices. Companies are excited about this. The potential for various agents and co-pilots to use an NPU to enable people to do private, secure work on their device. I think enterprises and companies are excited about this. I think consumers are sort of still looking for that killer app to some extent, that thing that’s really going to drive it. And battery life of course drives, but as you’ve seen generation to generation with iPhones, new battery life and a slightly improved camera can only create so much enthusiasm. So people are really waiting for what those next things are.
On the productivity and business side though, Pat, there’s some really great built-in capabilities in these new HP devices. I like some of the things they’re doing with AI production. For podcasters, video creators, being able to do with one person what feels like a three-person crew, webinar, webcast, video, stream, very, very powerful. The AI Director, you’ve got what AI can do for video conferencing to be able to basically take four or five people in one room together, but give them the same sort of screen equity. Unlike in the past, Pat, where it’s like one executive and five people in a room, and then you get each person, you’re trying to figure out who’s talking. Depth of the room, you can’t really read people’s faces and emotions, which was cool for 2005 video conferencing. But in 2024 or 2025 video conferencing, we want each person to be represented. We want to be able to see each person. And, Pat, I don’t know about you, but when I’m on a video, I want to see their face. I want to see if they’re sweating when I’m talking to them. I want to know that I’m hitting the right buttons. That’s how I get the reaction. That’s how we close the business. So that’s kind of where my head was at on a couple of things.
I do really like, and I don’t need to spend a lot of time doubling down because you covered this well, but look, sometimes, Pat, it’s about basics. And the print thing, the AI print thing to me was like, I don’t print a lot. I’m not going to sit here and be like, I’m an everyday printer, but I swear every time I print something it’s screwed up. It’s like, oh, I wanted to print a webpage really quickly because I wanted to look at something or read something, and it’s like 19 pages for one page. I wanted to just print a homepage image to look at something, or a spreadsheet. Gosh, for goodness’ sake, I want to actually sit down and maybe make some scribbles and notes on it while I’m working. You always get that weird column Q that’s on the next page and it ends up printing 11 pages for that one column. Fixing that, Pat, is meaningful. By the way, it actually is useful from a utility and cost perspective, ink and everything else, it really adds up.
So some of these things you can call iterative, some of these things you could call breakthrough, but I thought it was a really encouraging event. The numbers will tell the rest of the story from here. And so how this innovation finds its way into business and ultimately into consumers, is where we’ll watch. Oh, and I like the GPU sharing. I think people are going to need to understand the application a bit better, but I like what they’re doing there. I mean, this is all about utilizing horsepower. You have horsepower, use horsepower. So HP in this case, with that particular service, Pat, stands for horsepower. More AI horsepower.
Patrick Moorhead: No-
Daniel Newman: That’s it for me, that’s all I got.
Patrick Moorhead: Yeah, so you can imagine somebody on a low power notebook that doesn’t have a discrete GPU who is going to tap into the GPU on the big workstation. That’s kind of my expectation of what they’re going to do. It’s kind of very similar to, I mean, a client-server relationship. And instead of it being a server in the data center or a server in the cloud, it’s a giant HP Z workstation. So let’s move on here. Micron, big-time provider of memory and storage, stock popped based on what they’re doing in AI accelerators and GPU memory. But also, as we’ve seen with this accordion memory market, they have pricing power back in the industry. Dan, how’d they do? Any surprises here?
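Before the Micron discussion, Pat’s client-server framing of Boost can be sketched conceptually: a thin client without a discrete GPU ships a job over the network to the big workstation and gets the result back. The toy below uses Python’s standard XML-RPC library purely as an illustration; it is not HP’s actual Boost mechanism, and the function name, port, and offload flow are assumptions made for the sake of the sketch.

```python
# Conceptual sketch of workgroup GPU sharing as client-server offload.
# NOT HP Boost's implementation -- just the mental model described above.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def heavy_compute(size: int) -> float:
    """Stand-in for a GPU kernel; a real system would dispatch to the GPU here."""
    total = 0.0
    for i in range(size):
        for j in range(size):
            total += (i * j) % 7
    return total

def start_workstation(port: int = 8765) -> SimpleXMLRPCServer:
    """The big Z-workstation side: accept jobs and run them on local hardware."""
    server = SimpleXMLRPCServer(("127.0.0.1", port), allow_none=True, logRequests=False)
    server.register_function(heavy_compute, "heavy_compute")
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_workstation()
    # The thin-client side: no local GPU, so offload the job over the network.
    workstation = ServerProxy("http://127.0.0.1:8765", allow_none=True)
    print("offloaded result:", workstation.heavy_compute(200))
    server.shutdown()
```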
Daniel Newman: I mean, I think people were positive to the upside coming out, because we’re kind of at the tail or the beginning of the next wave, however you want to look at it, but this is really the tail end. Micron is one of the sort of leading indicators of AI demand. And remember, it’s been a bit, to your point, like an accordion, where the stock went way up to about 150 and then it dropped almost 50% when others… And so the memory space is a very boom-bust space, and despite the fact that HBM3 has a ton of demand right now, we’ve heard it’s sold out through ’25. We’re hearing they’re trying to add capacity. Of course, alongside CoWoS and different TSMC processes, there’s so little capacity to make more, but it’s going to be used up all the way to the brim.
I actually watched the interview on Squawk on the Street with the CEO Sanjay Mehrotra, and it was interesting to hear his sort of takeaway on it. I mean, the company did well in this quarter. They pushed guidance up for the next quarter. It’s not like some staggering amount, I think they beat very narrowly this quarter, but they did say that they believe that the growth of this space is substantial and it’s not a long horizon. Meaning, this isn’t like… One of the questions they asked Sanjay was, well, is it a year? Is it two? Is this going to max out? And he was pretty adamant that that’s not the case, that this is a several year tailwind for the company.
Beyond the HBM thing though, Pat, he was also, we just got done talking AI PC. Well, Micron has a strong affiliation with this demand cycle. So I do believe there’s an increased demand cycle, I call it an elongated cycle. We’ve talked about a super cycle, but I think we believe there’s an elongated cycle that’s going to be created for new devices that have these NPUs and these AI capabilities. Micron provides content for the phones, they provide content for the PCs. They’ve got some exposure to connected devices, IoT, smart glasses. Pat, we’re not going to talk about Meta on the show, but we know that Meta had a pretty interesting week and announced something very cool with a new unused name for innovative technology called Orion. Nobody’s used that name before for anything. And then they also have exposure in vehicle and automotive.
So they’re diversified into AI and the scale of this AI demand wave, but I’m pretty certain that most of the enthusiasm coming out the gate here is HBM3 driven. This is basically the first indicator going into the next cycle of earnings, which is going to start in just two weeks, Pat, with IBM and Intel and others coming down the barrel, that AI is still hot. That the sort of bearish sentiment that comes from the corners of FinTwit and the perma-bears on Wall Street that are saying AI is a fad and it’s going to… Pat, you and I, when we go on and talk about the conversations we had at GSA next, I think we can be pretty confident, having talked to foundry, talked to packaging, talked to material companies, talked to design. There’s no bearishness in the semiconductor space about AI. And so Micron is a beneficiary.
But again, this is a company that went to 150, went back under 100, and now it got a 17 or so percent pop on the news, but it’s still nowhere near its highs right now. So I think it’s still an interesting play, and the company’s got some leading technology here in the US. They are an actual manufacturer with their own fabs, a CHIPS Act beneficiary, and they seem to be coming out of what had been a very, very tough period of time into a number of tailwinds that should actually prop the company up long-term because, Pat, also, these one-year cycles. These one-year cycles on data center open up the door.
Patrick Moorhead: Great stuff. I mean, I have nothing good to add other than, Micron reflects the market. Where it’s hot is AI, and that’s data center. When it comes to smartphones and PCs, because they pulled back on CapEx like everybody else did, they have more pricing power than they had before. But we haven’t seen a boost in PCs, we haven’t seen a boost in the smartphone market, hence they haven’t seen a meaningful volume increase, but they have seen price increases and price increases are good. Hey, let’s move to the next topic, and that is the Global Semiconductor Alliance Executive Forum that both you and I attended. We both attended the dinner. I did a presentation for their board of directors, which by the way, had everybody from the Mag Seven there, which was pretty cool. And then I did an innovation panel on stage. So the conference, it really reflects semiconductors, right? Discussions on AI, manufacturing challenges, how do you play China? And then, challenges in some of the key elements of AI: compute, storage and connectivity. And the conversations are very similar to what we talk about. Even with the board, I can’t give exact details, but essentially, we were talking through the downstream benefits of AI.
One thing I did notice though is, Dan, as you and I research not just semiconductors but the markets they go into and the use cases that drive them in the cloud and on the PC, I thought our conversations and my conversations with the board of directors, it was enlightening to them. They don’t spend a whole lot of time, it appears, on the downstream application. So we had a very lively conversation on that. A lot of conversations about GPUs and accelerators, and what does that mean? And sustainability and power. Not sustainability, power, like power generation, what’s going to happen? And also maybe, disintermediation of smartphones in five to 10 years, given generative AI devices and these goggles.
Just to wrap this up, some of the discussions with Astera Labs, SambaNova and Celestial AI on my panel. No surprises, but three to five years of innovation, model size and complexity, still going up. Cluster size, moving from 100,000 nodes to two million, and across data centers. The shift from mega-training to inference, going from 90-10 to 10-90. And specialization, right? Like ASICs and XPUs versus GPUs. And then following up on the conversations, obviously SambaNova is biased. Both Celestial AI and Astera Labs, they support both, right? But these mesh and networking companies are both working to lower power, increase performance, and lower cost, and that’s the game here in semiconductors. It is interesting, both Astera Labs and Celestial AI very much have clients out there that both you and I would know, they’re integrated into their solutions. I did get a tour of Astera Labs, saw public stuff. I saw basically an H-series rack with their chips all over it, as well as an MI300 platform, and I’ll just leave it at that.
Daniel Newman: Yeah, I didn’t have quite as much time as you did there, Pat, but one thing is I wish we could actually tell the people some of the great conversations we had. Sorry, you’re going to have to get into the room and it’s a tough room to get into. And so by the way, thanks, bestie, for making that happen. I’m eating humble pie, I’ll admit it. The overall sentiment though, Pat, you know chips are a boom and bust industry, but the overall sentiment is extraordinarily bullish, and you know there have been times when it hasn’t been. But nobody seems to feel that this is coming to an end anytime soon. Now, that could be misconstrued as sometimes the exuberance of a top, right? That sometimes when nobody sees it coming is exactly when it’s coming. But I think most agree that it’s not because they don’t see it coming from some place of exuberance. They don’t see it coming because they believe it’s still really early, that we’re really just getting started. We’re just getting up to speed on capacity, on capabilities of new packaging and designs and the amount of innovation that’s coming up.
But there are a lot of downstream challenges, that was the one thing I will leave with. In some of the conversations I had, talking to a lot of the people on the design and delivery side, there are some who are kind of waiting and waiting for that utilization rate to go up, because everyone believes the build out is very, very substantial. But at some point on one-year cycles, how do the cloud providers make money? How do the ISVs make money? But hey, where there’s great software, Pat, behind it, there should be great hardware. So I also see a big opportunity. One of the things I’m predicting now, with the simplicity that’s been put into being able to design and build your own chips, is: are more ISVs going to do it? Are we going to get Uber chips for Uber software? Of course we’ve seen it with Meta and Alibaba and Google, and they’ve sort of had the ability to do it in a dual capacity, but do companies like Salesforce with Agentforce, are they going to build a specialized chip down the line that’s going to make them more efficient, going to give them more control? And of course, the ability to make sure their solution is absolutely optimized for their use case. It certainly would be more price efficient than if they’re using tons and tons of GPUs. So something to think about down the line, Pat, but very exciting times. I know we’ve got to keep moving, so I’m going to stop talking.
Patrick Moorhead: No, no, this is great stuff. So Dan, The Six Five was at NetApp Insight ’24 in Vegas. You met with senior executives, CEO. What did they announce? What did they announce?
Daniel Newman: Yeah, there’s a lot going on there, and our team wrote some great pieces on the in-depth announcements. We’ve got some real storage nerds. But maybe most importantly, we want to note that the evolution at NetApp is all about the pivot, right? It’s all about the pivot from being in storage to being one of these kind of new data companies, or as they like to call it, IDI, intelligent data infrastructure. They refuse to do the AI washing. Gabie Boko and George Kurian, who I spent time with, will tell you this. But intelligent data infrastructure is about, are we starting to see an era of limiting the sort of stack for accessing data from the application layer?
And NetApp is a first party inside of all the public cloud providers, so their file system is used first party, basically like an ISV or OEM relationship, where people don’t know that when they’re using AWS or Google for a certain file, they’re using NetApp. But the other thing about it is, as we’re building out vector, file, block, and object storage and we want to be able to essentially access all the storage across our data estate, in the long run, why do we need these middle layers of abstraction for data management? We heard about data lakes, data warehouses, data streams, data swamps. I’m joking, those don’t exist, I don’t think. But at some point we’re seeing VAST Data do this, we’re seeing NetApp do this, we’re seeing Pure Storage do this. And we’re seeing other companies like Weka entering the stage of saying, “Hey, can we access that data in some meaningful way through using metadata and using the different storage archetypes to allow the application to directly call the storage and file system to basically do AI?” And that’s the interesting part of where the company’s heading.
Another thing that they’re very focused on that I think is interesting is the combination of being able to apply logic and the speed of being able to update data sets. Now, if you understand the way storage works, and this is a little bit nerdy, but essentially you have to continually take these kind of snapshots of the entire storage ecosystem. And each time data is updated, you have to move the data to make it accessible again to the applications. In most cases, it’s done in its entirety. So every time you have to move the data, it’s a massive project. It can be terabytes and terabytes of data, sometimes hundreds of terabytes, sometimes more, that has to be updated continuously so applications have the newest, and you remove all the duplication. So NetApp’s basically built a system that can continually update in an agile fashion, where you’re only updating, very quickly, the data that’s new or has changed, as opposed to having to move all the data. So that’s another thing that they’re doing that’s pretty interesting.
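To illustrate the delta-update idea Dan is describing, here is a minimal, hypothetical sketch contrasting a full copy with shipping only the blocks that changed. It is a toy model, not NetApp’s actual snapshot mechanism; the block-map representation and function names are assumptions made purely for illustration.

```python
# Toy model of incremental (delta) updates vs. copying an entire data set.
# Illustrative only -- not NetApp's actual snapshot implementation.
from typing import Dict

Block = bytes
Snapshot = Dict[int, Block]   # block number -> block contents

def full_copy(source: Snapshot) -> Snapshot:
    """Naive approach: move every block every time anything changes."""
    return dict(source)

def delta_update(previous: Snapshot, current: Snapshot) -> Snapshot:
    """Ship only the blocks that are new or changed since the last snapshot."""
    return {blk: data for blk, data in current.items()
            if previous.get(blk) != data}

if __name__ == "__main__":
    old = {0: b"jan sales", 1: b"feb sales", 2: b"mar sales"}
    new = {0: b"jan sales", 1: b"feb sales (restated)", 2: b"mar sales", 3: b"apr sales"}
    print("full copy moves", len(full_copy(new)), "blocks")             # 4 blocks
    print("delta update moves", len(delta_update(old, new)), "blocks")  # 2 blocks
```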
But overall, Pat, I think what we’re seeing now is just a new era. We’re seeing a new era of companies that are software-driven, enabling storage for an AI world where I think we’re going to start to see some of these traditional data management software competing with storage companies and creating a kind of a new layer. And Pat, I don’t know if this also opens up an M&A conversation. Does Cloudera pair up with somebody? Does Databricks pair up with somebody? Does storage… like, HCI kind of, that was a thing for a while. Are we seeing kind of the new era of converging infrastructure with software to make AI more available? Good event. It was great to spend some time with their leadership team and I did it all in about four hours.
Patrick Moorhead: That’s the way to do it sometimes. So, yeah, Dan. Yeah, NetApp was formed right around when I started my career in the early ’90s, they were formed in ’92. And it seems like every five to 10 years you have hot new storage companies coming out. NetApp didn’t create the first NAS, it was a company called Auspex, but what they did do is they radically simplified storage and made it easier for different servers to share storage. I think their first product was called the Filer, hopefully I’m not mistaken on this. But more importantly, NetApp was first with their public cloud interactions, I believe it’s called ONTAP. I’m hot on hybrid multi-cloud fabrics, but NetApp, even though they were selling a ton of arrays on-prem and in colo and in private cloud, gave you the ability to have a common pane of glass to be able to move data around on-prem and in the cloud.
NetApp wasn’t the first with what they announced, it was VAST Data. I have to give them credit here, a much smaller and nimbler company. But NetApp’s scale is far and wide. They’re a fricking huge company. So this natural compression of the stack between storage and the data makes sense. We see that in every industry when you’re trying to grow. And it’s not just growth for the sake of growth, there is a performance advantage to compressing the stack and integrating files, integrating data and metadata out there, particularly in the age of AI. And then if you can pull that data in and give the privileges, the security privileges, that actually makes your generative AI or even machine learning access a lot more secure. So, I’m hoping to learn more about what they brought out at Insight ’24. And Matt on my team is going to be doing an analysis of what he thinks. What’s pretty cool is just seeing a ton of innovation here with VAST Data, with NetApp, and Pure. The one company I’m not hearing a lot about is Dell. Where is Dell in this entire game? Again, to be determined.
Thanks, Dan. Oh, there we go. All right, let’s move to the next topic. And that is Intel Lunar Lake benchmarks, they are out. They came out, let me see, three days ago on the 24th. And essentially, Lunar Lake was a chip that has a big NPU, very similar to what Qualcomm brought out first and AMD brought out second. AMD argues they did come out first with a larger NPU, but there wasn’t a whole lot of software to be able to take advantage of that. But the claims that Intel made on this chip were, we’re going to have superior performance and superior battery life. And Intel has always had good performance, but when it comes to battery life, Intel and AMD didn’t do great.
Also, a little fun fact. The tiles that are used in Lunar Lake are not manufactured at Intel. They’re actually manufactured at TSMC. So, it’s on, right? You have AMD, Qualcomm and Intel solutions now battling it out on design, as opposed to foundry and fab technologies. So it’s literally mano a mano. One unique architectural thing that Intel adds is they put memory onto the package, and you can do some pretty interesting stuff related to latency and lower power. So how did they do? To me, it was mostly as I expected across single-threaded CPU performance, many-thread (MT) performance, graphics, battery life, and NPU. But I got to tell you, it raises a lot more questions than I have answers, because the benchmarks that I saw across four or five websites, and by the way, Signal65 has yet to publish our tests, but they are on the way. Single-threaded CPU performance, pretty good. Some tests had Qualcomm leading, PC World had Intel leading. Okay? On the same benchmarks.
Daniel Newman: There’s only one real test, right? It comes from Signal65. I mean, are there really other tests?
Patrick Moorhead: There’s really only one definitive test house and that’s Signal65, as we say in jest. No, PC World had a pretty good analysis. But we had both Tom’s Hardware and PC World saying different things, like who won, on the same benchmark. Okay? MT, many threads, I knew. I mean, you’re doing 12 cores on Qualcomm versus eight cores. And even if you have a single-threaded processor that operates a lot more quickly, there was just no way that Intel was going to win here. And many threads, Intel and AMD are dominant, okay? When you’re using this at the same time. Office productivity performance under Procyon, Intel gets the crown. One of the things that threw me was not only the benchmarks that went back and forth, even on the same test, but it was the plugged versus unplugged. I should have known this because AMD has had higher performance plugged in, you get a lot of boost out of that. But I thought with Lunar Lake starting over, that that would not be the case. Qualcomm has very consistent performance when it’s unplugged, and it appears that when you unplug Intel, the performance drops handily, particularly on the CPU.
Most confusing part was battery life. Tom’s Hardware had Qualcomm winning by 35%. Forbes, on Procyon video, had Intel leading by 12%. On Procyon productivity, Forbes had Qualcomm winning by 9%. And that Procyon productivity battery life was the one that Intel said they destroyed everybody on. So, here’s my thought. I was an OEM for almost 10 years on the PC front, and I worked at AMD for 11 years. And let me just tell you, this tells me that this product was moved out quickly. And if you remember, Intel pulled this product in from what was likely a January launch, in to hit the fourth quarter. There were only two platforms that were benchmark-able, one from ASUS and one from Dell. And if you compare that to the Qualcomm launch, I mean, it was Dumbo dropping multiple solutions from Lenovo, HP, Dell, and even Samsung.
There could be power profile issues. Right? That’s what I’ve been told, which is, hey, it matters how and when you benchmark something. But I think there’s something in the firmware, and there’s probably going to be an update that will give some answers. And I’m leaning on Ryan to be able to have his team put some of this stuff into perspective. My final comment here is this was the seven series, so there’s five, seven, and nine. There is a nine series coming out that is supposed to be higher performance. And I believe that Intel staged this, likely as, hey, we want to get the battery life, that’s the monkey on our back. We want to get that out first, and then we’ll follow through on performance with the nine series. And I don’t expect the nine series to have as good a battery life as the seven series. So that’s it, Dan.
Daniel Newman: So you covered a lot of ground and I know we’ve got to wrap this up, so I’ll just make a couple of maybe more macro comments. One is, Signal65 will be the arbiter, and we will put our thoughts out there. Look for Ryan Shrout on the Twitters or X, whatever you call it, to share some of that data as it comes out. We are looking at all of this stuff. And you’re right, there were some good reviews, so I’m not saying there aren’t others, I’m just saying we are going to do a very thorough job here. The second thing, Pat, is I think Intel got what Intel wanted, and I’m going to tell you what that is. It’s not obvious. It’s not obvious. I think there were some concerns that they were going to just get blown out across the board, and that would’ve been really bad for Intel. But with the combination of the design with the tiles done at TSMC, they’ve been able to build a part where even having one particular publication say their battery life is better than the ARM-based processor, even just creating a toss-up, is kind of a win for Intel. Look, I’m using a Qualcomm device every day, I’m really pleased with it. This is not a knock on what they’re doing. I’m using almost all Arm devices now. It’s just what’s happened to come across my desk recently.
Having said that, Pat, the Lunar Lake part seems to be very competitive. That’s what Intel needed here. They needed it to be very competitive. They needed it because remember, Intel’s the incumbent here. So I believe the competition has to be better beyond a benefit of the doubt, so much better, and that is what’s going to be required for a meaningful market shift. And so if Intel’s number one goal, given its difficult times, Pat, was protect the moat, that is your strongest business right now. These numbers do not give a clear read to the channels, to the commercial customers, to the retailers, that they have to go all in yet. I’m not saying the ARM stuff’s not really good, Pat. I’m just saying I think for Intel, this was about as good as they could have hoped for at this current juncture. A pretty good set of responses and not a very clear output of who’s better, because it depends. And so that’s my takeaway. You definitely dug into the depths, but I know we’ve got to roll.
Patrick Moorhead: Dude, come on. I’ll tell you when we’ve got to roll. I’m the guy who has to roll.
Daniel Newman: All right, all right. We don’t got to roll. Keep going then. No.
Patrick Moorhead: No, that was really, I’m glad you gave the macro, because the question is, how good does it need to be? Okay? And Intel is the incumbent. It’s a lot harder to switch people off Intel in enterprise because they’re very conservative, they don’t want to screw something up here. And if Intel gets close, price doesn’t even matter that much. And that’s why AMD has had a hard time getting traction. I hear that AMD is stepping up what they’re doing in the enterprise, related to marketing. I’ll believe it when I see it. Qualcomm is going to have to do this as well. But yeah, people were skeptical on what Lunar Lake would deliver, and I think they had a showing that was better than people expected.
Daniel Newman: Not losing was winning.
Patrick Moorhead: Yeah, the YouTubers were like, Intel destroyed Qualcomm and AMD, and I don’t know what they were smoking out there. Now, there’s two reviews that I’m looking forward to: Linus Tech Tips, and maybe Hardware Unboxed. But yeah, I mean it’s-
Daniel Newman: It’s competitive, which is great, right?
Patrick Moorhead: Intel is competitive, right? And then you take the money cannon. Now, Intel is probably losing money on Lunar Lake. Okay? I don’t know if it’s like 10% gross margin or what’s going on, but that is a very unprofitable product for them. It’s kind of weird. How can it be profitable for Qualcomm and AMD? Because Lisa Su doesn’t create money-losing products. And then, why is Lunar Lake so unprofitable? I’ve got to dive underneath that, and it could be complexity in the design, it could be the size of the dies, the tiles, or something like that. But theoretically, Intel gets healthy with Panther Lake, which is the follow-on on 18A. So wow, good show, bestie. Cranking out good topics here.
Daniel Newman: Here we go, here we go. And by the way, we didn’t even get to a few things that had happened this week, but.
Patrick Moorhead: Google had their at work stuff and we could have talked about-
Daniel Newman: I know, they had their big launch. A lot of excitement about Meta this week.
Patrick Moorhead: Gosh, Meta’s just… Man, Meta is just crushing it.
Daniel Newman: Meta’s just quietly the AI champion. And now I think we all said they had their iPhone moment with their new glasses. I mean, those are starting to look like the glasses I wear every day. And it just kind of makes Apple look stupid. I mean, the fact that they spent all those years to build that horrible device that nobody uses now. I joke, and it’s not really funny, but about the IDF blowing up all the, what are the headsets called? Vision Pros. No one was harmed.
Patrick Moorhead: Yeah.
Daniel Newman: You know the pager thing. So they did it on the… Anyway, funny. It was funnier to me.
Patrick Moorhead: No, no, I got you, I got you. Anyways, hey, thanks for tuning in. Thanks for being part of The Six Five community. I don’t even know what event we’re going to be at next, but wherever it is, it’s going to be good, at least in our minds. Appreciate you tuning in, and tell your friends, families, dogs, anybody about this show. Give us feedback, we really want feedback on where we’re good, where we suck, and send that to my address. Okay?
Daniel Newman: Don’t lie. We don’t want to hear. We don’t want to hear unless it’s good.
Patrick Moorhead: Yeah, we want to hear topics too. Anyways.
Daniel Newman: Yeah.
Patrick Moorhead: Everybody have a great weekend. Adios.
Author Information
Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.
From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.
A 7x best-selling author, including his most recent book, “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.
An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.