We are Live! Talking Microsoft, Micron, Cisco, Semis, AI, Cloud

On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:

  1. New York Times Sues Microsoft & OpenAI Over Copyright
  2. Micron Q1 Earnings
  3. Cisco Acquires Isovalent
  4. Watch List in Semis
  5. Watch List in AI
  6. 2024 Watch List in Cloud

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: It’s Friday morning and The Six Five is back with Dan and Pat. We’re so excited for you to be here, it is the best time of the week. And yeah, we were on a little bit of holiday this week, but came in to do the show. Dan, you’re looking good, you’re looking a little tan.

Daniel Newman: It’s the most–

Patrick Moorhead: How do you get a tan in the middle of the winter?

Daniel Newman: Wonderful time of the year. And now you know I can’t sing. Yeah, I don’t know, but Pat, I don’t know if you’ve been paying attention, but here in Texas it’s been 70 and sunny every day for the last month. So there’s a chance I might’ve gone somewhere else, you’ll have to check my Twitter account. I was in an undisclosed bunker, in an undisclosed location, and I went away for a few days. And I’m not going to lie, I sent Pat a text that said something like, “I’m feeling really overwhelmed.” Because I left the Friday before Christmas and then I missed the day after Christmas. And somehow I still felt stressed that I had missed too much work, it was the first two days I’ve taken off this year.

Patrick Moorhead: And like a good friend I was so empathetic that I told you how dumb that sounded.

Daniel Newman: Yeah, I appreciated that. That kind of abuse, you can only expect from someone that knows you well.

Patrick Moorhead: Yeah. Hey, we got a good show for you, aside from Dan and I bantering back and forth. We’re going to be talking about the New York Times suing Microsoft and OpenAI. We’re going to talk about Micron earnings, Cisco bought Isovalent. And we’ve done a lot of 2024 predictions in many ways, Dan and I, but we thought you wanted more, so we’re going to give you more. So we have our watch lists, our 2024 watch lists. And that could be companies that we think are definitely going to win it or technologies that we think are going to win it. And then there’s some that’s going to be like, “Eh, okay. Well, maybe not or might be suspect.” And we’re going to be talking semis, AI, and the cloud. So Dan-

Daniel Newman: Yeah, what are the risks? What are the downsides? You know?

Patrick Moorhead: Exactly. Downsides, upsides, all that stuff.

Daniel Newman: We got a lot of time. Let’s get after it, but–

Patrick Moorhead: Yeah, we are talking about publicly traded companies, so don’t make any confusion. Do not take this as investment advice, seek a professional. If you saw my portfolio, you would definitely not do what I do. Actually, I don’t trade in tech companies that are customers because of all that NDA information I have, but you still would be sad and wouldn’t do anything that I recommended.

Daniel Newman: Yeah.

Patrick Moorhead: But let’s move forward, Dan. Okay, New York Times sues Microsoft and OpenAI. What is going on?

Daniel Newman: Well, we knew this was going to happen, so let’s just be really candid. One of the things from the moment that we started playing with ChatGPT, Bard and any other large language model that was made publicly available was, what does this mean for attribution? And how does that work going forward? And how do companies that used to get clicks make money? If you’re a publisher and you can get the 300-word abstract of the latest New York Times op-ed without actually having to go or pay for The New York Times, what do we do? And so Pat, I don’t know about you, but I’ve said from the day I really started to see the power of generative AI, I think when I went to the first Microsoft one back in February. I’d seen it before, but I really started to see how it could help a vacation plan come to life or it could provide an abstract on a big long health report.

I said, “Oh, this is going to get really interesting.” I said, “What happens with all this copyrighted material? How does it prioritize what material gets used? When material gets used, is it clearly demarcated from the content that it’s creating? Meaning what’s the source of material?” And so as a matter of time, what ends up happening is people are going to start reading abstracts and summaries on GenAI. And what they’re not going to do is click over to websites. And if you’re The New York Times or any other publisher and people stop clicking over to your site, guess what happens? Your advertising gets destroyed. And if your advertising gets destroyed, then your business model gets destroyed. And so by the way, Pat, I’ve said from the onset of this thing too, that this is going to be a test for Google. And hold your thought here, but Google’s model is you drive traffic, clicks and advertising and people pay to advertise against certain searches.

Well, same thing there. If you can go to an LLM and you can get the information summarized in a ChatGPT or Bard, how do they get advertisers to pay for that? And so basically what we’re looking at here is not so much about just The New York Times and OpenAI. What we’re really looking at is an inflection point and the beginning of setting a precedent around copyrighted and privately owned content, and how that content is protected on behalf of those that publish it, those that invest money. You and I can relate, we invest a lot of money in our analyst teams to create content. And we want to be able to drive views and reads. And now what happens is an LLM can crawl all of our content and it could share in an abstract our analyst viewpoints on something. And it’s possible our analysts won’t even get proper credit that their content was used in a tokenized response in an LLM.

So this is going to be really tricky, but here are two things I think are going to come from it, as we start to warm up for our predictions pieces. The first thing that’s going to come from it is this is going to be a massive moment in terms of setting precedent. And remember the genie’s already out of the bottle, so we’re not going back. I heard some really dumb punditry that this could be the end of AI. Meaning they’re effectively like, “If The New York Times wins, everybody’s going to stop.” We are not going to stop, this is not going to stop. So what’s going to have to happen though is there’s going to have to be an attribution and a compensation model. Because basically what’s happened is the entire world has become citizen journalists and creators of content that these LLMs are crawling and that is becoming product.

And that product needs to be monetized, so there will need to be an agreement created between OpenAI and every other language model that crawls the internet and uses the content to one, provide proper attribution. And two, every time it uses the content to create a response, there needs to be some sort of fractional pennies of compensation that’s going to go back to the publishers, the content creators, and the originators. Because it won’t just be written content, it’s going to be graphics and videos, and it’s going to be imagery, but that’s where it’s at. This is a big moment, it’s an inflection point, it was bound to happen, this will continue. And by the way, this is why some companies got so into the indemnification of LLMs. And this is also why we talk about the real value being in the private data that you have or the unique data that you have because the publicly available data is table stakes.

Patrick Moorhead: Man, you’re on fire on this one.

Daniel Newman: Was that good?

Patrick Moorhead: It’s almost like you’ve thought about this one a little bit. So I worked for a company called AltaVista, which was the top-rated search engine before Google came in and destroyed everybody. And there were a lot of conversations, including my stock grants. Yeah, it destroyed those too. But it was self-inflicted because AltaVista had this crazy idea of having unfiltered search results and what you’d get is a bunch of university junk as opposed to Google that would parse it for consumers. So anyways, we had the same conversation back then, which was, “Hey, will content providers allow us to crawl their websites?” And there were some legal challenges, and what ended up happening is once proper attribution was sent back in the form of search links, and when this kind of bid per link, bid per click, cost per click advertising model came in, everything settled down.

And then about 10 years later when Google started to do these little things called snippets, there was this conversation as well and that got resolved. This is pretty much the same thing with a little bit of a twist. And here’s what I mean by that. When it comes to the text version of what’s coming back, I do believe that if proper attribution is provided, people will link. When you’re using something like ChatGPT or Bing Chat or Copilot, whatever we’re calling it this week, or Bard, you’ll typically double click if you want precision because you want to learn more about that. And I do see that as continuing into the future, so I think that that’ll probably be resolved there. What I think is going to be really, really sticky though is when it comes to video, audio and photos.

It’s miraculous how when you type a prompt into Midjourney about your favorite Star Wars character, how much it looks like Star Wars content. It literally looks like, hey, they ingested all the videos, all the movies, all the photos, and they’re essentially spitting that back and they’re making money on it. That I think is going to be problematic. If you had a jury trial, you can imagine putting up Chewbacca on Midjourney and Chewbacca from the Star Wars content side by side, and it’s literally exactly the same, maybe a little bit more yellow in the Midjourney Chewbacca’s eyes, and they’re going to shut that thing down.

And on the tech side, Dan, I really did a lot of thinking and okay, maybe I talked to an IP lawyer in the family as well, who said, “Hey, as analysts, we digest content that’s copyrighted and people pay us to give our opinion on it, right? Whether it’s end users and enterprises or the tech company.” How is that different from OpenAI ingesting copyrighted content and giving a summary back and charging you for it? Dan, I don’t know if you have thought through that. And the only difference is the person processing it in the analyst case is a human and the other is a machine. We’re all charging for the output of that summary and that opinion. Kind of makes you think.

Daniel Newman: Well, if you can get them to pay the same and not actually have to sit through the inquiry, that could scale nicely. So you’re talking about if you invented the future AI. Well, that’s interesting.

Patrick Moorhead: Well, actually what I’m thinking about is the difference between what we do today as analysts, where we ingest content that’s copyrighted, a press release, a PowerPoint, right? It has trademarks all over it. And we write something and we get paid for it. It’s kind of the same, the only difference is humans are processing it and outputting it versus a machine.

Daniel Newman: The value is in the person that produces it.

Patrick Moorhead: Yes. Yes.

Daniel Newman: And so that’s the thing is, I guess the two sides of this is the New York Times pays for someone that’s a reputable, respected opinion or voice, that’s the concept. And then they create it, and then the system uses it to create a summary that doesn’t truly attribute where that knowledge came from. And so attribution just becomes really important. But I also believe in a world where information becomes more easily created and digested that the person that creates it actually becomes increasingly important.

Patrick Moorhead: Yeah.

Daniel Newman: The role of the influencer, and when I say that in an affectionate way, not in a critical way, is hey, an author, a journalist, an analyst, a pundit, they’re creating the backhaul of information that the web is processing to simplify the distribution. It’s complicated, Pat, it’s not straightforward.

Patrick Moorhead: Good stuff. Hey, let’s move into a little bit of an easier topic, and that was Micron Q1 earnings. Pretty much straightforward, I got on Yahoo Finance and pontificated for a while, and I think you did some media as well. So here’s the state of the entire memory market, which is a really painful two years where there was oversupply. And then what happened is device makers, PCs and smartphones declined. And then you even had some declines in non-AI data center memory and storage. So that’s why you see these companies, whether it’s Micron or Hynix or Samsung, with negative gross margins. It’s really tough and some people automatically move to, “Hey, it’s a poorly run company.” No, folks, this has been the cycle of pain for this market for 30 years. And here’s the good news: pricing is stabilizing across every company including Micron, and they showed that with their margins and also their profitability.

And I think the difference in how you show up for earnings has to do with cost control and investment control. And that’s exactly what Micron showed when you look at the bottom line, albeit still a loss, it’s a lot less of a loss. So they’re starting to get pricing power back, their operational, I guess I would call it operational excellence is key in here. And we all have to look at the future where I would say six or seven years ago, Micron was not a leader really in any technology that was out there, and I’m sure that’s debatable. But then with new senior leadership, they really turned the corner and started putting the best of technologies out there to give folks like Samsung and SK hynix a run for their money. And as I look at the… I think there’s going to be a second half supercycle for AI PCs and AI smartphones. I think it’ll give them a lift and it will all come together. That’s my thought.

Daniel Newman: Wow. Well, I think what you said that’s probably most prescient is that this is a boom or bust industry, substantial boom or bust industry. And the bottom line is that these companies need to rake profits in during the booms. And they need to be very cautious during the gullies, and the gullies are substantial and they can be long-lasting. Now, one thing that’s happened as technologies proliferate is the cycles have become more volatile and more contracted. Meaning that the period of time of which these booms and busts happen seems to move quicker because innovation cycles move quicker. So for instance, the innovation cycle that we’re moving towards with AI and AI PCs is going to fuel demand for memory. And that we know that in 2024, second half, the AI PC is going to become substantially available in the market. That’s going to create a supercycle of buying for PCs both commercially and consumer.

We also know that with Apple, every so many generations of iPhone there’s a real supercycle. They’re iterative, iterative and iterative, but then they’re going to… Or even you look at the Vision Pro which came out, which is going to create a whole new wave of demand for silicon for XR and the Metaverse, which is apparently coming back to life, maybe possibly if Apple decides to drop the price below the monthly payment of a Ferrari for one of those things. But anyways, the fact is that if you’re sort of a market prognosticator, memory is a really important watch item to understand what’s going to follow, because the demand for memory starts to jump as the expectation for volume in certain areas starts to rise.

And so memory has been pretty much attached lately to just compute cores that tie to AI. And I’m talking mostly the data center, which has really, really created a lot of friction for Micron and Samsung and SK and all these companies. So Pat, I think you nailed it on their results themselves, but I think Micron’s done a lot more innovating in the last several years. I think they’re well set up for these trend lines that are coming. And good report, overall good momentum, not a great report, but a good report and their momentum is good. And it looks like the outlook for memory as part of this silicon ecosystem is going to have a better year in 2024.

Patrick Moorhead: Yeah, although most of the time I don’t think the stock market knows what they’re talking about. They’re up 70% this year, so I think they get it even with negative gross margins. So all right, let’s get into a different topic here. Cisco acquires Isovalent. Dan, why is Cisco doing this? And who the heck is Isovalent?

Daniel Newman: Yeah, so this came at an interesting time of year, the week of Christmas, but you know what? Sometimes you make acquisitions right into the Christmas or into the holidays, right? The business goes all year long. I might make acquisitions heading right into the holidays too, I can’t promise. But look, this is about cloud, security, and hybrid fabrics. Cisco is going to play in this space of bringing hybrid multi-cloud fabrics together. One of the things that you and I have talked a lot about, Pat, is that every cloud that you play in requires sort of a different stack. And so while companies have moved to multi-cloud, the ability to do this seamlessly from compute, networking, security, Cisco has an opportunity to win in that area. And what I mean by that is, look, this is where HPE is playing, this is where Dell is trying to play and Cisco’s trying to play here too is saying, “Let’s simplify the process. Companies are going to go multi-cloud. Let’s simplify the process of enabling them to go multi-cloud by creating a stack that allows you to do the multi-cloud thing with less friction, less complexity.”

I’ll be candid, Pat, I don’t know a lot about this company, I did not follow them, I was not tracking them. It appears to be an undisclosed amount, which means it was probably a fairly small deal in size. But they hit some of what I would call the right words: open source, cloud native, and bringing together networking and security. Which by the way is something Cisco uniquely has done very, very well. And that’s addressing the fact that they are network and security together, which some of the other companies I mentioned have not been as focused on. Now some of the other words that they talked about beyond open source are mesh, which is another thing. And then of course Kubernetes. So this company apparently simplifies the connection of Kubernetes clusters across different hybrid multi-cloud environments.

So I think in the end, this is bolt-on. This is what Cisco does very well: they buy smaller, lesser known but valuable assets that they can put into their overall ecosystem. And then they have a world-class sales force to sell them and add value to their current customer and client distribution. It also helps the company drive a stronger footprint in cloud and security, which are two areas that Cisco is very focused on. So again, I need to pay more attention to how this plays out. It’s probably not going to be a big headline gainer because it’s not a company that’s well known, but Cisco’s pedigree is finding these gems, integrating them into the business and expanding their portfolio, which drives revenue growth.

Patrick Moorhead: Yeah, I want to sum this up by saying you had me at hybrid multi-cloud fabric. Those are the Pat Moorhead words, but if you’ve listened to anything that I’ve pontificated about over the last decade it was that cloud was going to mature into a public model and a private model. And there will be services that go across whether it’s on-prem data center, on-prem edge, a sovereign cloud that’s on-prem, multiple IaaS and PaaS providers, of which the average Fortune 500 company has two and a half. And then with every acquisition you might make, you’re probably going to end up having four or five in the end, because the expense to collapse those into two vendors just won’t make sense over time. And what enterprises have done is they’ve stood up separate teams across developer or DevOps, applications, networking, security, kind of like stovepipes, and that’s not very scalable.

So this advent of hybrid multi-cloud fabrics, which provide a service across all your clouds, takes a lot of work. But we’ve seen companies like Cisco, like Cloudera, Red Hat is hybrid multi-cloud for application transport. We’ve seen VMware do it as well across application transport, security, and even networking. So yeah, this is exactly what this is and it makes perfect sense to me. And I’m glad Cisco is doing it, and the hard work now begins to integrate this into their security cloud. Which by the way, security cloud is actually a security and a networking cloud, which is a hybrid multi-cloud fabric. So good job Cisco, this makes a ton of sense. And it’s also very much in alignment with Cisco’s innovation strategy, which goes all the way from seed round. They actually have a seed round funding mechanism, which most tech companies don’t have. They might start round A and round B all the way to organic innovation and creation.

So hey, let’s move into the back half of this podcast, which we are calling the 2024 Watch List. And we’re going to be talking semis, we’re going to be talking AI, and we’re going to be talking cloud. And we picked those quite frankly because that’s what most people want to talk about right now. There’s a lot of other really interesting stuff going on right around the hybrid cloud. We could have done one of those and maybe we’ll do that, I don’t know, next time. But essentially what are some of the puts and the takes, the pros and the cons, what are the conversations that are going on there? So Dan, I’m going to call my number for semiconductors. Look at me just going right in there. Oh, that’s the order we go in. So it was kind of planned that way, but anyway-

Daniel Newman: Oh, yeah.

Patrick Moorhead: Yeah, so here’s the thing, and I did my watch list in terms of two parameters, right? Which was market share and momentum. Don’t confuse this with a stock pick, okay? Because God knows what Wall Street is doing, is it baked into the valuation? That’s not what I do, I do more market and product fit type of stuff. So first thing in semis is AI is going to lift all boats, it’s going to be in the data center, it’s going to be in PCs, it’s going to be in smartphones, it’s going to be on the industrial and commercial edge. And you will need new semiconductors to plow that in. So I think some really interesting companies out there that are going to gain share and momentum are going to be AMD, in data center and AI PC. I think they’re being superconservative with their forecasts. Intel, AI PC, I think you’re going to see evidence of 2025 data center competitiveness with a new GPU. And we’re going to see some foundry action. Qualcomm, AI phone, AI PC, they’re getting zero credit for that right now.

And like I said previously, I think we’re going to see a supercycle out there. And I think the memory market’s going to get back to normalcy, and we’re going to see the Samsungs and the Microns do well. Some of the ones that are a little harder to pick would be NVIDIA, which is they crushed it for so long and they’re a trillion plus valuation company. But the question is can they maintain the near 100% non-China market share in AI accelerators? I don’t think that’s going to be possible if AMD gets to 10%. Then there’s companies like TI and Microchip, I don’t even know what they’re doing in AI. And Texas Instruments is kind of the sleepy company that doesn’t want to say anything to anybody. Kind of interested in spaces that are kind of boring, vital but boring. They’re leveraging some of their manufacturing capability to have lock-in and exert pricing pressure on customers.

And then Microchip, where do they even play in the FPGA market? And how do they accelerate AI? And how are they going to do this year? It’s kind of a mystery to me. So that’s it, that’s my Pat Moorhead semiconductor watch list for 2024.

Daniel Newman: Yeah, I think you hit on some good points and since you sort of went first, I’m going to try to be additive where I can. What I’ll say is I agree, it’s become sort of… I was on CNBC yesterday and they kind of asked me my winners. And my whole comment about NVIDIA, it’s just like, “Is that even a topic?” It’s not because they’re not a winner, it’s just because everyone knows that one. And by the way, it’s up a million percent, jokingly, but… And now what they’re going to have to do is how do you keep growing at triple figures over triple figures? We saw this with Zoom during the pandemic, it’s like they came back to earth when they started growing 20% and everyone was like, “Well, this company sucks.” It can’t continue forever, and they actually do have real competition now.

So what I think next year, a couple of my key themes are, one is real competition. Meaning that they did get out to that fast start on GPUs, but there’s real competition there, and it’s coming from multiple angles. And that’s going to be a theme of the year, you’re going to have real AMD competition, you’re going to have real ASICs competing, you’re going to have real homegrown silicon. Make no mistake, these cloud providers want to sell lots and lots of homegrown silicon. And so that’s definitely going to be a trend line. I’m all for a big, big buying cycle for PCs, those companies have had a really tough few years. And the AI PC, which still by the way has some work to do in terms of explaining what it really enables for its users, but they will get it right this year and they will create a cycle.

I think the cycle starts a little later than everyone thinks, I think everyone’s thinking about the middle of next year. I think it starts out a little slow, but I think it’s going to pick up by the end of next year and by next Christmas we’re going to see a real tidal wave of buying.

I’m going to make a bit of a call-out on one company because everybody kind of has been very positive on NVIDIA and AMD, but I actually think Intel’s turnaround is in motion. Now, for a few years we’ve been saying it could turn around. I think that AI Everywhere event was a big moment in the Pat Gelsinger reign as CEO of Intel. I think this is the moment, you’re starting to see the market is understanding and accepting that it’s meeting its goals, it’s meeting its timelines. And the US and its allies around the world benefit significantly from a strong Intel. And for Intel, even the fact that as we move back to inference, a lot of inference gets done on Xeon. And the cycle of slowing CPU sales in the data centers should see a turn as all this infrastructure gets put into place and we move back towards regular data center compute that’s going to be utilized to inference a lot of this data.

The only thing I’ll say is that automotive is very interesting. I think Qualcomm, which by the way will have an AI PC and has a very powerful Oryon CPU. But on the automotive side, their work and all the effort and investment is going to start to pay off with growth. That company is coming to a really good inflection point. And by the way, it’s been a rough couple of years for Qualcomm, but automotive has held strong throughout. And all that design pipeline, hundreds or maybe even thousands of semis that are going to go into automotive. What they’ve built with their Snapdragon Ride and the automotive platform and all the design wins should start to really bear fruit. That along with a supercycle for phones and a supercycle for PCs, should set up a bit of a comeback after multiple years of depression. So lots of other things I could say, but there’s some things to watch in the semi space.

Patrick Moorhead: Good stuff, man.

Daniel Newman: There’s a lot there, man. We really set ourselves up to leave a lot of good stuff out, didn’t we?

Patrick Moorhead: Yeah, I’m especially glad that you hit on the CSP custom silicon, right? We’ve got Amazon doing Trainium and Inferentia. We’ve got Microsoft showing up with Maia, which is interesting. You see some increased vigor with Google and the TPU. It’d be interesting to see what OCI does, right? Will they create something on their own? My guess is they’re going to partner, kind of like they’re doing on Arm semiconductors with Ampere, in which they have an investment. And it’d also be interesting to see if Oracle makes a play for one of these ASIC companies as their valuations are going down. But anyways, let’s move to the next. Sorry, what were you going to say?

Daniel Newman: No, I just said that’s an interesting one. It’s an interesting one to look at.

Patrick Moorhead: Yeah, the only one that hasn’t shown its cards on that. Hey, let’s move to the next one. Seems like we talked about a lot of AI and semis, but overall AI, Dan, what’s your watch list in AI for 2024?

Daniel Newman: Yeah, so kind of like I did with NVIDIA, I’m going to skip some of the obvious picks. The obvious pick for AI in my opinion is Microsoft, so I’m just going to run past that one and talk about a few other companies I think are really interesting. First of all, I think those that got the indemnification right and that focused on private data are going to be very interesting plays in 2024. I think IBM led the way on governance and we continue to say, “Well, look, companies are going to have to demarcate their private, or what I call unique proprietary, data from data scraped off the internet.”

That’s what this New York Times lawsuit is going to teach us: it does come down to who has richer data. Then who has the platforms, and who’s building technologies that enable grounded, vectorized data sets to be implemented and utilized securely? And it won’t just be about indemnification, it’s going to be about actually building technology that doesn’t let you get into legal trouble. So by the way, that’s going to be a really important conversation in AI, that’s going to be a watch item all year long.

We didn’t talk about this company, Pat, in semis much, so I’m going to talk about it in AI. But I actually think Broadcom has a really interesting year ahead of it, as we do see this pivot from AI data centers to the broader data center. All the movement of data networking is going to be really, really important. So Broadcom has a very interesting play from an AI standpoint of who moves the data? And that goes into the companies in our infrastructure space, it’s HPE, it’s Dell, it’s Cisco, the network itself, it’s the data center construction, the edge. We’re going to have to move data around at a very, very high rate with very low latency, and we’re going to have to figure out ways to do this with economics. And so those economics are going to become very important, it’s going to be how do we do AI and make it affordable?

Now, one other item I’ll say, and people were not talking about this, Pat, but Futurum Intelligence actually released a report, I think it went live today, on what we call our decision-making data dashboard for AI. And this is going to be a topic for the year, but companies are very early in their implementation of AI. We heard Chuck Robbins at Cisco last quarter talk about the fact that they had lowered revenue expectations because basically customers had overbought this infrastructure that I’m talking about, and now they have to put it into commission to actually start using it. We found that we’re seeing a 300% rise in companies that will be spending multiple millions of dollars on their AI strategies next year.

So what’s happening now is, this year it was all about the infrastructure, that was all the buying. The buying wasn’t actually companies using AI. And so who are they planning to use? Well, our data basically calls out the winners here. And I told you it was Microsoft, but do you know who the number one IaaS provider for AI is?

Patrick Moorhead: Yeah, it’s AWS.

Daniel Newman: It was AWS. But there were a couple of interesting ones in there, IBM actually came in the top five partners for end-to-end AI. And a couple others I’ll mention, I think Salesforce has one of the few actual productized AI offerings that they can sell and monetize. And then I think in AI, ServiceNow or any sort of workflow automation tools, iPaaS, as well as services workflow automation, are going to be very important because that’s going to be an early iteration of AI. It’s been ongoing and companies are going to double down, because you’re hearing rumors right now, Pat, that 4 in 10 companies are planning to meaningfully cut their headcounts next year in trade for AI use cases. Another big trend line is going to be productivity and efficiency in trade-off of new hiring.

Patrick Moorhead: Good stuff, man.

Daniel Newman: Thanks.

Patrick Moorhead: But the great part about AI is you left some oxygen.

Daniel Newman: Oh my god, there’s a million directions you can go, dude, I bounced all over the place.

Patrick Moorhead: So I’ll take this, two things, so some technologies and companies that I think are going to do well in what I’ll call wave two. We’re currently in wave one, where there has been a lot of build-out, not a whole lot of money being made aside from the NVIDIAs of the world. And it’s kind of a… People are driving down demand for basic enterprise stuff to put all that money into AI, yet AI servers are being stranded because they can’t get GPUs, it’s like a Catch-22. So yeah, some of the things to watch are these multimodal capabilities. We are starting to see that, and we’re going to see a lot more of it in 2024. And what does that mean? What that means is that a single chatbot or a single agent can do video, can do audio, can do photos, can do text all in one. And more importantly, it can find interrelationships between them to give you better answers.

There was some questioning of the blue duck in the latest models that Google brought out and how it did it, but that is a multimodal capability that they were showing off there. I think that we’re going to see… Training will continue, but we’re going to see a lot of action at the edge and inference at the edge. So some of the winners here are going to be PC companies, edge infrastructure companies like Dell and HPE. And you’re going to see IBM be a really interesting play here as really the enterprise solutions provider for AI for regulated industries, where you can’t be wrong, zero risk. I also think you’re going to see some of the magic move out to the non-Microsoft SaaS like Adobe and Box. I think what’s kind of on the outs could be employees that are in accounts payable and accounts receivable, right? Dan, you talked about companies wanting to dial back the amount of resources. I think legal is going to be transformed in 2024.

I’m talking to a lot of lawyers that are doing some of their first passes of research against highly optimized models for legal. And don’t confuse those with the dumb lawyer who used ChatGPT to cite cases that never even existed, right? These are highly optimized models out there. I alluded to this a little bit with The New York Times and OpenAI lawsuit, but video and imagery outputs that break copyrights. I’m just envisioning myself or the jury pool looking at this video versus this video, right? This photo versus this photo. Saying, “Yep, that was ripped off.” So I think that we need to keep an eye on the companies that are doing that. And do they go poof? I don’t know, but they could.

Daniel Newman: Yeah, we didn’t really even talk about it too with AI, Pat, but a real ’24 item to watch, the election. I don’t want to go down that rabbit hole, but how can you not mention, woo, deep fakes, misinformation, all kinds of different ways that AI is going to be used to mislead. Of course, it could be used to help people too, it doesn’t all have to be negative, but I just don’t see people putting in the work to figure it out. And I see a lot of information being amplified and algorithms taking wrong info. And gosh, it just seems like it’s a little… Doesn’t it just seem like a disaster waiting to happen in some ways?

Patrick Moorhead: Yeah, the big companies are girding to block stuff and monitor stuff, but it is going to be spy versus spy, and I just think it’s going to be a timing thing. Most of the social media platforms have turned off political advertising a month before the election, which is an interesting play. They’re leaving a bunch of money on the table, but I also think it’s trying to circumvent maybe some of the paid advertising and some of the fake videos and fake news.

So hey, let’s jump into the final watch list and that is about cloud. I’m going to call my own number on this. And I think some things to keep an eye on, and you got a little bit of a taste of it with the Isovalent piece, is hybrid multi-cloud fabrics. The public cloud is going to grow and the hybrid cloud is going to grow. But what I find really interesting is if I look at the ARR numbers and the percentage of growth in the hybrid cloud, it’s actually outpacing the average growth of the public cloud. Nobody wants to talk about that. And oh, by the way, I don’t get sucked into this repatriating-workloads conversation, because net-net more workloads are going to the public cloud than coming back. Okay, so I’m not going to get caught up in all of that. But we’re entering kind of this phase two of hybrid multi-cloud, where no enterprise wants to get version 1.0 of a hybrid multi-cloud fabric.

But we’re now getting into rev two and rev three of all of these. And whether it’s hybrid multi-cloud fabrics like Cloudera for data, whether it’s networking and security with VMware and Cisco, and even backup and storage companies and data protection, right? All over the place. I think you’re going to see just some massive growth, maybe a 100% increase in sales for these hybrid multi-cloud fabrics. And I’m pretty excited about this. I also think in cloud, AI is going to lift all boats. Okay, and then the only question is do enterprises have enough money to do the standard non-AI work? I think that’s still a TBD, but one of the biggest things that cloud companies are trying to wrangle is simplicity. This stuff is hard, right? And you’ve seen things like Bedrock, SageMaker, Vertex, solutions like this come out.

I think those are going to, for generative AI, get even simpler. And in fact, with Bedrock for example, you don’t have to pick the kind of semiconductor that you want to do the training or the inference. And I think what you’re going to start to see is more simplicity, but also being able to choose your service and parameters around an outcome, right? “Hey, I want the cheapest. I want the highest performance. I want this.” Without actually having to do what I call bag-of-parts AI, which requires you to have a research team, which limits AI to only the largest institutions that are out there.

So I’m expecting a lot more simplicity to come in. The first year was really just getting ready for this stuff, but it’s just too hard. And at the conferences that I’ve gone to this year, I didn’t get a lot of definitive answers. In fact, it was more, “We’re working on it.” So I think the working on it is going to translate into simple. And I think time-and-motion studies, these cloud companies have to advertise this in terms of, hey, 27 fewer clicks or 100 fewer decisions that you need to make, to get these medium-sized companies into the boat. So my final comment is I think we’re going to see even more companies doing more definitive silicon for cloud to either lower costs, maximize performance, or manage supply chain.

Daniel Newman: Yeah, some very good takes there. I’m going to try to simplify a bit on the cloud. Obviously that’s taking a lot of oxygen, from IaaS, PaaS, SaaS, and you could also go to the implementers. By the way, our new data shows that on a two-to-one basis, those that are going to implement AI are going to do it with Accenture over any other SI. That’s how big Accenture is, it’s crazy, but-

Patrick Moorhead: Well, that also says complexity.

Daniel Newman: The complexity is substantial. But look, I actually think to some extent FinOps is going to be really important in cloud this year, as companies continue to grow and the growth of cloud stays substantial. I do think we’re going to see it falling into a 20-percent-range CAGR even as we come back. And I think that’s a combination of FinOps creating efficiency of buying and also companies leaning into AI and doing it at different layers of the stack, meaning it’s going to be more distributed. Here’s an interesting thing: I think AI sees a big gain in SaaS. Meaning companies that democratize it and make it available at the application level are going to be very popular. You’re seeing the Oracles and SAPs and companies like that actually democratizing their best AI features only if you’re running their application in cloud. Which, for some companies that have long stayed on-prem, could be that forcing function that finally moves them to cloud.

But you also see, I made a pick in AI of Salesforce, and the reason I made that pick is because Salesforce is really easy to consume… We can argue how easy it is to use, but you can say it’s a very easy-to-consume, SaaS-based AI. Meaning if you want to know something like customer churn, you can run your SaaS with all your data in there from your CRM and it can give you churn data. If you want to know something about supply chain from your ERP data, you’ll be able to run Oracle’s Supply Chain Management tool and get insights. This is that practical implementation and utilization of AI. And these companies can bake it in where it’s either more sticky or it adds a nominal amount of cost per seat to be able to use all this AI. So people are going to be using and consuming AI in the cloud through applications for the enterprise. I think that’s going to be a really, really big thing.

So the other thing is I do think that AWS actually picks up IaaS momentum. And I know that’s crazy because they already have IaaS momentum, but here’s the thing, I just think ultimately the open, distributed model approach is actually where people seem to want to land. So this kind of crosses into AI, but it’s kind of always the same thing, Pat, it’s like we always talk about hybrid and multi. Nobody really wants to be 100% in on anything because they want to make sure that they stay flexible. And so with AWS kind of leaning all in on this open AI approach, not OpenAI as in the company, but open AI as in multimodal, which is something you were talking about, people are going to be comfortable leaving their infrastructure there. They’re not going to feel the requirement to change. And they already had such a big lead that I just think you see momentum continue to pile on for cloud adoption there.

I do think there’s a surprise in cloud that’s going to be Oracle, they continue to outpace and outstrip the market for growth. That’s partially because they’re smaller and partially because they’ve found a value equation, which goes back to my FinOps comment. People are wanting to do things in the cloud, but they are looking to find financial efficiency. And Oracle has sort of reversed the model of other cloud providers: where others charge a lot, they charge less, and they’ve been able to increase utilization of IaaS by being somewhat of an efficiency gainer there. And of course, 400,000 companies run Oracle and most of them are running it on-prem. And so the process of moving to cloud just gives a natural momentum to Oracle and its broad offerings in cloud. So there’s a lot more we could probably go down, Pat, but it’s 7:5… Oh, sorry, it’s 9:59, I’m still running time zones, and I wanted to finish this one on time. I thought doing the last one of the year and actually coming out on time would be a big step for us.

Patrick Moorhead: Dan, what are you looking forward to in 2024 for your business, Futurum Group?

Daniel Newman: Oh man, I get to do the plug. Well, first of all, I’m very excited about the way the business is shaping up. Our two new partnerships that we announced are two of my highlights. So The Six Five is going to be bigger and better, and it’s going to be more present and more prevalent with more talent. Super excited about that. Our venture, Signal65, we’re going to really start to lean on this, and this is our testing and validation business, Pat. One of the things I really want to see The Futurum Group do, between its Intelligence, its Lab and its Media, is use our voice to really become the highest-integrity, most trusted, data-backed validation company that’s entering, disrupting, changing the face of the industry analyst business as we know it. There’s a lot of work to be done. Every day is a humbling experience here, but we hired over 50 headcount in 2023, and we acquired seven companies.

And I couldn’t be more excited to see all this investment, the acquisition of Tech Field Day, bringing vendors, customers and influencers closer together, bringing The Six Five and the LIVE team to CES and MWC. Pat, we have an exciting year, I appreciate you giving me the chance to plug. And by the way, I couldn’t be more excited about the things you and I are doing together as part of TFG or as you like to call it FGP and our long-term growth strategy. How about you?

Patrick Moorhead: I’m super excited about expanding The Six Five and the Six Five LIVE format, plus all the new talent that we brought on. I’m super excited to introduce The Six Five audience to them and The Six Five sponsors and things like that. We’re going into CES in a little over a week. Yeah, a little over a week, I go out in 10 days, we’re going to be kicking off Six Five Lives, I’m super excited for that. Signal65 is really a culmination, an expression of kind of my belief system that it’s all about measures of merit. And I’m excited to bring your strength, Ryan’s strength, to bear to really bring some value add that quite frankly is required in this industry. Particularly when you look at AI, questioning the measures of merit and, “Oh, it should be this benchmark.” “No, it should be this TCO model.” Or something like that.

I’m really excited to not just leverage what I know and my experience, but actually bring lab resources to the table that can provide some definitiveness to that. And measures of merit will also be debated, as they should, but it’s better than just raw opinion. So yeah, I’m super excited about that, and I’m probably going to double the amount of analysts at Moor Insights & Strategy. So I’m retooling the mothership as well.

Daniel Newman: I love that. And I love what you’re saying, look, whether it’s the data as we were sharing throughout this pod, the intelligence, or the validation we do at Signal65. I love having more and more data and real-world testing to back up the thought leadership that we do. And I think it really sets us apart from a lot of what’s out there that’s either repeating others’ data, sharing data that’s not first-party, not firsthand, or basically opining without any real data or research at all. I think increasingly great opinions need to be shaped by real research. We’re seeing that debate in the academic community right now as certain leaders of large institutions face questions about whether or not they had real data. We want to have real data here in what we do at Moor and at Futurum, and then of course our joint ventures, The Six Five and Signal65. Great year, buddy. Great year.

Patrick Moorhead: Great year. Hey, shout out to the audience. I think we generated a billion views here on The Six Five between our sponsored shows and our unsponsored shows, we’re super excited about that. And it’s not just about views, quite frankly, because you can buy enough views to have 10 billion. But that’s really not the point, it’s about providing high-quality content that you can’t find anywhere else. And that is really the charter here.

So I want to thank everybody for coming with us for 2023, and hopefully you’ll stick with us for 2024. Hit that subscribe button. If you want to follow us, we’re pretty much on every platform. We’re on X, we’re on LinkedIn, we’re on Facebook, we’re on YouTube, wherever you would like to watch us. So appreciate that. If you have any feedback, you know where to find Dan and me, we’re very public in what we do, and if you can’t find us-

Daniel Newman: Are you on the ski slopes skiing it out with some of the most famous and good-looking people in the world?

Patrick Moorhead: We’ll see, I saw some CNBC folks in Aspen. So yeah, I’m going to be taking a real break here, going to be doing some skiing. So I may or may not do this podcast next week, but I’m going to try.

Daniel Newman: I’ll miss you buddy, I’ll bring in one of the new hosts, we’ll have Shrout, we’ll have Ryan come in and do the co-hosting with me. That would be a Six Five first. I’d hate to start the year out without my besties, so…

Patrick Moorhead: Buddy, I hear you, I’d have to pop in-

Daniel Newman: You’re going to have to pop in.

Patrick Moorhead: … to say hi.

Daniel Newman: All right.

Patrick Moorhead: All right, thanks everybody.

Daniel Newman: See you everybody.

Patrick Moorhead: Bye.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x Best-Selling Author including his most recent book “Human/Machine.” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and Former Graduate Adjunct Faculty, Daniel is an Austin Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
