NOTAM Outage, Intel Sapphire Rapids, TSMC Earnings, HPE Acquires Pachyderm, ChatGPT to Azure – The Six Five Webcast

On this week’s episode of The Six Five, hosts Daniel Newman and Patrick Moorhead get together to discuss:

  1. NOTAM Outage Blame Game
  2. Intel Announces Sapphire Rapids
  3. TSMC Earnings
  4. HPE Acquires Pachyderm
  5. ChatGPT Coming to Azure
  6. Google NRF

For a deeper look into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.

Watch the episode here:

Listen to the episode on your favorite streaming platform:

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.


Daniel Newman: Hey, everybody. Welcome back to another episode of The Six Five podcast. It’s Tuesday afternoon. We are off schedule, but it’s myself and Mr. Patrick Moorhead. We are back for the first regular what’s-going-on-in-the-Six-Five episode of 2023. Pat, no excuses, except we’ve got a lot of excuses about why it’s the 17th and this is our first regular episode of the year. How are you doing, buddy?

Patrick Moorhead: I’m doing great. Daniel, let’s not forget January 2nd. We did an event, the best of 2022 and a teeny tiny view into ’23. So don’t be too hard on yourself, buddy. We’ve done seven Six Fives since the beginning of the year, but we are back and we do owe it to our audience to get in sync on Friday just like we always do at 9:00 AM Central, but I got to tell you, I am super excited about this one.

Daniel Newman: Oh, yeah, no, absolutely, man. You’re right. We went right into CES. We did a whole lot of Six Five there, but I’ve got a little secret for you. I love our partner shows, but my favorite, favorite, favorite show is the original. There’s something about the OG, and the OG of the Six Five is you and I picking our topics, it’s the social banter, it’s the every week getting deep on what’s important and getting into the analysis. The whole Six Five is going to be every day, every week because it’s huge and it’s getting bigger, and there’s going to be some more surprises this year. Can’t share them yet, but, Pat, we are back. Great episode today.

We’re going to talk a little bit about NOTAM. If you don’t know what NOTAM is, it’s not a company that doesn’t believe they have any market available to them. We’ll tell you more about that in a little bit. Intel had some launches. TSMC had earnings. HPE made another acquisition. ChatGPT was all the rage for about the last 30 days, and we’re going to do a little teaser on NRF before we probably talk a little bit more about NRF on Friday.

For those of you that have been under a rock and this is the first time you’re listening to the Six Five, the Six Five is six handpicked topics by Patrick and myself. Five minutes each, sometimes 10, but the whole idea is that we’re going to do deep analysis, minimum news. We know you can get the news elsewhere. We get ours on Twitter, but you can get yours wherever you feel like you want, but when you want analysis, you come here.

Now, reminder, we do talk about publicly traded companies, but this show is for information and entertainment purposes only. So don’t take anything we say as investment advice. Pat, I’m going to steal a line from something that you say sometimes and go, I’m going to call my own number, something Tom Brady did very ineffectively last night when the Dallas Cowboys sent him home hopefully for the last time. I mean, sure, he’s the goat, whatever. I am done with that guy. Into the sunset, my friend. You had a good career. Go Cowboys.

Patrick Moorhead: Ouch.

Daniel Newman: Ouch. Sorry.

Patrick Moorhead: Tom Brady represents all washed up old men like me.

Daniel Newman: Pat, we’re over 40, remember? We’re invisible now forever. So let’s talk about NOTAM. Like I said, we’re not talking about no total available market. We are talking about the FAA’s Notice to Air Missions system. Over the last few weeks, we’ve seen some interesting systems failures that have brought attention to legacy or older technologies. While we are not necessarily covering the FAA, we were all affected. In fact, I was traveling out to California that morning. You and I were exchanging some texts and you said to me, “Hey, you might want to check your flight and make sure you’re actually leaving today.” This is the system that basically gives clearance and all kinds of other instructions to every plane that’s taking off. This failure caused more than 1,300 cancellations and 9,000 delays. Unlike the one at Southwest, this was not unique to any one particular airline. This was a systemic issue for the entire Federal Aviation Administration that affected every airline, every plane, and it could have been an even bigger disaster.

As technology analysts, the first thing you and I do is we go, “Well, what technology is the software running on?” So if I’m not mistaken, Pat, you and I are modernizing-technology guys. We like talking about cloud, multi-cloud, hybrid cloud, the future of architecture, but we are also these rare advocates that the mainframe is far from dead and, in fact, there are still many applications in many highly critical industries and use cases where the mainframe is still the best way to make sure that you have a resilient, scalable system. Everybody’s immediate assumption, probably because of what happened with other large outages and failures of older and more antiquated technologies, was that for some reason, this was also a mainframe failure.

Guess what, everybody? Pat, I’ll let you dig into this a little bit more. It wasn’t. So the mainstream media basically came out and blamed this on the mainframe. For companies like IBM and Broadcom and others that have significant businesses in the mainframe, that’s a bit of a shot across the bow because the whole idea of the mainframe, as I said, is all about secure, resilient, scalable. When you’re running a global transaction system or payment rails, it never goes out. So when a system like this goes out and everybody says, “Oh, it’s a mainframe,” that’s a big problem.

So while some people were really upset about this whole deal because their flights got delayed, other people turned to technology and started making claims like, “Well, if they were modernized or running in the public cloud,” which by the way does go out sometimes, “this would never happen.” Well, guess what? That wasn’t what happened. We wanted you to hear it here first. Pat, what color do you want to add to this?

Patrick Moorhead: Well, first of all, the press blew this one and did not follow up on it. In fact, they let a guy named Tim Campbell, a former SVP of air operations at American Airlines and now a consultant in Minneapolis, carry the story. He was quoted in no fewer than nine of these articles saying, “So much of their systems are old mainframe systems that are generally reliable, but they’re out of date.”

So first off, factual issue: NOTAM does not run on mainframes. It runs on old Sun SPARC systems running the Solaris operating system. I had Sun SPARC systems in my data center when I was at AltaVista in 2001, and let me tell you, those are not mainframe systems, those are gigantic servers. Some might even call them web servers. So factually, it’s inaccurate, and even with the Southwest outages, the mainframe was being blamed there. So first of all, there’s a factual issue going on.

The second thing is that it’s important not to confuse older systems as bad and newer systems as good. I would like to say that newer technology does not go down, but then I would be lying. I would be inaccurate. In fact, even the cloud goes down, as we’ve seen, and a lot of those outages have to do with network misconfigurations, right? We’ve seen Azure go down. We have seen AWS go down. Heck, on the security side, we’ve seen customers leave S3 buckets open and create gigantic security issues. Larry Ellison at Oracle always talks about these systems being autonomous.

So the fact is that there’s new technology that can go down and there’s old technology that goes down. The key part here, though, Daniel, and you and I have riffed on this a lot, is about building resiliency, right? It’s a platform and it’s a holistic approach. It’s integration of the new stuff with the old stuff. It’s skill gaps and skill assessments. So I’m going to be talking and writing about this a little bit more because it really piqued my interest, and being a guy with gray hair, I feel like I can bring a little bit more perspective to that. That’s all I got.

Daniel Newman: Yeah, that’s a good one though. It’s always interesting because these things get so many headlines, and in the race to report, there’s so much inaccuracy. This is where, I think, we always have a good opportunity as technologists and as analysts to provide a little bit of color.

Speaking of something that’s had a lot of weight and a delay, well, Sapphire Rapids, Intel’s next-generation chips for the data center, Pat, were announced live. You wrote a great piece on Forbes. This one’s yours. So I’ll let you go first, but big moment for Intel.

Patrick Moorhead: Yeah, big moment for Intel, a couple years late. We’ve both had the opportunity to talk to Pat Gelsinger, and I just had a five-minute conversation with Pat Gelsinger to understand it. I know why this is late. Intel took on a lot. Typically, what you don’t do in the same generation is change the node and radically change the design. Intel did both of those, right? They radically changed the design from a monolithic design to one that was distributed. That’s the first thing that they did. They also made one of the biggest node changes that was out there.

The reason this was late was the confluence of those, but primarily they had to do a lot of backporting from what they thought they were going to be on, more like an Intel 5 than an Intel 7. But the big takeaways for me were, first of all, it’s here. This has been shipping for months. It might be late, but it’s already here.

It’s all about acceleration performance. I think Intel outlined that they had eight different accelerator engines, and these take pieces of code that don’t run on the CPU cores; they run on these accelerators, just like we’ve fallen in love with GPUs that do acceleration. So these do anything from accelerating data streaming to AI, to analytics, to load balancing, to vRAN, QuickAssist for encryption and decryption, crypto acceleration. So a lot of these different sub-components come together, not just for AI like, let’s say, NVIDIA does, but for a lot of other types of capabilities.

I think the second big picture here is that it’s not only all about acceleration, it’s not at all about the CPU. Intel in their announcement did not talk about the CPU a single time, which is very different from, let’s say, what Ampere or AMD has been talking about. It’s a positioning move, and I think the degree of success, Daniel, is going to come down to, A, the software that can take advantage of that acceleration, customers wanting to use the software that uses those optimizations, and a heck of a lot of sales and marketing. They’re not going to do this on raw CPU performance. They’re not going to do this on cost. So that’s where they’re headed, which I think is a very valid strategy.

Now, they had the who’s who show up on their stage, which is an indication of the type of support they’re getting. Heck, who was the first person, non-Intel person who was on their virtual stage? Our year one Six Five Summit keynote speaker, Michael S. Dell. You had Antonio Neri. You had YY from Lenovo. You had Jensen. Hey, Dave Brown from AWS, another Six Five guest. Arvind Krishna was on there.

Daniel Newman: I think they’ve all been Six Five guests.

Patrick Moorhead: They have, except for … Well, actually, YY has not, but the other four, yes. Raghu was there from VMware. So Intel rolled out the digital red carpet. I think that’s really a plus, but a word of caution. I wouldn’t directly relate what people are saying behind the scenes to the big veeps that go on stage, but they do know that Intel is going to continue to be a major force. They have the dominant market share in server parts today, between 70% and 80%, depending on who’s counting.

Guess what? With the next generation, they no longer have to apologize for what node they’re on. First of all, they’re going to be on their second generation of distributed architecture, and they’re going to be on a much more competitive node, which means the amount of area they can devote to certain subsystems can go up while keeping the chip the same size. So I’m optimistic. We’ll see.

Daniel Newman: Yeah. So you hit it on the head. I mean, let bygones be bygones. Intel was going to be late. Nobody’s surprised by this. It’s here. The future is really what Pat Gelsinger and the team can control. They’re very ambitious. Was it four and three?

Patrick Moorhead: Five and four, baby. By the way, I never get that right, Daniel.

Daniel Newman: Five and four.

Patrick Moorhead: Five nodes in four years, and Pat affirmed-

Daniel Newman: One more node than years, meaning it’s a really consolidated, compressed timeline, but something that, if Pat can get it done, could get Intel back into the driver’s seat. I think what you called out deserves a double click, and that’s that Intel is alluding, to some extent, that some of the traditional computing and workloads on these servers are becoming table stakes, that in this generation they can all do it, meaning the versions built on Arm, the versions being built by AMD and, of course, the versions being built by Intel, and they’re really leaning into what accelerated computing is going to be.

Futurum Research analyst Ron Westfall did a really good breakdown on this, and when he came back and asked, “Hey, what was special about this?” it was really just that. This is all about innovation, it’s all about accelerated compute, and that’s what Intel is focused on. You look down the list: AMX, DSA, DLB, QAT, AVX. Now, again, nobody knows what that means.

Patrick Moorhead: I love it when you talk like that.

Daniel Newman: I knew you would like it. Now, the other 93% of listeners had no idea what I’m talking about, but you’ve got Advanced Matrix Extensions, you’ve got Data Streaming Accelerators, you’ve got Dynamic Load Balancing. These are all things that make servers work better. This is accelerating workloads in ways that are going to really make people’s day-to-day interactions with software better.

So Intel is saying, “Look, some of the things that used to be benchmarks, where everybody would run numbers next to each other, are becoming table stakes. Let’s talk about the things that we’re building into our next-generation technology that are going to make your systems work better.” For Intel, my opinion is it’s all about keeping the customers they have. For the last few years, it’s been a bit of shedding market share as delays have crept in and opened the door. You’ve got to give credit to AMD. You’ve got to give credit to Arm for enabling new companies to enter the server market, but at the same time, Intel has given up market share because it hasn’t been able to compete.

So now, the question is whether, with products that are more competitive, they can hold on to their existing customers. They still have, and correct me, Pat, I know you keep tabs on this too, around 80% of the server market share.

Patrick Moorhead: That’s right, 78%. That’s right.

Daniel Newman: So it’s still a very good number. If you and I had 80% of the analyst market, we’d be doing very well. Sometimes I think people forget that Intel is still doing very well. My take, though, is that the five-in-four plan and an incredible discipline in winning and keeping its existing customers for as long as possible are going to be the critical things. So it was good, of course, to get the who’s who of OEMs and the who’s who of cloud providers up on that stage.

We all know that all those companies are hedging more and more and they’re going to continue to hedge, but if Intel gives them products that perform, Intel can tap into its long-term and deep relationships in pretty much the CIO offices and cloud companies all over the planet to win more business and keep more workloads. Let’s remember, computing is going to grow. There’s going to be more demand and it’s not going to change anytime soon. So congratulations, Intel. It’s one checkbox down, many, many more to go.

Let’s jump to the next topic. Speaking of servers and data center and results, I mean, almost on the polar opposite of what Intel’s been dealing with over the last few years, TSMC has been an absolute rocket ship. Now, look, you and I have been more than outspoken about our concerns about TSMC. The concerns have very little to do with the company’s performance and mostly to do with the dependency that we’ve created here in the United States, and now around the world, on TSMC’s leading edge process capabilities and manufacturing that takes place in Taiwan. And Taiwan, Pat, it hasn’t quite been a hundred years, but it was, what, around 1949 when China said that in a hundred years they would come back.

Now, all kidding aside, China builds their economy with a thousand years in mind. We build ours with the next election or special election in mind. So we make very short-term decisions. They tend to make longer-term decisions. That’s not good or bad, but the problem with Taiwan being so important to so many of our leading edge chip design companies is that if something were to happen, and if China did decide in less than a hundred years to go back and lay claim to what they believe is theirs, we would have a lot of problems, like no iPhones and no accelerated computing chips from any of our fabless designers.

This is Intel’s opportunity, by the way: to play on policy to drive more demand and more interest in building more leading edge chips in the US, like they plan to do in Columbus. All right. Anyways, long story to get to earnings. The company crushed it, I mean, absolutely crushed it, and I saw you went on CNBC. Good job, Pat.

Patrick Moorhead: No. First of all, I want to thank you for leaving me a slot on there.

Daniel Newman: On CNBC?

Patrick Moorhead: Yeah.

Daniel Newman: No, I just talked about the metaverse. That’s all. That’s what they have me come talk about. They have you talk about real things and they bring me in to talk about things that’ll never actually mature into actual marketable goods. Kidding.

So quickly touching on it: huge growth, huge EPS, 76% earnings-per-share growth. Now, that kind of growth probably comes from a combination of locked-in contracts and improvements in the supply chain, being able to get goods. The numbers were outstanding. Now, they did forecast a more conservative future. So a really great result, if I read it right, and by the way, best earnings release ever. It’s two paragraphs and that’s it, literally probably 300 words. A lot of these earnings releases are like books.

HPC, IoT, automotive, all higher margin parts of the business. Now, what does that also mean, though? A lot of the revenue and growth is not coming from devices, and that’s an area where they’ve done very well in some recent quarters, but they are guiding conservatively for the future. They’re not as confident, and they were pretty outspoken about that, but they are very confident that they have the leading edge technology, and that as the market corrects, everyone will be coming back to them. The market’s open for interpretation on whether or not there’s an opportunity there, Pat, but all in all, given the fact that literally all we hear is doom and gloom, especially in semiconductors, this was a bit of a bright spot.

Patrick Moorhead: It’s interesting, and this is what I love about our relationship and the media stuff we do, we don’t always have to be in the mutual admiration and agreement society here. So what’s funny is the MarketWatch headline, which was perfect, like, “TSMC warns, misses this and that, and stock goes up,” and this is just part of what I love about it.

One of the things I think investors were really happy about is that they thought Q2 could be the bottom of the semiconductor trough, because we’re still waiting. The bubble has popped, at least short term. People aren’t buying as many smartphones and PCs. There’s also some research from Canalys that says smartphone shipments were down around 16% in the fourth quarter. By the way, PCs had about a 23% swing to the negative as well.

What I think was important, and this is what I talked a lot about TSMC on, was the risks that the company has. People right now are assuming that the company is infallible, but China’s Xi Jinping has been very public that he wants to unify China, and that includes Taiwan. Militarily, I believe China is better off moving sooner. Again, I’m not a military specialist, but what the spooks in DC tell me is that China’s better off invading within the next two years than in the next five, because they don’t have to come up against the United States’ next-generation weapons. It’s China’s latest generation versus, quite frankly, our old generation. What does that mean to TSMC investors if China invades Taiwan?

Then there’s, I think, the even bigger challenge, which is Intel in 2024 to 2025. Intel’s betting the farm that it can become a competitive foundry and compete head-to-head with the company. I urge everybody not to write Intel off on this. I believe that just like the US mandated that all defense chips have to be fabbed in the United States, that’s what’s called DoD RAMP-C, which NVIDIA is part of with Intel, it’s going to mandate that all critical infrastructure chips be fabbed in the US. That’s a belief that I have. So that would extend to healthcare, that would extend to the carriers, that would extend to banking, and every chip that is bought there.

Intel’s likely to receive the lion’s share of the subsidies, even though TSMC is building US-based fabs in Arizona, quite frankly because TSMC doesn’t know how to play that game. They lost the plot. They don’t know how to really talk the talk in DC. To be honest, they don’t have a lot of friends in the United States who even know how to educate them.

The other thing, a little fun fact for you, Dan, and the listeners: TSMC is not building its leading edge products here, but rather technologies that will be one node behind what Intel is planning on doing. All Intel has to do, in my opinion, is get a toehold with US-based Amazon on the packaging side, Qualcomm, which has endorsed Intel 20A, and NVIDIA, which has already committed to the DoD RAMP-C using Intel 18A. All that could spell trouble for TSMC. This is not 10 years away. This could be two or three years away. So that is something that I think investors need to be looking at on TSMC, and maybe even an opportunity with Intel.

Daniel Newman: So reminders, no investment advice here. So don’t actually listen to what Pat said. No, listen, just don’t do what he said. I want to boomerang really quickly because I’ve got to bifurcate what I said, and because you alluded that we have a difference of opinion, and maybe we do, but when it comes to TSMC and you look at this quarter’s earnings, my admiration is there, because against the backdrop of the world supposedly ending in terms of semiconductor boom and bust, and you look at Micron’s recent performance and some of the others, having a 76% earnings increase and a 43% revenue increase in this market is pretty miraculous, considering a year ago we were still in what was considered that wild growth period. So this is just the bottom line, not opinion. This is a fact. It was a good quarter.

Everything you said about Intel, I agree with. If you ask me, and I have it on the record multiple times, a lot of people are focused on Intel’s five-in-four and the IDM strategy and the process. I’ve said the biggest opportunity nobody’s talking about with Intel is that IFS could actually become a massive business for Intel. There’s a lot of work to be done, but a lot hinges on whether TSMC brings leading edge here or not.

Now, first of all, they’re not, so that’s a door opener at this point. Second of all, they’re not an American company, and with even Morris Chang, the founder of TSMC, going on the record and saying globalization is all but dead, this basically means that policymakers are going to have a fiduciary responsibility to their constituency to deliver leading edge manufacturing in the United States, ideally by a US company and its allies.

Now, again, Taiwan is an ally, but only free Taiwan, not the Taiwan that China lays claim to. My comment about a hundred years is just that China tends to do what they say they’re going to do. So they said they’re going to come back in a hundred years. Don’t be surprised if, by 2049, if it hasn’t happened yet, China comes back and knocks on the door and says, “Hey, guess what? China’s here.”

Anyway, I think we largely agree: big opportunity for Intel. I do think TSMC is performing extremely well. That’s the ground truth that we always call earnings. You can’t go against that, but the opportunity here in the US, and the policy and the opportunities that creates, are substantial. So all right, let’s keep going, Pat. We’ve got topic number four. Our friend Antonio Neri and crew have made another acquisition. This time it’s Pachyderm.

Patrick Moorhead: Yeah. So if you haven’t been watching for the last two years, HPE is undergoing a huge transformation from selling boxes to offering essentially cloud and data services on-prem. They’ve made a lot of tuck-in acquisitions. They had built a lot of their own software. If last quarter’s earnings are any indication of whether it’s working, it is. HPE had one of the biggest quarters it has had in the past five years. I think you and I have both talked to Antonio about this.

So what is Pachyderm? We’ve seen HPE integrate companies like BlueData and a lot of AI plays, a lot of them having to do with high performance computing, that were very leverageable to any type of big data application. This one is about AI workflows, whether it’s version control, autoscaling, or deduplication, and with cloud and on-prem capabilities, it’s the entire AI pipeline that, quite frankly, we’ve seen from the likes of Google, from AWS, and from Azure, essentially a one-stop shop from cleansing the data all the way out to publishing the machine learning inference code to the device. So again, it’s going to be a short analysis here, but this is everything you would have expected HPE to do. First you saw moves in data, and now you’re seeing moves in AI.

Daniel Newman: Yeah, I think that’s punchy, Pat. I mean, look, the data pipeline is complex, and as you’re seeing things like Intel’s decision to focus in on accelerated computing rather than just traditional compute metrics, you’re going to see companies have to move beyond traditional infrastructure and infrastructure as a service. I mean, remember, HPE is really going to the industry and the enterprise right now and saying, to companies that feel obligated to move their workloads to the cloud, there’s another way to get the experience of the cloud without necessarily going to the public cloud for all things.

I think there’s been no shortage of chest pounding here by you and me that we’ve called hybrid cloud, and HPE has been a leader in terms of buying into the future of hybrid cloud and understanding that, with its huge customer base, there is an approach to deliver as a service without, like I said, everything going to the public cloud.

So in order to do this, though, it’s going to be services driven. If you look at what the public cloud companies have done extremely well, it’s been having comprehensive data services. Look at AWS’s data pipeline and machine learning tools: they have everything from tools for the most technical data scientists all the way to tools for the complete novice who wants to play data scientist in the public cloud. A lot of these private cloud offerings have lacked data pipeline services at that scale. They don’t have as comprehensive a set of services, which has been a big motivator for companies and enterprises to move more workloads and more data to the public cloud.

Again, we’re public cloud believers here, I am, and I think, Pat, I can speak for you, but we also know the volumes of data and what companies are trying to do with many of their core data systems of record and edge data. Egress is expensive; moving everything to and from the cloud can be expensive. This is only one use case, but my point is that having more and more integrated machine learning pipelines, platforms, things that customers can utilize while keeping their data on-prem and utilizing something like GreenLake, is going to be important.

You and I have looked at the crawls. HPE’s had a very comprehensive set of acquisitions and expansions in services. GreenLake is, at this moment, the most comprehensive of the on-prem cloud offerings that I can see. I mean, others are going to be investing and catching up, and you can count on the Dell APEXes, the Lenovo TruScales, and Cisco Plus, they’re all going to be adding to these services, but I like what HPE is doing. I like that they’re focusing on data. I like that they’re focusing on open source. They focused in on Kubernetes. They focused in on containers. So these are the things that they’re going to have to do to be able to compete at scale.

I’ll be candid, I’m not super familiar with Pachyderm itself and its software, but I’m very aware of what it’s going to be doing. So I think, in terms of machine learning pipelines and being able to scale the work companies are doing with these data sets, it was a valuable acquisition and it fits, which is something you and I always have to really look at: does it fit? The same question you ask me when I buy companies: “Are you sure that fits with what you do?”

For HPE, it’s very important that they’re not just buying, but that they’re buying right, because in fairness, no matter how much they expand their offering in things like data, AI, machine learning, hyperconverged, storage, and elsewhere, it’s going to be hard to keep up with the growth of services at companies like Azure, Google, and AWS, which are going to be continuously adding pages and pages of both homegrown solutions and, of course, companies that they’re able to buy just because of their massive scale and size. So, good acquisition, Pat, a good addition to the portfolio.

Patrick Moorhead: Hey, one thing I noticed in the press release, it had two benefits, data lineage and data versioning. Who does that remind you of? Cloudera, maybe?

Daniel Newman: Well, that could be an acquisition for the future, Pat.

Patrick Moorhead: I always thought Cloudera and HP would make a great combination, but-

Daniel Newman: Well, they’re private again, so we’ll see what happens, but yeah, we’ll be watching that quite a bit. Pat, maybe their cloud sucks.

Patrick Moorhead: Maybe. I’ve heard that spoken by some really smart people.

Daniel Newman: Well, smarter people have gone on stage and danced in circles and, “Woo, woo, woo, woo, woo, woo, woo, woo,” to get the crowd up and going, and I-

Patrick Moorhead: I’m so dumb.

Daniel Newman: I got it on video, everybody. If you send me a message and a payment, I will send you a video of Pat spinning up a crowd and he did a good job.

Patrick Moorhead: By the way, just to let everybody know, Dan is referring to a keynote that I did at Cloudera’s signature event in New York City where I did dance on stage and set my hair on fire to get people’s attention. I did say “your cloud sucks” to the audience. I did. I really did. I said it.

Daniel Newman: I even woke up to watch the rest after that.

Patrick Moorhead: I appreciate that.

Daniel Newman: That’s the job of a keynote speaker. So we were hinting at it with Pachyderm and HPE, but in case anybody’s … First of all, if you haven’t listened to our show, as I suggested, you’ve probably been under a rock, but if you haven’t heard about OpenAI and GPT at this point, and if you haven’t tried it by this point, you’re probably sleeping under that same rock. So it’s been a big week in the Microsoft world as it relates to ChatGPT. It’s been basically all the rage.

The first and foremost has been the 10 billion dollar proposed investment, which I think would make Microsoft a 49% stakeholder in OpenAI, some monumental valuation, but of course, anybody that’s tried it has realized that, “Holy crow! We have technology now that actually does the things that we’ve long been threatening, like writing our essays for us, and doing it in a way that’s actually meaningful.”

By the way, something our friends on the All-In podcast suggested, I don’t want to take credit for this, but there’s the possibility to deliver a technology that could actually disrupt search as we know it in a meaningful way. Anybody that’s followed the Bing-Google rivalry knows that Microsoft has never quite figured out how to play the game that Google has in search. Now, there are two themes here, and the main theme is, if you look at Microsoft’s portfolio from devices to Azure, to enterprise apps, to gaming, the ability to take a technology like OpenAI and ChatGPT and embed it across every part of your portfolio, productivity, collaboration, Dynamics, applications, doing some built-in on-device AI on Surface, what a powerhouse of technology and tools that the company could use to absolutely differentiate itself from every other platform, including Apple. Siri, you still suck, mostly. So very interesting.

I mean, look, this isn’t the first thing. The partnership between Microsoft and OpenAI goes back, I think, three or four years. The company’s been very busy working on things like conversational AI. They bought GitHub, which is obviously a platform for developers and coders, and that gave the company more leverage to implement and utilize code that could add AI, but basically now, in January of ’23, the company is announcing general availability of the Azure OpenAI Service, with ChatGPT coming soon. So effectively, what you’re seeing is that OpenAI and ChatGPT are going to be able to be overlaid on everything that Microsoft builds. What an absolute powerhouse.

Secondarily, like I said, this to me, Pat, was the number one thing, and I’m not going to take too much out of this one. We could talk forever. I’m just going to say, the second I heard this idea of Microsoft being able to use OpenAI and ChatGPT to enable Bing to finally offer search that could compete with Google, it absolutely blew my mind, because only about one out of a hundred times that I’m searching something on Google am I looking to buy something. Yet the entire architecture of search has been built to basically enable someone to sell you something.

So when I search for more information about a company’s new product, I get ads fed to me, right? I get something that they want me to click on because they want to create revenue. But when you use OpenAI and ChatGPT and you put in a question like, “Hey, how do Intel’s gen three and gen two server chips compare?” you would actually get a somewhat sophisticated breakdown of all the material on the internet that’s been fed to this thing, something that could give you a meaningful answer.

I mean, you have questions being asked that could write history papers, doctoral theses. You have things that could be answering … I mean, just yesterday, Pat, as a little example here, I ran a search summarizing the Google and Microsoft announcements at NRF. So this could interestingly feed what we’re going to talk about next. It said to me, I’ll just read this out, “At NRF 2023, Microsoft and Google made product announcements that’ll have an impact on the retail industry. Microsoft announced its new Azure AI for retail platform, which helps retailers create personalized shopping experiences for consumers. The platform uses AI to analyze consumer data and predict what shoppers want to buy in order to give them better recommendations. Microsoft’s new retail strategy is based on three pillars,” I’m going to end after this but, “personalization, convenience, and security. The company also introduced a new suite of tools called Microsoft 365 for retail, which includes features like inventory management and analytics software.”

I mean, my gosh, Pat, one question in and we got something back that, generally speaking, an analyst or someone on our team would spend some time researching, reviewing, and being briefed on to get our arms around. The implications of this are massive. The fact that Microsoft is getting there first is going to absolutely put the industry on notice, Amazon, Google, Apple. They’re all going to have to find their version of this killer app to try to keep up if Microsoft is effectively able to execute with this technology.

Patrick Moorhead: So I feel like I’ve got a different perspective, not different from yours, but I’m looking at this LLM opportunity through a business lens, and that business perspective is twofold. I’ve brought this up before on here, but I think it’s important. First off, if you’re going to try to do a knockoff of Google search, you better be prepared for sites to block your scrapers if you don’t link back to where you found the data, and be prepared to be sued if you just publicly rip off the information.

So we’ve seen this time and time again. The reason that you allow crawlers to come and index your site and don’t block Google is because you want people to find you and you want them to link back to your place. I’m really interested to see what it does on data sets that aren’t public. So for instance, law books or something like that, something that has a copyright to it. That’s where I think we could see some serious game changing.

The second question I have is on cost. I’m very interested to see, by the way, the L in LLM is large, and large means expensive. I mean, with hundreds or thousands of GPUs that have to be intelligently working at the same time, the amount of resources it takes depends on how difficult the question you ask it is, but we don’t yet really know the cost of a transaction. I’ll call it a transaction, or a search. Google search is very efficient in the way that they’ve done it. So I’m not certain about the cost and the longevity of it. Meaning, do the costs go down over time, or is this going to be just the Rolls-Royce of capabilities?

I also question, is this really a capability that Google doesn’t have? I think we’ve heard some rumblings that they have been working on a project for close to a decade, that-

Daniel Newman: DeepMind.

Patrick Moorhead: Yeah, exactly. It operates a little bit differently than OpenAI’s, but it is an LLM. By the way, nothing I’m saying takes anything away from Microsoft and Azure, but my question about that is, what is the long-term competitive advantage that Azure has using ChatGPT? Like you said, I agree, wouldn’t it be interesting if Microsoft connected some things on the operating system and the PC platform and all of the AI that they’re driving to the PC desktop? That’s something that competitively Apple just can’t do, and they’ve sucked at it for eons. Don’t get me wrong, Apple’s good at device-level AI, but it’s horrible as it relates to the cloud.

So I think ChatGPT is cool, even though it got the companies that I’ve worked for wrong when I queried it. That doesn’t mean I’m going to throw it out, but I still have questions about its cost and the unassailable moat it supposedly has. Congratulations to Azure on being first with it. I think when it starts crunching on some private data sources, things will really get interesting.

Daniel Newman: Well, I also think about the ability for companies to bring in proprietary data and then use any of these, because obviously, I was speaking a lot about the scraping of public information. You brought up a great point about the legal ramifications, the copyright issues of people using the data. Obviously, search has had a way of getting around copyright for a long time in terms of sharing and making it available, and that’s largely because people want the data to be found online.

So there are going to be a lot of things to work out, but with large language models, when you have companies that have tons and tons of proprietary customer data, tons and tons of experiential data, of data sets from experiments, that stuff’s going to be unique when you can combine it with the public domain to provide differentiation. But you can’t take it away, Pat. This thing can literally write a PhD essay for you, which doesn’t make it not cheating, but the real question that I believe needs to be asked is, what are the ramifications for the future when you don’t actually need to learn these things anymore?

You can’t leave the institution of higher ed in place as it is when someone can just type a question and have it write a comprehensive dissertation. This is the beginning of a fundamental change in human behavior and humanity, in the way we will work, the way we will study, the way we will learn. It’s been happening right under our noses, but this may be one of those moments where it becomes incredibly evident that everybody is going to be affected by it. Pat, we got one more. Man, it’s good to be back. Love this show.

Patrick Moorhead: It’s good to be back.

Daniel Newman: I don’t want to work anymore. Can I just do this?

Patrick Moorhead: Hopefully.

Daniel Newman: Okay. Let’s just do this. Let’s just do this. Anyway, let’s talk about NRF. Now, we might talk a little bit more about it on Friday because I did spend some time there and it was pretty cool, but Google had some really interesting announcements, Pat, that caught your eye from afar. So we went to Microsoft. Let’s talk Google.

Patrick Moorhead: Yeah. We’re going to talk more about NRF this Friday. So first of all, a little bit of a backdrop here. Google Cloud was a late arrival to the public cloud scene, after Microsoft and after AWS, but they are rapidly growing on a percentage basis. I would say that, on average, I talk to a major enterprise once a week, and a lot of the time they might use AWS or Azure as their primary cloud, but then, more often than not, they will use a Google data or AI service, and that’s where Google is doing a fine job landing and expanding.

I think if you look at the needs of retail, first of all, retail is a little bit spooked about AWS in that Amazon is a huge retailer and they own AWS, but I see Google really getting a lot of traction there because they’re able to leverage their data and AI capabilities to solve real problems, whether that’s related to the frontline, getting people to work in frontline roles or enabling them to work smarter, or allowing the people at HQ to make smarter decisions on what to provide, what to do when you get a stock-out, and how not to have a stock-out in the first place through really great and amazing forecasting. That’s the backdrop behind, I would say, three announcements that Google brought to the table.

One leveraged Vertex AI Vision. Another updated Discovery AI, and the third was its Recommendations AI. You can imagine shelf-checking AI, right? It’s very simple, right? It used to be that you would send an army of people down an aisleway, and I don’t know if you ever worked in a grocery store, Daniel, with little scanner guns to count how many boxes of Cheerios were on the shelf. When I worked in a factory, we did that in our inventory system. It’s called a cycle count. You go out there and do this. Why not have a robot or a smartphone that you can wave in front of the aisle to tell you how many boxes of Cheerios there are? I mean, it makes perfect, perfect sense. It can use cameras that are in the ceiling. It can use an associate’s mobile phone or even a store-roaming robot.

The second is something that we used to call clicks to bricks or bricks to clicks, which is transforming the digital window shopping experience, essentially making the search experience for products and discovery process so good that you’re more than likely to buy that product.

The third is more personalized search and browsing results with machine learning. So imagine Google optimizing search. That gives you a flavor. Daniel, I’m sure that you got hit in the face with all of these demos when you were there for, I don’t know, 18 hours or 24 hours, but net-net, it’s Google transforming the retail environment, either onsite or in the cloud and on the web, using technologies that they’re pretty good at, really good at.

Daniel Newman: Yeah, you made a good point. More broadly speaking, Google is the ultimate example of getting in through multi-cloud value. Some of their data tools and services are just standout. For the longest time, they found themselves side by side with the others while getting a fraction of the overall cloud spend, but their data and AI technologies have led to a lot of logos that are running on Google. Now, we’ll talk more about NRF on Friday, but NRF is legitimately a tech show.

You walk in and it’s Microsoft, SAP, Google, IBM, Cisco, Dell. I mean, you just get hit in the face by tech companies. It’s a tech show. The future of retail is technology. There are a few others, the Zebras. By the way, they call themselves Zebra Technologies. Didn’t you used to work for NCR some thousand years ago, Pat?

Patrick Moorhead: Yeah, a long time ago.

Daniel Newman: They still have a booth. A pretty big one, actually. It’s still a pretty big thing for them. The net-net of it is that the future of retail is technology driven. Remember the just walk out technology from Amazon, the frictionless shopping? A lot of what I came to recognize this year is that we’re actually going to see the analytics of the in-store experience start to match the analytics of the e-commerce experience. This is the big thing that’s been missing. Everyone likes e-commerce because of the data: you can do eye tracking and screen behavior and time on page. Well, guess what? With a camera and a good AI system, you can actually do the exact same thing with a person in a store. A little bit weird to know you’re being watched, but if you don’t know you’re being watched by now, you’re a little bit weird. You need to pay attention to what’s going on in this world.

I love the shelf-checking technology, Pat. I mean, look, I’m sorry for the frontline worker because this is going to be bad news for you, but if your career is counting the number of chicken noodle soup cans, there is going to be a time in the future when a sensor and a camera are going to be able to do that better, faster, and in more real time than you. This is where companies like Google come in. It’s not as easy as everyone thinks. Building that kind of technology is not going to be cut and dried, but Google is going to become an enabler, with cloud services that retailers can apply, and they’re going to be able to start doing these things with very little friction. In fact, getting your inventory right is everything in this business.

We did a massive study on resiliency in retail and things like supply chain. There were some unique supply chain issues these last few years, but longer term, the ability to get your logistics right has always been a massive differentiator in retail. Walmart for the longest time was given props for being this logistics machine, for the ability to really understand its inventory, maximize and minimize its cost centers, its pricing, its optimization, what lands where. Well, AI puts a lot of smaller retailers on the same plane as a company like Walmart with its massive data set.

So Pat, I mean, even in this press release, Google cited a NielsenIQ number on on-shelf availability: empty shelves cost US retailers 82 billion dollars in lost sales. 82 billion could fix a lot of balance sheets. It could fix a lot of retail stores. It could enable a lot of growth and scale.

Last thing I’ll say is people want retail experiences. People want to shop. If you ever have a doubt, everyone out there that’s a fan of the show, if you ever doubt the power of a retail experience, follow Pat to Aspen one of these times when he decides to take his family Christmas shopping and you will see the power of retail experiences, but all joking aside, Pat, a lot of fun, big show this year. I think we’re going to have a few more topics to cover this week. I didn’t get a laugh out of you about the retail experience. You reading your email? What are you doing over there?

Patrick Moorhead: Yeah, I’m doing email and tweeting.

Daniel Newman: Self-conscious, but, no, I mean, come on, man. I think when you walk into that little strip, the shopping strip in Aspen, I actually think doves come up.

Patrick Moorhead: No, I mean, dude, I just get into a different personality. Now, it might have something to do with the fact that I’ve slugged down a bottle of champagne with my family or two before I go shopping, but I don’t know. It’s fun.

Daniel Newman: I don’t know, but I see the doves. I personally want to go on one of those trips and just maybe be a member of the Moorhead clan for one day.

Patrick Moorhead: Yeah, be part of the family for two hours.

Daniel Newman: I’m in. That two hours, I am in. Well, hey, buddy, we did it. We did it. We made it happen, 58 minutes. So this was a Six Ten, everybody, but we really do appreciate you joining us for the Six Five this week. We covered a lot. We talked NOTAM, Intel. We talked TSMC, HPE, Azure, and Google Cloud. If you like what you heard, hit that subscribe button. We’d love to have you in our community. Share our pod, subscribe to us. We’re pretty much on every one of those networks, Spotify, Apple, and all of the others.

For this episode though, for Patrick and myself, I think it’s time we say goodbye. We will be back on Friday. So clear your calendars, 9:00 AM to 10:00 AM Central every week, same time, but we do offer it on demand. So if you can’t be with us in real time, we get it, but we love you, we appreciate you. We’ll see you all soon. Bye-bye now.

Patrick Moorhead: Bye, baby. Take care.


Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
