
We are Live: Talking Lenovo, Luminar, Marvell, AWS and NVIDIA – The Six Five Webcast

On this week’s episode of The Six Five, hosts Daniel Newman and Patrick Moorhead get together to discuss:

  1. Lenovo Opens Security Center in Israel
  2. Luminar Multibillion Dollar Mercedes Deal
  3. Marvell and AWS Strategic Alignment Across Cloud EDA and Chips
  4. AWS Aligns with Hugging Face for Generative AI
  5. Lenovo Earnings
  6. NVIDIA Earnings

For a deeper look into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.

Watch the episode here:

Listen to the episode on your favorite streaming platform:

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.

Transcript:

Daniel Newman: Hey, everybody. Welcome back to another episode of the Six Five Podcast. Daniel Newman here, episode 157, joining you live from España. ¡Hola mis amigos! Aquí para Mobile World Congress. I’m getting my practice in, although nobody needs to know how to speak Spanish here because unlike America where we’re just dumb, in this part of the world, most of the people seem to actually know more than one language. But Pat-

Patrick Moorhead: How about that?

Daniel Newman: …I made it. I ripped through the airport, got in a cab, got to the hotel, checked in, got straight to my desk, tuned up the laptop because it is time for the Six Five. How you doing, buddy?

Patrick Moorhead: Buddy, I can’t believe your commitment. I would’ve found 12 different ways to get out of this. Dan literally just checked into his hotel, that’s the reason we’re starting a few minutes after 9:00 Central here. But hats off to you, dude. I really appreciate you do that, and hopefully the audience does as well.

Daniel Newman: Yeah, it’s great. If you haven’t watched the Six Five before, I’d ask you why. But once you get over that, I would say this show is for information and entertainment purposes only. And while we will be talking about publicly traded companies and occasionally about their earnings, please don’t take anything we say here on the show as investment advice.

We got a great show today, six topics. We’re going to talk about a couple of companies a couple of times today; it just sort of worked out that way. We’re going to have a little double dip on Lenovo about a new security center they’re opening up in the Middle East and then also talk about their earnings. And then we’ve got an update from Luminar, a big announcement ahead of their investor day coming up next week. Marvell and AWS have a tie-up that they’re announcing. And AWS also has maybe an answer, at least a partial answer, to what Google and Microsoft are doing with large language models, and we’ll talk about that as well. So we’ll wrap up with NVIDIA earnings. I already did the rounds on that one, so I’m just going to cut to the clips of all of my segments so that I don’t have to do that again.

But everybody out there, really excited to have you here, thanks for tuning in. Pat, let’s get this show on the road. We’re going to have to do a real Six Five, six topics, five-ish minutes each, because the typical six 10 won’t work because we got some stops here at the end of the day. So number one, Lenovo opens a security center in Israel. Pat, your show.

Patrick Moorhead: Yeah, so first off, let me give a little bit of background on this. Lenovo, because it’s a company based out of China, gets more scrutiny than other US-based and western European-based companies on security. It was interesting, I mistakenly thought that when the company came in it would lose pretty much all its public, energy, and government contracts for PCs and servers. But to Lenovo’s credit, what they did is for nearly a decade they underwent some of the highest scrutiny that any company can provide to US and Western European authorities. They went through, I would say, a 5X more stringent security process.

Now, what I have recently learned, that people at Lenovo aren’t trumpeting, is these annual, I’ll call them, security rectal exams no longer have to happen. Again, hats off to the company. One of the reasons that they’re in this very good position, and we’re going to talk about their earnings a little bit later, is because they do invest hardcore into security. The company just established an innovation center in Israel at the Ben-Gurion University of the Negev, and I think this is a pretty big deal as the next step in what they’re doing.

There is something about Israel and security companies, I mean, literally they’re all over like kudzu. It makes sense if you look at it: first off, on all sides they are surrounded by people who want to kill them, and even on the west coast is the Med, so I guess that would be on all four sides. What that does is create an environment where being really good at cybersecurity is an important thing, not just for commerce, but for the security of their family. So I’ve always been just incredibly impressed at the companies that come out of there. The other thing is they know how to keep secrets, where you see some regions leaking like sieves. I attribute that a lot to the required military service, which I think is four years for every man and woman. But net net, the center is really focused on the next generation, which is about this zero trust security architecture that we’ve talked about a lot and also below-the-operating-system security. I do believe, having talked to the principals all the way up to the CEO level at Lenovo, that it does believe in security by design. I’m not saying that other companies don’t, I just want to reinforce that I believe that Lenovo does. I think this is a good next step in the evolution of its security capabilities, and I think they did it in an incredible country that is noted for cutting-edge security innovation.

Daniel Newman: Yeah, I think you hit most of this one, Pat. What I would add that’s maybe worth noting or just double clicking on is that Lenovo, having its shares traded on the Hang Seng as a Chinese company, has a very big and robust operation in the US, but obviously that has always made it more susceptible to scrutiny. The company is putting extra dollars in investment to both publicly market and then of course build zero trust architectures, build security hardening into their products, and then be transparent with these Western European, US, and Middle East authorities, like Israel’s, to show that they’re operating in good faith. There’s a significant demarcation between a Lenovo and a Huawei for instance, where there’s been a lot of-

Patrick Moorhead: Dear God.

Daniel Newman: What’s that?

Patrick Moorhead: No, you said Huawei, and I’m like, “Man, they totally screwed it up.”

Daniel Newman: Yeah, they absolutely did. That’s what I’m saying though, is sometimes companies from similar regions will get grouped together. Lenovo’s putting in extra cycles to make sure that that doesn’t happen. So good move, good strategy. It’s going to be important for the long-term growth that they continue to be able to get more and more into these larger government, FedRAMP opportunities, etc. So let’s move on, Pat. Second topic, we’re going to circle back to an upgrade.

Mercedes-Benz has been busy making headlines with chip makers that are focused in the auto space. A few weeks ago, Pat, we talked about Mercedes and Qualcomm’s tie-up for F1 after a year with Ferrari, and now you’ve got a bit more practical go-to-market announcement here with Luminar. Luminar, which we’ve talked a lot about, is putting their multi-sensor technology into production with a focus on Lidar and has made big announcements, Pat, with Polaris, with Volvo. We were at CES doing video in the booth with Austin Russell, the CEO, and his team. One of the things is we’re really starting to see what’s been an idea become a capability: a more secure, safer way to move people about using the third sensor. You’ve got vision and radar, and you’ve got to add Lidar. And so, Mercedes-Benz, being an absolute staple brand, is going the opposite route of Tesla, which has leaned in completely on vision, and is saying they’re going to put Lidar into a large mix of the portfolio, and they’re going to do this by the middle of the decade. This is short term, and for Luminar, a company that’s starting to generate revenue but really only a small percentage of its potential in terms of design win dollars, an announcement like this is just tremendous for the company. It actually saw the stock go up 25% the day this announcement came out.

Some of the things that are specific: this is really the incorporation of L3, Level 3 autonomy, technologies. Some of this is going to enable drivers to be hands off the wheel at up to about 80 miles an hour, 130 kilometers an hour. This is something that’s going to be enabled by Lidar. If anybody’s seen, Pat and I have shared a number of times, some of the different tests of Lidar versus traditional vision as the sensor technology for safety in the vehicle, Lidar seems to add a tremendous element of safety and driver security that vision alone isn’t accomplishing.

That’s what I keep leaning into, is at some point these car companies are going to have to be accountable for delivering safe vehicles when there’s technology available. This is getting down to the point where it can be done at… I’ve heard numbers of around $1,000 a vehicle. On a premium car like a Mercedes or a Volvo or the Polestar, it’s starting to make a lot of sense, and this is something that Luminar should be excited about because it’s in the US now and on an international basis, these are big brands. And of course, you saw they had some announcements of Chinese automakers that are also including their technology. So the volume is starting to ramp. And for those that are invested in Luminar… By the way, the company has its investor meeting coming up I believe next week. You and I won’t be there because we’ll be here at Mobile World. But a lot going on, a lot of momentum for Luminar. I think this announcement, Mercedes being a staple brand with global recognition, and choosing Luminar is quite a coup for the company.

Patrick Moorhead: Yeah, their stock went up 29% in one day. I think it was in the toilet for all the wrong reasons, so it was good to see the company get a little bit more credit for what they’re doing. I do understand the trade-off, viewers and listeners, in that Elon Musk believes that he can accomplish this, as Dan said, through vision, and even certain different types of vision where it can see in the dark. The thesis behind that is also that you minimize the amount of data coming in so you have less to process. When you add vision, radar, and Lidar, there simply is more data to process, and that means you have to have a much bigger compute engine in there to make sense of it, particularly for autonomous driving. But I think what we’ve seen is that even for an L3 or an L2 Plus, that is the case as well.

In 2018, Elon Musk said that his software would be able to do a coast-to-coast trip. We are now in 2023, you do the math, around five years, and it still hasn’t happened. So that thesis of a single sensor type I think is a bust. If nothing else, I think it’s a bust because Tesla is putting radar back into the design. I stick firmly to my thesis that says you cannot have a safe autonomous driving car on vision alone and you do need Lidar. Lidar can see in the dark, Lidar can see far, it can see through fog, and it creates 3D maps along the way that can help the drivers around it. I just love the openness of a multibillion-dollar deal with Mercedes. Dan, you are right on the COGS; I have been told they can sell it to an OEM or an ODM for less than $1,000 right now. And then the next generation, with the shrink, gets it down even further. So congratulations to the folks at Luminar. It’s good to see external recognition of something that I think you and I already knew.

Daniel Newman: Yeah, you absolutely hit the nail on the head on that one, and it’ll be a big moment for Luminar to break through into the mainstream, which it’s been doing over the last few years, which has been a good part of the reason why we continue to talk about that company as a definite player in the space. So let’s talk a little silicon cloud optimization. Pat, let’s talk about Marvell and AWS. I saw a great note on Forbes about Marvell and AWS trading chips for EDA cloud services.

Patrick Moorhead: Marvell and AWS, baby. We’re trading EDA for chips. So a couple mega trends going on here. So first off is, when you have spiky workloads like EDA… For those who don’t know what EDA is, I’m going to ask you… Don’t yawn when I’m talking. Be nice, man. No. EDA is electronic design automation, and that’s essentially the software and services that are used to build, test, and verify IP blocks, chips, and SoCs. What has happened is the two leaders in this, Cadence and Synopsys, have added AI to these. And so we got into a situation where they turned into spiky workloads, right? You could throw 10X the amount of CPUs and GPUs against it and you would get an answer quicker. When it comes to time to market for something like an IP block, a chip, or an SoC, time is money. And also you can reduce the amount of people who are actually working on it, which is a big thing.

We saw that through Cadence with its Verisium platform that says it can improve debug by 10X. So a spiky workload where you 10X it is really good for the public cloud, because you don’t have to have 10X your normal compute like you would if you did it on-prem. It’s like retail, where they have very spiky workloads around the holidays. That’s why a lot of the retailers have moved their burst capability into the public cloud. So that is what EDA is, and Marvell announced that it was moving EDA to AWS’s public cloud. They weren’t clear exactly which instances, which services, which file system, but that’s okay. I’ve got a link in the Forbes article that shows you what they could do.

The second part of the announcement was that for the first time AWS is admitting that Marvell is a big customer. I think you and I both knew that just based on the space that Marvell was in. And when they talk about how well they do in cloud and what they offer, you would probably be in the minority if you are not using a Marvell piece of silicon. Not a surprise to me. Again, whether it’s cloud storage, electro-optics, DPUs, networking, hardware security modules, they’re really a leader in that. They also have something that I think is very unique, and there’s only a few vendors that can provide this: a very flexible business model. You can buy chips from them as a merchant silicon provider. They partner, where they will put your IP in their SoC. Heck, they’ll build an entire custom ASIC, and they compete very heavily with Broadcom on this. And then, heck, they’ll integrate it. You want a merchant solution from them added to a custom ASIC added to partner IP? They can do this as well. So win-win for both companies. I was really impressed to see Dave Brown quoted. He doesn’t quote a lot. We’ve had Dave Brown on the Six Five I think two or three times. You don’t see him giving a lot of support quotes, but we absolutely got a big support quote from him, and people shouldn’t take that too lightly. Massive show of support.

Daniel Newman: Yeah, I think streamlining silicon design processes, of course, are going to be looked upon favorably here. It seemed like it was a little light in terms of the depth of the news and exactly how this is going to set up in AWS’s ecosystem, but I could see the value of the partnership. Marvell wants to sell more chips to AWS and AWS is going to continue to build and optimize silicon. It seems like it’s a really nice tie-up, Pat. Didn’t see anything specifically about the numbers or the expected size of the deal and what it means for each company. To me, like I said, it really reads more like a cooperation, collaboration, a core investment to streamline designing, debugging, verification of IP blocks. You hit most of it for me. It just looks like something that could be done faster, being optimized.

And of course, like I said, it never hurts when you’re a company like Marvell that’s providing technology to many of the world’s largest cloud scale companies and other industries as well to be tied up with the largest public cloud company in the world. So it was a good win for Matt and Rajiv and team. And of course, like you said, Dave Brown tends to be careful about lending his name to anything, except for the Six Five, where he comes on all the time, so it’s indicative of how confident he is in the partnership.

So all right, let’s jump on another quick AWS topic. Pat, because like I said, we’re going to double down a couple of times today. God, the rage-

Patrick Moorhead: By the way, I have 15 extra minutes.

Daniel Newman: Oh, well, that doesn’t mean we should go longer. Our fans only block out the 9:00 to 10:00 hour. Hey, all you out there that have carved out your day around this, I’m really sorry I was late. I promise you, I was sprinting to you.

It’s been a big topic though, large language models, Pat. Of course we’ve seen what’s happened over the last few weeks with Microsoft and of course with Google and Bard, and AWS being the third to come up with a story here. And so, AWS isn’t a browser company. They don’t have a search tool in the traditional way that you think about it. But one of the biggest opportunities, and you and I think talked about this on the last couple of podcasts, is large language models aren’t going to be just about the open internet. I look at what we’re seeing right now as a bit of a parlor trick. It’s going to be a consistent set of information that every company has access to through the browser. Bing is optimizing the way you ask a question to a search engine, but Google, if you type the search in using traditional search methods will still find you pretty good data.

But we’re starting to see what can happen. I think the big inflection point is we’re starting to have conversations with our AI or with the machine instead of searching in a very specific way that’s machine friendly. But the future of generative AI I think has a lot more to do with the way customers and businesses can create and optimize workflows or processes or built-in automations using big sets of data that live inside of systems of record and ERP and inside of business CRM, transaction data, employee data through an HCM system, or a supply chain management system that’s talking about your supply chain operations.

And right now, the ability to be able to train those models and use those models in the cloud is going to be something enterprises are looking to. So yes, OpenAI provides a large language model; that’s where we get ChatGPT. Of course, Google is building their large language model, which is going to be Bard. Amazon is basically saying, “We’re going to partner with Hugging Face,” and this may not be the only thing they do, but Hugging Face is basically democratizing and making available its open-source large language model for AWS users. So that could be through SageMaker, that could be done through container services, Pat. And so, there’s a few different ways that this is going to be able to be done. I think this is, A, really important for Amazon to stake a claim in terms of how they’re planning to participate in the large language model space and in generative AI.

I think that this has got to be competitive. And of course, like I said, Amazon has tons of data but needed a partner to help build and make openly available an LLM that could be utilized by all of its AWS user base. And so that’s what I think is going on here. I think it’s going to be more utilized and important for what I would call enterprise business applications, customer interactions, conversations, and chatbots, but not the kind that we’re all getting really accustomed to with ChatGPT, where it’s using open internet data. This is all about that data that sits below the corporation in the company’s databases, in the systems of record, like I mentioned.

So interesting, like I said, interesting because I think it’s timely. I think everybody’s pulling forward their announcements. I have a feeling that Microsoft reconfigured the pace of generative AI by coming out and announcing ChatGPT 3.5, the one that you and I went to the event for, and that pushed forward Bard, and now AWS feels a bit obligated to have an answer. Having said that, building development tools for these enterprise workloads to use large language models is pretty interesting and pretty exciting. And of course it should be beneficial for the company’s tools, including SageMaker, but also including Inferentia and Trainium.

Patrick Moorhead: Yeah, so first off, this announcement was an expanded relationship. AWS had a relationship with Hugging Face. I don’t think it was reactive. What they did do is talk about doubling down. The other thing I want to point out to our listeners is that AWS hosts more machine learning workloads than anybody else out there. It’s funny, while most of the focus was on Google versus Microsoft on the consumer side, very few people were talking about Azure versus AWS versus GCP. As Daniel said, this is where a lot of the heat is. And, oh by the way, for a lot of these startups out there who want to take advantage of generative AI, AWS will be a place that they will be looking.

The other thing that’s important is that, like Microsoft and similar to Google, AWS has a ton of examples with AI. I mean, all of Alexa that serves all of its devices is based on NLP and low-latency operations that they run off their own silicon, called Inferentia. But what the company’s doing is integrating Hugging Face into all of its AI workflow, specifically SageMaker. That’s a key one right there. I’m super interested to see what they do with Amazon CodeWhisperer, which is essentially AI creating code. Azure talked a lot about that as well. So net net, a ton of excitement. You have Microsoft aligning with OpenAI, you have Google aligning with Anthropic… I need to know a little bit more about Anthropic… and then AWS aligning with Hugging Face. So Dan, which company has yet to even talk about generative AI or align with a major partner here?

Daniel Newman: Well, there’s a few, but I mean, I think if you-

Patrick Moorhead: Apple has not.

Daniel Newman: Okay, well, I was actually going to say IBM’s using different words. Oracle really hasn’t said anything yet despite the fact they have a number. You’re right, and obviously that’s the freaking obvious one. I should be fired.

Patrick Moorhead: No, no, but that was just the one on my mind and the one that I put in my tweet. I was trying to lead you there.

Daniel Newman: We used to be closer. It appears we’ve grown apart.

Patrick Moorhead: Yeah. It’s funny, on IBM I did look at the words that they used, and they didn’t use the words that AWS, Google, and Microsoft used, but I think they did a better job explaining the entire landscape of natural languages. Yeah, so Apple, where the heck is Apple? Who are they going to partner with? A, I don’t think they know how to do it on their own. They’re not good at cloud and doing things in the cloud. They’re very good at device. I think they’ll probably partner with AWS in what they do. You had Intel based, Mac-

Daniel Newman: Wait, hold on, prognostication, Pat, Apple and AWS tie up?

Patrick Moorhead: Yeah, I think AWS and Apple are going to tie up based upon their prior relationship. I don’t think Apple’s going to wake up and get good at the cloud. They’ve sucked at doing stuff in the cloud for years. It’s not a core competency.

Daniel Newman: I think they know how to route all the data through China. I don’t know what you’re talking about.

Patrick Moorhead: Do they? I didn’t know that. I didn’t know that. Can I quote you on that-

Daniel Newman: I don’t know.

Patrick Moorhead: …in one of my Forbes articles?

Daniel Newman: That’s a true story I made up.

Patrick Moorhead: Yeah. Anyways, yeah, I think it’s going to be AWS and Apple. Apple’s going to be writing huge checks to AWS to figure out what they do. As soon as Google and Microsoft start popping out more consumer goodness, Apple’s going to be on the hook for something.

Daniel Newman: All right, Pat, there you have it, everybody. All right, we got two more topics. Going to do a little bit of earnings. Rinse and repeat. Did we talk about Lenovo? I feel like we did, but you know what? Let’s talk about Lenovo again. Pat, so Lenovo earnings, it was a mixed bag, but there’s some definite bright spots in there.

Patrick Moorhead: Yeah, totally. So top-line you’d be like, “Oh my gosh, Lenovo really had a horrible Q3. Revenue is down 24% net.” Don’t yawn when I’m talking. Come on, man.

Daniel Newman: Dude, I’m-

Patrick Moorhead: Yawn when you talk.

Daniel Newman: …jet lagged. Don’t let me distract you. Stay focused. Stay focused. We’ll zoom in on you. We’ll zoom in on you.

Patrick Moorhead: By the way, for the record, I wouldn’t even be doing this if I had just rolled in. I’d be like, “Dan, we got to push this or do it some other time,” so props to you, buddy. No, anyways, top-line, if you just looked at that and stopped there, I mean, revenue was off 24%, net income was off 32%. But I don’t think we should stop there because it really doesn’t tell the story. First off, while the company has done a great job adding services and data center and edge infrastructure to the mix, PC is still a large part of their portfolio. And surprise, the PC market has cratered, right? Whether you want to believe it’s down 29% or 39%, it doesn’t make a difference. Lenovo’s Device Group is down 34%. The company is still the number one share provider. And by the way, pretty good margins at 7.3%. I think number one market share and 7.3% margins on a dollar basis makes the Device Group the most popular… Sorry, the most profitable PC maker out there.

I need to do some homework and compare it. Kind of surprising versus Dell given how profitable they are. But super bright spots, right? Services, SSG, and data center, which is called ISG. Services were up 23%, record revenue at $1.8 billion. By the way, I’m still super impressed that Lenovo, even though it’s the smallest revenue contributor, they talk about this first on their calls. I think that’s a gutsy move, and I frickin’ love it.

The one thing about services, SSG, that was interesting is you’re thinking, “Oh, it’s break-fix. It’s related to the hardware.” Well, 53% of SSG’s revenue was higher-order services: managed services, project and solution services. I think that’s super impressive. The data center group, ISG, absolutely crushed it. Record revenue, record profit, up 48%. Record op inc, record server revenue, record storage revenue, record software revenue. We haven’t seen the numbers from the bean counters, IDC, Gartner, and Canalys, but on the earnings call they said that they moved to the number three server position, and they are taking share like crazy in storage. I’m going to give a congratulations to the company, particularly the services and the infrastructure group, on this one. It’s hard for me, how can you spank somebody who got taken down with the market? I can’t.

Daniel Newman: Well, the only thing you could do is if the take-down was disproportionately higher, I guess you could. Or if it’s Apple, Pat, I think you generally would figure out a way.

The true thought process that I have here on Lenovo is this PC market, it was predetermined, everyone knew, there was no surprise here. I guarantee next week with HP, when they report, their numbers will be down on PC too. Dell’s numbers were down. Lenovo’s numbers were down. I think the Mac numbers were down too, if I’m not mistaken. I’m trying to remember, because Mac had a few weird surprises in there, and Surface can be a little odd sometimes, but the overall PC market is down. And it’s a huge part of Lenovo’s business.

So that’s kind of why there’s a zoom in, zoom out factor here. When you zoom out, you go, “Oh wow, the revenue was soft.” But then when you zoom in, you go, “Wow, this services group is growing significantly faster than the category,” 20-something percent in their SSG group. That’s going to be really important as the company continues to move to as-a-service, moving to consumption-based infrastructure on-prem. Clearly showing some strength there. And of course, part of that whole One Lenovo strategy is that they’re doing a better job of cross-selling between the three groups. This is something that I’ve spent some time analyzing. It’s going to take a little bit of time before it’s going to feel like one streamlined purchasing operation for an IT department, but they’re moving in that direction, they’re doing a good job. That should give a little bit of tailwind to the overall growth.

The ISG numbers, Pat, are ridiculous. I mean, those are just tremendously good results. Again, you got to remember what’s going on is while we’ve all come on record and said data center and systems will be more robust than PCs, nobody expected high double-digit growth. And so, you could see it on Kirk Skaugen’s face when we talked to him the other day, you could see a little well-deserved satisfaction in the results, and that’s because it was a big outperform. It was a big outperform and it deserves kudos. The PC market, Pat, I’m saying ’24. I really don’t think we’re going to see any great quarters on the PC side of the business. But I do think infrastructure and especially some of these… I’ve come out and claimed a few times that what’s old is new again is going to continue to be a winner: putting more infrastructure on-prem. And the math seems to work in certain cases where companies need it. You and I have been on the record for a decade plus on hybrid cloud, and companies are seeing it now as the costs scale. And as you get these larger data sets and they want to train things for AI, doing it in the cloud makes sense sometimes, but it doesn’t make sense all the time.

By the way, Lenovo wins either way, because they sell a lot of servers to the hyperscale cloud companies. So whether it’s enterprise servers going into corporations that are trying to do more on-prem or it’s servers going into data centers for hyperscale cloud companies, they sell quite a bit of both. So good quarter for Lenovo. Pat, you ready to hit the last topic? I know you had a little-

Patrick Moorhead: No, I’m not in a rush, man. I’m not in a rush. Are you in a rush?

Daniel Newman: I know you had a little extra time now, so why don’t we talk about NVIDIA. Do you ever hear people call it “Nividia”?

Patrick Moorhead: Yeah, and I have to gouge my eyeballs out when I hear that.

Daniel Newman: Okay. So the quick snapshot is the company beat and raised across the board. But I got to say, before I really start giving too much cheer to NVIDIA, this is on a way, way, way softer guide. Meaning that the guidance got completely reset. Numbers were down in a number of areas significantly on a year-over-year basis, especially in gaming. Now, gaming did beat the expectation in this quarter, but it was 40, 50% down from a year ago. This is straightforward. This is the inventory glut. This is the selloff of the inventory. Question is, are we at the end of that? I think there’s some speculation that we’re getting closer, but I think we’re still looking third, fourth quarter. And that’s been a bit of a consensus listening to Jensen, listening to Lisa Su, listening to Pat Gelsinger and other chip maker CEOs talking in the last wave of earnings.

Now, data center grew 11% on a year-over-year basis. Pretty good, although it missed the guide. And so, I was actually backwards on those two. I actually thought gaming would be worse than expected because that was kind of the feeling of the market, and data center would be stronger. Data center ended up being weaker but still growing on a year-over-year basis. AI was mentioned a lot on this call. By the way, NVIDIA stock has almost doubled from the bottom. I think it's more than double. I think it's not really yet due to revenue ramping or margins increasing. It has more to do with the fact that the company is so well positioned to be part of this next generation of AI. It's got software lock-in with CUDA, it's got GPUs that are used in every public cloud and also used in most enterprises for any sort of training. And of course, there's large expansion in inference. The company's got supercomputing as a service with its DGX offering. It has next generation gaming AI with Ada Lovelace.

And then, of course, it has the Omniverse. Which, again, Metaverse is kind of ice cold so people aren't talking about it, but the Omniverse, in my opinion, is the only actual application for Metaverse that companies are really thinking about using. This is designing environments in the cloud, autonomous digital environments, immersive digital environments. It could be construction sites, it could be for software or collaboration in the cloud. The company is able to accelerate on all those fronts, but as companies want to move more and more into AI, NVIDIA is going to be a big part of the story. And that's why I think the market's been so excited. I think the more times you say AI, the more your stock goes up. I'm pretty sure there's a correlation there. That's not official. And remember our disclaimer here, do not take anything we're saying as investment advice.

Pat, they did have a better automotive number, which has been interesting because it felt like the automotive business really, really sagged over the last few years. Qualcomm had grown its 30-billion pipeline, and while NVIDIA's pipeline has been stagnant around 11, the revenue's almost doubled on a year-over-year basis, so I can only kind of wonder if that's designs turning into shipments. But there's a lot of turnover in the automotive space right now, design wins going from company to company. NVIDIA's longer term is going to be interesting. It's never quite had the same automotive cachet from the time that it lost its main public-facing relationship with Tesla. That was its big moment.

So across the board though, it was good to see foundationally the company beat. It raised a little bit on guidance. The numbers are much softer than a year ago. It's got a good diverse data center offering, including not just the software frameworks and hardware but of course networking with the Mellanox acquisition. I'm overall still positive on it. I do think, Pat, one big risk with the Arm deal not being done is that there are a lot of hyperscale cloud companies that are building more and more of their own AI ASICs. You're seeing it with AWS. And I assure you, Microsoft and Google are going to continue to lean into that, and then you will see more and more of that because of the democratized designs available from Arm and partners like Ampere and others that'll enable that for Oracle and other cloud providers.

Patrick Moorhead: Yeah, it's interesting. This one I want to separate myself from the stock market and what I think is actually going on. It's fun and dandy to jump on the Wall Street memes. I would say that it's like half right. I mean, I remember all the run-up in NFTs in 2022. I am a believer in generative AI and LLMs. It's still going to be a very small portion of the market in the near term. When I look at NVIDIA's long-term prospects, I mean, they're the only game in town for training. I know there are other people who have it and there's some great technology out there, but if NVIDIA doesn't have 99% market share in that market of training, I would be surprised. That's not including CPUs, of course, because there's still a lot of data center training and inference done on processors. But it really is in a good position for that inflection point in a major way.

The here and now, you've got the H100 out there and shipping, you've got Grace for HPC and large-scale models. You've got the networking with the Quantum-2, which by the way I wish they had picked a different name than Quantum because that has nothing to do with quantum computing. But that's okay, I will let that go. They're in a very good position. The one thing, too, that was reinforced on the call was this new AI as a service that's hosted by these cloud providers. You actually pay NVIDIA, the revenue goes to NVIDIA, and then it's fulfilled through these CSPs. I think a couple quarters ago they talked about, I think, it's called NVIDIA DGX Cloud. That was reinforced a lot. I don't know why they needed to, but I think what they did is they threw everything they could at the wall to show how they are participants if not the leaders in compute in this market.

On the gaming side, I mean, there are so many things going on here. They have a 3000-to-4000-series inventory rebalancing that they're looking at. The year-on-year compare is a tough one. Let's face it, there were a lot of people using GPUs for NFTs and crypto mining, although NVIDIA did their best to disable that, but I believe that was a big part of their market, and it cratered. And then the lack of AAA games. Man, I mean, we should have four or five AAA games out there right now that are forcing you to buy a new GPU. There are a few of them, but they're really not there. I think Ada is off to a great start. The company said it's exceeding expectations. It was so funny, I think it was the 4080 maybe as a card, people just said, "There's no way that's going to sell. Value proposition sucks," and that thing is selling like freaking hot cakes.

Sequentially, and I very rarely look at sequentials, but I felt like I had to do it for this one, gaming was up 16%. So anyways, up until a competitor comes in and knocks this company off on the gaming side or on the AI training and inference, I don't see any indications that it won't continue its run. Now, competitively, I mean, you've got startups like Groq and Tenstorrent, you have Intel, you have AMD, they are pissed that they're not getting a lot of this money, and they're coming after it. So it's essentially NVIDIA against the world. Not to mention at AWS you have Inferentia, you have Trainium out there. While Google hasn't used its accelerators I think as much as AWS has, Google has its own accelerator out there. In fact, it was the original AI accelerator by the cloud folks. We don't know what Microsoft is up to. We have a pretty good idea they're getting off of FPGAs for some of their acceleration. That's it, that's the story, baby.

Daniel Newman: Yeah, we did it. We did it. We did it. We did it. And good callout on some of the emerging companies. We'll say, since we did mention it, that we're an advisor to Groq, but it's a very interesting company building some very, very powerful ASICs for acceleration. Pat, I am fading hard, but I'm going to try to go find me some coffee because my day's just getting started like yours. What a show. We did it.

Patrick Moorhead: I know.

Daniel Newman: Sometimes-

Patrick Moorhead: Thanks for hanging in there, buddy. I really appreciate you making it happen and literally coming in and doing this. I would not have done that.

Daniel Newman: It's coming in hot, would be the way I would explain it. We were coming in hot. But look, we covered a lot today. We talked Lenovo, we talked Luminar, we talked AWS a couple of times and Marvell. We also, of course, covered some earnings. We just finished up by talking NVIDIA. Thanks everybody for tuning in today. If you like what you heard, hit that Subscribe button. Follow danielnewmanUV and follow Patrick Moorhead. His tweets are really good and sometimes really honest. By the way, his honest stuff's the best. You got to read through it and find it. And Pat, that would insinuate that sometimes you're not honest. I didn't mean it that way. What I really meant is sometimes you're more heartfelt. You shared some great leadership advice the other day after having probably an unsavory interaction. I can't say for sure what might have happened, but I got the sense that somebody irked you a little bit, Pat, and so that was a great LinkedIn stream.

But yeah, hit that Subscribe button, join us for our shows. June, is it six to eight or seven to nine, Pat? I can't remember. Six to eight, seven to nine, six to eight or seven to nine is the Six Five Summit this year. We're going to have some really great guests. We got a lot going on. By the way, follow us all this week. We're going to be at Mobile World Congress. We're doing 16 or something, 15 MWC Six Five videos. We're going to be talking to a bunch of great customers of ours and then the customers of theirs while we're here on the floor. We're really excited about that. But for this week, for this episode, time to go. Pat's already in his email. He's typing something out. He's probably closing the deal, and I can't blame him, that's what I'd be doing. All right everyone, we're going to see you later. Bye now.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x Best-Selling Author including his most recent book “Human/Machine.” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and Former Graduate Adjunct Faculty, Daniel is an Austin Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
