On this episode of The Six Five Webcast, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:
- GlobalFoundries Q3 Earnings
- Arm Q2 Earnings
- OpenAI Announcements
- IBM AI and Research Day
- VMware / IBM / OpenShift
- Synopsys RISC-V IP
For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Webcast so you never miss an episode.
Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.
Transcript:
Patrick Moorhead: Hi. This is Pat Moorhead and we are back with another Six Five, and we are live after getting back late last night. It’s great to be back, Daniel. It is great to see you. This is my favorite hour of the entire weekday. It’s great to see you, buddy.
Daniel Newman: It’s good to be back, and it is my favorite hour other than the other hours that we hang out, going coast to coast. It is good to be back. It’s been a long week. I cannot honestly tell you where I am or what day it is, really. Someone told me today was a holiday in some cases for some companies and businesses, which was news to me. Apparently our-
Patrick Moorhead: Veterans Day, baby.
Daniel Newman: Apparently, our company, it's a day off for the Futurum Group, but not for this particular person at the Futurum Group. But yeah, look, it's good to be back. It's good to be back on our soil. It was good to spend some time with you yesterday in Yorktown Heights in New York, and I couldn't be more excited than I am right now to be doing a show, another show, episode 192 with my bestie.
Patrick Moorhead: It's great to be here. We are having some great conversations here. Oh, by the way, let me just do a little intro here. If it's your first time at the pod, got to ask you what's wrong with you. Gosh, we're on our 192nd episode and we haven't been canceled yet. Pretty exciting. We stick to tech, maybe hit some other things, but we also talk about publicly traded companies. Don't take anything we say as investment advice. Seek a professional. So let's dive in. We're talking GlobalFoundries earnings, Arm earnings, OpenAI announcements. We're going to focus on those awesome agents.
We're going to be talking IBM AI and Research Day, which Dan alluded to, which I think I rolled in from at about 10:00 PM last night. Then Dan's going to go in and just, he was in Spain forever, he is going to chat about VMware, IBM, OpenShift and what's going on there. Boy, two vendors that you normally see together, but all three, this is going to be an interesting one. And we're going to finalize here with a big announcement that Synopsys made with RISC-V IP. So Dan, I am calling your number first. Let's dive into GlobalFoundries' earnings.
Daniel Newman: Yeah, so GlobalFoundries had a, again, it's kind of like a tale of two quarters right now, and why do I say that? Because you've got, one, are companies beating what the street expected from them, and then, are they growing on a year-over-year basis? And what we've got right now is a lot of companies are beating the expectations. The bars have been neutralized over the last year as expectations from the street and companies' guidance have settled a little bit. But having said that, the ability to operate strategically against a tougher macro … Let's be really clear, and we talk about this I feel like almost every week, but for being in a growth period in the economy, this is the worst-feeling growth period I've ever been part of. I just don't really believe it. I just want to be candid. I don't believe that we're growing.
I think it's pseudoscience, it's hacked calculus, it's spin, it's a lot of things. And I think that you're seeing that in the earnings this quarter. On a year-over-year basis, the semi companies are almost all shrinking on a revenue basis. Now, on beating their expectations, they're all performing a little better. So that's a combination of product and business diversification, that's operational excellence. And GlobalFoundries is a really interesting company because they tend to stand up a little bit better against the market and against those expectations really for two reasons. One is they're not heavily tied to this kind of PC device cycle that has been hampering so many companies. They do have some business in that space, but it's certainly not outsized as part of their revenue. And the second part of it is they have so many long-term agreements that they have built what I would call a hedge, like a fuel hedge almost, where they are locked in with their customers.
So unfortunately that doesn’t leave them the grounds to charge 1000% on a piece of hardware or a piece of silicon just because they have it and someone needs it. But at the same time, it also gives them a sense of stability when the markets are downward. And that’s enabled the company to hit its target, to be very close to its EBITDA expectations and to be able to guide at or above the expectations. And so that’s really what’s kind of going on there. If you look at the business as a whole, 11% down on revenue, 16% down on EBITDA, they’re down 10% on wafer shipments. So this is the true revenue trend. When you look at it over like four or five quarters, they were doing over 2 billion a year ago and now they’re back under that. So you do see that there is a bit of an inflection here that their stability of the long-term agreements and diversification isn’t fully offsetting the fact that the market is consolidating and parts of the industry are slowing.
And by the way, we've seen industrials hit Lattice, we've seen industrials hit Silicon Labs and other companies. GlobalFoundries is not immune to that particular part of the market cycle either. The company did see itself expand, it had a new expansion in Malaysia, it did win new contracts from the Department of Defense, I believe it was. And they are doing a good job of what I would say sort of maintaining their margin, but they are seeing some margin contraction, which is something that unfortunately nobody likes to see. What's probably the best news is they've really kept their revenue diversified. They are still seeing a good chunk of their revenue tied to the mobile market, Pat, which in my opinion is slowly showing maybe some comebacks, and Qualcomm's numbers were maybe the best reflection, and that's China based. Is China coming back?
But otherwise across the board they have a pretty good revenue mix. They have five different segments, six if you count non-wafer revenue, and they don't have half their revenue tied up in any one of them. So good, but a little bit heavily leaning on smart mobile, and they saw that actually increase a little bit on a year-over-year basis. So Pat, I think Tom and the team are doing a good job. Tom Caulfield, the CEO, and the team, a good, not a great result. But again, I think the stability of their long-term approach and the stability of their diversification got them through another quarter that the street can feel, like I said, good, but not necessarily great about.
Patrick Moorhead: Yeah, it was good analysis there, Dan, and at times like this I like to look to see if something was inflicted by the company, by a competitor, or by the market. And here the lack of growth is clearly market-driven. If I look at the end market breakdown, smart mobile devices down 18%, and even though the supply chain is getting better for smartphones, as we saw with companies like Apple and Qualcomm, that doesn't mean that we're back to the boom days. We're far from it. 5G infrastructure way down. So comms infrastructure and data center is down 58%. Now there are also things like server infrastructure in there that are actually getting a lot better. But my takeaway is that the decline in 5G infrastructure outweighed any growth that we saw in the data center. You had mentioned Lattice, and if I look at everybody that's in consumer and industrial IoT, it's down. Unless you, let's say, separate automotive out of there.
And not every automotive vendor was up, but GlobalFoundries was. We saw onsemi way down in automotive, and we did see Lattice up. So it's not every vendor that's involved in this. Consumer IoT has been down forever. And it's directly related to consumer sentiment, which is pretty much in the toilet right now. And they're not going to buy as many smart speakers, smart displays, things like that. Doorbells. It's looked at more as a nice-to-have than a have-to-have. And I think the home IoT market's going to have to see, again, an increase in consumer sentiment, an improvement in the economy. And then on the industrial side, I think we need to chew through some of that inventory that was sitting there that, if you recall, we were trying to make up for on the industrial IoT side, when we couldn't roll out Industry 4.0 implementations because we were missing a key piece of silicon for that.
Yeah, and we covered their customer and their developer event. They made a big announcement there with 9SW RF SOI. And again, SOI is silicon-on-insulator, which is a substrate that's different from, let's say, bulk, or what we've seen in glass, and has very, very good performance-per-watt characteristics. And that's one way, or one reason, that GlobalFoundries is so good here. We covered it when we covered their event and highly recommend that you go check it out. Dan, I'm going to call my own number here, Arm Q2 earnings. So a lot of discussion around this and Dan, I took a lot of heat. I think I did three broadcast interviews when they went public. Again, I'm not an equities analyst, we're industry analysts. But when somebody is looking at Arm and saying there's no growth-
Daniel Newman: Wait, no price target, you don’t price target?
Patrick Moorhead: No price targets, nope. I would have to have a Series 7.
Daniel Newman: I think you have to register.
Patrick Moorhead: Okay, I appreciate that, but I stay away from any price targets. But yeah, the meme was they're not a growth stock, but hey, they had a beat and a beat and a little bit of a cloudy forecast, which I think impacted how the market reacted. But 28% increase in revenue driven by long-term agreements, share gains and, I am glad Arm said this, royalty price increases. I did like the augering in on AI, which they said increased their licensing 106%. On the profit side, EPS was up 112%, op inc was up 92%, and it's the first time in the company's history that they hit over 800 million of revenue. So net, net: focus on what Arm is doing in the data center, focus on what they're doing in automotive, focus on what they're doing in PC.
The business model for auto and the data center is very straightforward to me, and I think we're going to see even more announcements in the future of large companies who are creating SoCs that include Arm IP. I'm not going to pre-announce something, but it's coming. By the way, that's not exactly hanging way out on a limb or taking a big risk. It's just going to happen. And we see that as well. Customers want choices, and they also want the flexibility to create their own SoC. Little bit cloudy on the PC side. On the Mac side, you've got an architectural license; on the Qualcomm PC side, it's cloudy, right? The two companies are suing each other. Less clear, and you literally have to guess on where that winds up. Arm wants to have a new contract. Qualcomm says, "Hey, I bought Nuvia and we have an architectural license already. I shouldn't pay you anything extra." But we'll see where it lands. But I'm pretty excited for where Arm and Qualcomm based PCs are going. You and I covered it on the Six Five with Qualcomm's Maui event.
Daniel Newman: Yeah, I think we both had the chance to sit down with Rene Haas, the CEO, this week. My commentary, which is now pretty much on Twitter, or X, where I like to leave it really-
Patrick Moorhead: I don’t know. You’re a power player in LinkedIn too, so.
Daniel Newman: Yeah, where I like to put my first musings and then copy and paste them to LinkedIn later, but anyone who says X is dead, I still love it, so sue me. Join the club. Anyway. But in all seriousness, two things that I really took away from the conversation outside of the numbers that you so well presented. One is diversification. The company's doing a good job moving and diversifying, and really up the stack, right? It's moved very prolifically from small and embedded into big designs. You can see that with Grace Hopper. You can see that with some of the stuff they're going to be doing with homegrown silicon in hyperscale clouds. You've seen what they're doing with AWS, some of the stuff they're going to be doing for automotive. It continues to grow in its levels of sophistication.
And what this also means is new service levels, and service levels are really important for them because they're kind of the company that's known for, like, a penny a core or some tiny little revenue stream. And if Arm ever wants to get out of the hole it's in, people need to start to see its designs yielding more per design, per core, per chip that gets manufactured with its IP. And so by becoming more sophisticated, helping with cluster design, building more comprehensive capabilities and basically short-circuiting design and engineering cycles for its partners, it's going to be able to charge more. And you mentioned price increases. There are a few different ways to look at it, but if Arm takes some of the strain off of the companies that are going to implement and build around Arm, they're going to be able to charge significantly more per chip, and more per chip means more profit margin, means more growth, means better numbers, and that's really what nobody in the market is fully confident about yet.
The only thing I would say is the AI part. Rene did mention this a lot. I tend to agree. You and I both went on CNBC around the announcement of the IPO and were asked the question about AI, and we both, I think, were kind of a little skittish because it's like, well, they don't build GPUs. But AI is not just GPUs, and I think the combination of, hey, every GPU needs a CPU core to be able to run applications, and also that a lot of AI is not going to be run on GPUs. There are a lot of AI applications that are going to be run on accelerators, and accelerators are something that can be built where Arm can have a really significant role to play. Not to mention all of the demand for NPUs and other areas that are going to create next generation PCs, next generation smartphones, and of course building the servers of the future. Arm has a role to play there too.
So in the worst case, they have a halo effect where the halo of AI will pull Arm's numbers up, but in a more realistic case, they do have a real role to play. Having said that, the guide was sort of soft and weak, people didn't love that, but at least the sort of one-time expenses related to the IPO are in the past, so the GAAP losses should subside. And frankly, I think there is risk with RISC-V, and Arm has to face that challenge. I think there's risk with x86 catching some tailwinds and both AMD and Pat Gelsinger at Intel doing better and better jobs of meeting timelines. But I also think the overall demand for silicon is going to be a tailwind for Arm either way. So good quarter, solid results. I think that's all I've got to say about that. Forrest Gump, that's all I've got to say about that, Pat.
Patrick Moorhead: Yeah, I'm glad you brought up the different ways they make money. They have three different ways, and that's even aside from increasing royalty pricing. They have architectural licenses, they have soft IP, and then they have IP they take all the way down to the foundry, and they do a bunch of software testing along with it. So I applaud Arm at pricing to value. Quite frankly, my relationship started with them, I think, God, almost 13 years ago, maybe 12 years ago. And I remember telling Simon, you're the most important company that doesn't charge what it's worth. And some people will debate me on that, on what Arm should be charging, giving away its IP for free, what happened with SoftBank. And quite frankly the SoftBank acquisition gave Arm the ability to come in and create some heavy-duty IP for the data center and also for automotive that investors never would've let them do, and look where the growth is now. So it's such an interesting company, such an interesting story. Isn't competition great?
Daniel Newman: You know I love it.
Patrick Moorhead: Oh, it’s great. No, I love it. Hey, let’s move to some OpenAI announcements here. I’m going to call, am I calling my own number on that or is that-
Daniel Newman: This one’s me.
Patrick Moorhead: Okay.
Daniel Newman: We'll divide and conquer because I don't know if you've followed what happens in the average week of OpenAI, but even when you just say OpenAI announcements, there were like five. There was a developer day, there was a GPT-4 Turbo day, there was some new build-your-own-LLM adventure day. So I'm going to talk about a couple of things that went on this week and maybe you can fill some gaps.
Patrick Moorhead: Good.
Daniel Newman: Two things I want to really talk about, and one is GPT-4 Turbo. So as we all know, there are some big updates that need to be made and the company continues to make them. Now look, the growth of LLMs and foundation models that companies can use continues to scale. And the idea of any one model sort of dominating all models, I think we're actually seeing a fairly steep decline in that curve.
Having said that, OpenAI is the grandfather, grandmother of all LLMs. It's the one that really kind of put this whole generative AI trend on the map. And so the company's making changes to make it better. First change, it's changing the knowledge cutoff. So one of the things that I think a lot of people complained about with OpenAI was its older cutoff. As you know, when you asked it questions, it was like a 2021 date. And a lot of information was created in the last two years, so now the new one has taken us all the way through April 2023. Now between you, me and the fence post, that's still six months ago. Think about all the things that have taken place in the last six months. If you asked a question about, say, the conflict in the Middle East right now, six-month-old information would be very problematic.
So you can see why real time's important, but it's good to see them getting up to date. They're also increasing the context window, meaning being able to go with longer prompts. As you want to be more interactive and have more of a human-machine empathic relationship, which isn't real but is possible, you need to be able to talk to it much like you talk to another human. You need to be able to give it more context. So it's adding that, and it also improved instruction following. They also, by the way, lowered their developer pricing. So this isn't as much of a thing that you and I are going to be thinking about every day, but they're down to a penny per 1,000 prompt tokens. So for developers, it basically costs less for them to be sending inputs and receiving answers.
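To make the developer pricing point concrete, here is a minimal sketch of a chat completion call using the OpenAI Python SDK; the model identifier, prompts, and cost math are illustrative assumptions rather than anything specified on the show.

```python
# Minimal sketch (assumes the OpenAI Python SDK v1+ and an OPENAI_API_KEY
# environment variable). The model name and prompts below are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # a GPT-4 Turbo-class model identifier (assumption)
    messages=[
        {"role": "system", "content": "You are a concise research assistant."},
        {"role": "user", "content": "Summarize the main themes in this week's semiconductor earnings."},
    ],
    max_tokens=300,
)

print(response.choices[0].message.content)

# Per-1,000-token pricing is applied to these counts, so a developer can
# estimate the cost of a call directly from the usage object.
usage = response.usage
print("prompt tokens:", usage.prompt_tokens, "| completion tokens:", usage.completion_tokens)
```

At roughly a penny per 1,000 prompt tokens, the usage numbers returned with each response translate directly into cost.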
There were some other things that came out too, but the one other thing I just want to mention, and I'm going to kind of use this as a handoff moment because you've also been talking about this pretty passionately, Pat: this week they came out with, anyone can create their own version of ChatGPT. And so after I spent a little bit of money earlier this year trying to start building the future-of-AI product, regretfully, in less than six months they basically enabled every company on the planet to build their own LLM and do it in a way that supposedly gives you developer capabilities, no code, upload your own data and keep it fully grounded, and keep that data separate from the public domain data. That's now out.
So within one year it went from, hey, here's one LLM model that works for everybody, where everybody gets the same and consistent experience, to everybody who subscribes for a handful of dollars can now build their own personal digital assistant. And Pat, every day it feels like OpenAI and these other companies come out with a new feature that basically just ends an entire venture cycle. How many companies do you think were doing a build-your-own digital assistant product that just woke up one morning and said, "Well, crap, our company is no longer relevant"? Now again, I realize that's not always true. Competition's good, we just said that in the Arm conversation. But if you're a small startup, we all know, I think Rob Thomas at the IBM day talked about something like $500 million being the least you could invest if you were serious about building your own foundational models.
To know that overnight, they're releasing basically a build-your-own, no-code tool. You need no developer skills. All you need is to know how to upload documents, and you can create a digital assistant and connect it to your calendars and connect it to your apps and connect it … Wow. And so this came out this Monday on dev day. So Pat, with 100 million weekly users and a really, really big audience and the support of Microsoft and a whole big ecosystem around it, Pat, I'm totally blown away at how quickly this company's moving. I don't want to take all the thunder out of this one, but they're moving quickly. They're making a lot of updates, and now I guess apparently you and I, with no coding skills, can build our own digital assistant with all the ChatGPT capabilities.
Patrick Moorhead: Yeah, I am very excited about these new, what they're calling GPTs, right? And these are custom versions of ChatGPT. It's literally drag-and-drop capability. Dan, you and I had a conversation when this whole generative AI thing kicked off and we were thinking about different deployment methods, and you were doing a lot of work on this upfront, which had a lot of value. And I was like, you know what? I'm going to wait until somebody comes out with the drag and drop and then lean into this thing, and maybe slow roll it, but maybe not. For a small and medium business, I think this is excellent. Essentially, you create the type of GPT that you want. They had, it's funny, one of them was called Tech Advisor in there, a creative writing coach, Laundry Buddy, Sticker Whiz, just this. I think they gave a really good flavor of all the things you could do.
And it's push-button. There is no code. This is an application. And then what you do to customize it, or ground the model, is you throw in all the content that you want it to learn on and it sucks it in, and you can even train it when it makes a mistake. Like, no, no, that's not accurate, this is the right answer. So you're creating your own. I think this is going to be seriously sticky. The one open question here is I need to do some more research on how they're protecting data. It's one thing, for example, for me to throw all my white papers, all my blogs, all the transcripts of all my videos in this thing. But what about other types of data? I think right now it's probably really good for public data. And oh, by the way, who owns the output of this, and how can you monetize that?
But very impressed. I'll be honest, I did not expect this because quite frankly, this is consumery, this is small business type of stuff that I didn't expect them to bring out as quickly. But yeah, I mean I remember you were trying out and I was trying out a few of these startups and what they can do, and they were using OpenAI as a backend. I'm wondering if this just completely wipes them out. I mean, we'll see. It would not be fun to be a VC who gave somebody $50 million trying to figure out if this was the right move.
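For developers who want that same upload-your-documents grounding flow programmatically rather than through the no-code builder, a rough sketch along the lines of the Assistants API with retrieval, which was announced the same week, might look like the following; the file name, instructions, model identifier, and polling loop are illustrative assumptions, not anything confirmed on the show.

```python
# Sketch (not the GPT builder itself): grounding an assistant on your own
# documents via the Assistants API with the retrieval tool, using the
# OpenAI Python SDK v1-era beta endpoints. File name and prompts are illustrative.
import time
from openai import OpenAI

client = OpenAI()

# Upload a document the assistant should be grounded on.
doc = client.files.create(file=open("analyst_whitepaper.pdf", "rb"), purpose="assistants")

assistant = client.beta.assistants.create(
    name="Research Advisor",
    instructions="Answer only from the uploaded documents; say so if the answer isn't there.",
    model="gpt-4-1106-preview",        # illustrative model identifier
    tools=[{"type": "retrieval"}],     # lets the model search the uploaded files
    file_ids=[doc.id],
)

# Ask a question in a thread and poll the run until it finishes.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user",
    content="What does the paper say about enterprise AI ROI?",
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
while run.status not in ("completed", "failed", "cancelled", "expired"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

# Messages are returned newest first, so print the latest assistant reply.
for message in client.beta.threads.messages.list(thread_id=thread.id).data:
    if message.role == "assistant":
        print(message.content[0].text.value)
        break
```

This is the developer-facing counterpart to the consumer GPT builder discussed above: same idea of grounding on your own files, but wired up through code.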
Daniel Newman: It's kind of like the people that are spending billions standing up an AI data center thinking that AWS isn't going to build its own. I'm not saying AI data centers won't be a thing. I'm just saying that there are these kinds of massive scale investments and it's like, oh, we're going to need this. And it's like, oh, maybe we're not. And so, hey man, listen, I can admit when I get something wrong. I was absolutely right about the opportunity, but I was absolutely wrong about the ability to differentiate and try to compete against big tech. Big tech is democratizing this thing. And by the way, Pat, this is great for the world. It's really exciting. I'm very positive on it. It's going to continue to push us to find ways to differentiate when all this is available to everybody. This isn't different anymore. It's just a thing. So it's interesting.
Patrick Moorhead: Yeah. Let's dive into the next topic here. Dan, you and I attended IBM's AI and Research Day in Yorktown Heights, New York, and I don't know, there might have been 20, 25 industry analysts there. Two L1s were there, Rob Thomas and Dario Gil, who runs research. And obviously Rob runs the commercial side of the entire house. I have to say, I hate to be fanboyish, Dan, and I don't applaud everything, but I think they absolutely knocked it out of the park for enterprise AI. And what I was most impressed about was their articulation of their client, the way that they see this playing out, what their client needs are, where they're starting in the journey to solve all of this, and real customers. Okay? Now, that was the public side. We can't talk about the embargoed or NDA side. They are going to make a lot of announcements coming up, and we were pre-briefed on that, but I want to stick to the public stuff.
First off, IBM is a company that went from AI being an adder to AI being the business. And I know a lot of people are talking like that, but I'm convinced that IBM turned the entire company upside down. Not when it happened, but when Arvind started, right? Arvind was very clear. He's like, "This company is going to be a hybrid cloud and AI company." And I remember thinking, okay, totally get … This was before generative AI popped on everybody's radar. So I'm like, hybrid cloud I get, but AI for IBM? So to make a long story longer, they've made hundreds of millions of dollars of investment. They've created a $500 million venture fund for enterprise AI. They've gone GA, they were the first to GA, true GA, not fake GA, with multiple feature sets and multiple countries on the AI side, and then the data side, and governance is coming up there.
But yeah, a really good explanation of all of that. And it was not limited. It's not, hey, this is in theory. This was, we are working with clients to solve customer-facing function and experience problems, HR, finance and supply chain, IT development and operations, core business operations, and then putting real numbers on it. Now, I haven't tested these numbers. I haven't talked to these customers about the numbers, haven't run this in my lab or your lab, but the output is staggering. And by the way, it's an opportunity you and I have holistically believed in, but reducing cost per invoice by up to 50%, reducing application support tickets by 70%, automating answers with 95% accuracy in customer service, reducing content creation costs up to 40%. This is classic IBM focusing on the enterprise. And I've got to tell you, the one thing I have to learn more about, Dan, is when it comes to watsonx. I liked Rob Thomas's explanation of it being the IDE for AI, and they said they're the only one that can do on-prem and public cloud.
I need to learn, I need to research more about on-prem and how they're deploying this with companies like Dell, not only in data centers, but also on the edge. It's the end-to-end stack, Dan, that I've been looking for forever. VMware broached it with private AI, and I was all excited about that, but I just didn't get enough details out of it. And I think IBM has definitely figured out different elements. I need to do more research on it, which, by the way, could put IBM in another leadership zone of its own by being end to end.
Daniel Newman: Yeah, I mean, look, first of all, this was a great event. Some of the best slideware I've seen, and if you want to check out some more of that, at least the publicly available parts of it are posted on my Twitter. X, do I have to call it X? Is that a thing?
Patrick Moorhead: No, I rarely call it X. You don’t have to either. I mean, bringing X back.
Daniel Newman: But look, I thought there were some great one-liners and quips, the nutrition label for AI, governance. I think right now we're all about building, we're all about productivity and all about efficiency, but governance is going to be where a lot of the war's won for companies that want to compete. And in the enterprise space, hearing about chief legal officers basically being incredible roadblocks for companies implementing generative AI, because it's trust, it's safety, it's privacy, there's huge risk if you get it wrong. And then of course, culture, the ability for companies to implement this in a way that upskills their personnel and concurrently takes advantage of value that can be created quickly. But Pat, I'll tell you something else. This was probably some of the best material as it pertains to use case and customer value. So they had the consulting team come in.
I mean, Rob Thomas probably showed five or six slides of very specific, in-depth work and what the returns are in areas like digital labor, in areas like supply chain, where you're seeing 20, 30, 40% incremental value improvements, whether that's cost reductions, whether that's productivity gains, doing the measurements, doing the testing, doing the validation. And to your point, I know we haven't done this testing yet, but the methodology seemed to be comprehensive and it seemed to be well thought out. And now the question I'm asking is, well, if the numbers are so good, why doesn't everyone run down the path immediately? And I asked that question, I think it'll-
Patrick Moorhead: Yeah, it was a good question.
Daniel Newman: It was on the record and it'll come back out. But the answer, in short, was it's a slow move to get culture, it's a slow move to get budget. And despite the fact that we think that once it's obvious everyone would do it, it still has some selling to do inside of companies and the allocation of budget. I like those numbers, the roughly six to 16% of revenue that's spent on IT. I think AI is going to force that number upward more substantially if companies want to be able to compete with the capabilities of generative AI.
But there was just a ton of depth there, Pat, and some of the best presentations. Unfortunately, we can't really talk about some of the vision around quantum, but I think broadly we can say IBM showed a very impressive vision there, and I look forward to that becoming public. I think they have a research day on quantum coming up, so you'll all have to wait on that. But Pat, I thought it was a great event for analysts where they really did a good job, maybe the best job I've seen IBM do, with business value and use case, meaning it wasn't just … Especially at a research event. For me, usually after the first segment, I get glossy-eyed at some of these when they start getting too into the technical weeds.
But even just showing some of the code capabilities, just some of the stuff it can do to enable someone to take code and update it to different code using its generative capabilities, all grounded, all governed. So I'm not going to say fanboy, but I'm going to say a significant uptick in the quality of an event like this. And by the way, great stable of analysts. They kept the numbers small, they kept the executive interactions high, and look for our Six Five episode to come out, because we got the exclusive with the two L1s that were there. And so you'll be hearing what could be shared more in depth from Dario and Rob Thomas probably in the next couple of days, right?
Patrick Moorhead: Yep, absolutely. Great analysis, Dan.
Daniel Newman: Rock and roll.
Patrick Moorhead: It really is IBM's to lose here, right? They've got to continue executing, they have to continue doing. First, sales and marketing is paramount. Some of the comments I can't share that Rob made, I've loved. I mean, there is a cultural shift here at IBM, and I know everybody talks about it, but this is reality. I can feel it. I can tell you as an analyst that it is real. Arvind is just a completely different leader and leads in a different way. He's brought in an all-star team to make it happen. Dan, you spent the week in Spain and you had a lot of conversations. You went to Explore. You met, I think, with some Red Hat senior executives. What's going on here?
Daniel Newman: Yeah, I think rather than just doing the VMware Explore rundown, which I'll give quickly, I wanted to talk a little bit about a very interesting sort of partnership, following on from our IBM conversation, because there are a lot of question marks right now. I think the world sort of wonders what's going on. And by the way, the whole reason I planned to go was to be there in the post-deal era, which got sort of unwound because they didn't end up finishing the deal. So we're still out there holding. What was the word, Pat? Soon?
Patrick Moorhead: Yeah.
Daniel Newman: So soon was the public disclosure. I did spend some time with some of the Broadcom execs, spent some time with VMware execs, and had a chance to talk this week with Red Hat CEO Matt Hicks as well. And so I had some great conversations in the multi-cloud slash hybrid private AI era.
And so my takeaways from VMware Explore, Pat, were, well, one, the big elephant in the room is what was it like? Well, I honestly feel like it was a little bit of a limbo. I just feel like nobody quite knew what to expect. So there just wasn't a whole lot of, what's the right word? There's a whole lot of uncertainty about what's going to go on, Pat. It's pretty well public record at this point that offers have gone out. People are leaving, people are coming. Now, it's different in Europe, by the way. From a labor law standpoint, they could not make those offers, could not make those transitions. So the stability of the labor force in Europe, for instance, is different from the changes that were being made in the US. But some of the new GMs were kind of brought out, the people that will be the new GMs, and again, I'm not going to disclose the details, but they're starting to introduce those people.
And Hock Tan, CEO of Broadcom, actually got on stage, talked to the whole group, talked about the three big ways that VMware intends to bring value to the customers. He also did a cameo for the analyst group and took some questions. So clearly there's kind of this very mixed feeling. Is it moving? Is it not moving? And I think there's still a pretty outsized confidence on all sides. And Pat, frankly for you and me, we hope, just based on a lot of the things that have started moving forward, that this deal gets done. But going back to our Six Five Summit conversation with Hock Tan, and going back to some of the conversations that we've had at VMware's more recent events, there's probably not anything that's been more exciting about what VMware is up to than what it's trying to do with generative AI and its private AI offering.
Pat, there's a whole side of enterprise AI, especially in regulated industries, that requires a very thoughtful architecture for where compute is done, what's moved to the cloud, how workloads move in between, what's virtualized, what's Kubernetes, and can you do all these things in an ecosystem? And VMware has painted a picture for this, but let's be very clear, the picture they're painting is one of mixed, I'd say etiology, but in all seriousness it's mixed vendor. And so while you've heard a lot about VMware and Tanzu and vSphere and Cloud Foundation, what the company seems to be really leaning into, and actually Chris Wolf and Tarun Chopra of IBM came out with something together on this, is that their private AI solution for generative really seems to be an amalgamation of VMware, Red Hat and IBM watsonx.
That was probably one of the coolest things that came out of this: VMware, while it does have a multi-cloud platform in Tanzu, understands that it needs to meet the customer where it is, and the customer is on vSphere and running virtualization with VMware, and they're running multi-cloud and Kubernetes with OpenShift. Taking these two technologies and these platforms and combining them with watsonx is a powerhouse. And so the companies, rather than coming out and trying to dig in, with VMware saying basically, "You've got to use Tanzu, you've got to use vSphere, you've got to do it our way."
They basically doubled down on a partnership to build generative AI into their private AI with IBM and Red Hat, and to be very clear, Red Hat runs very autonomously from IBM for just this reason. So in the meeting I had with Matt Hicks, he was basically communicating a very high level of confidence that they're winning at the line of multi-cloud, meaning that while VMware has its legacy and its strength in Cloud Foundation, right now the partnership is Cloud Foundation plus Red Hat, and now they're partnering on top of it to deliver and enable customers to do it with watsonx.
The overall sentiment on this one was very positive though, Pat. The feeling is that for companies that need to deal with this sort of private and on-prem deployment, this is the answer, and it matches what you were saying earlier about what Rob was saying about being able to be the kind of on-prem option for enterprise generative AI, while more and more companies are making it cloud only or cloud led. That was really positive.
Patrick Moorhead: Good analysis, Dan. And I view this as a maturation of things. Markets mature, and so do thought patterns. Everything starts as winner-takes-all, and that's just not the way it ends up. None of the on-prem software folks were partnering with AWS, and AWS didn't want to partner with them. They didn't think they needed them. And then years later, once growth starts slowing and customers are like, "Listen, I know what you want to do, but I'm not in on all that. You guys need to get together. This is what we want you to do." That's where I think the VMware OpenShift stuff is coming to a reality. And I think it's good. It's good for enterprises. I think it'll be good for these companies. It does mean that they need to maintain and have some really solid APIs and handoffs between them.
But listen, VMware and Red Hat are two powerhouses out there and vital to enterprises, not only on-prem, but in the public cloud and also the hybrid cloud. I do think Red Hat needs to do a better job communicating and marketing and getting their value out, because quite frankly, VMware's value add is very straightforward. I get Red Hat as it's integrated into IBM and IBM's AI solutions, but I don't know if they pulled back on marketing or it's just a completely different thing. But it needs help and it needs changes, and I think they need to do this. I'll just leave it at that. So Dan, with that, let's go to our final topic here, and that is Synopsys embracing RISC-V IP. So Synopsys not only has a tool chain that other companies use to create their own SoCs, but they also have intellectual property that comes to bear here.
And quite frankly, if you look at, I didn't do these rankings, but they're from a company called IPnest, Synopsys is the largest foundation IP and top interface IP vendor, so think I/O, physical IP, and is second only to Arm in CPU IP. So they are a powerhouse with intellectual property. And up to this point, they had done Arm-based CPUs, but now the company is diving into RISC-V processors. And it's not all RISC-V processors. They have a nice portfolio that has targeted the markets you would expect RISC-V to be doing pretty well in, like industrial, a lot of interest in automotive, storage, networking, and there's even a lot of interest in consumer. And that's what Synopsys is hitting.
I'm not saying consumer PCs, I'm thinking more about consumer IoT, wearables, hearables and things like that. And the company brought in three levels of performance, all the way from a 32-bit embedded processor for optimal low power to a 64-bit multicore host processor with what they're saying is 25% higher performance than the HS6x cores, and you link it in with an AMBA interface. So they have a good-better-best approach to roll this out. And the markets that they're going after make sense to me. So congratulations to Synopsys and the RISC-V consortium.
Daniel Newman: Yeah, Pat, I know we're on a bit of a hard stop, but let's just say expect more from us on this one. The Synopsys story is potentially the next untold Arm story in terms of how much potential it has. There's a growing library of capabilities and they continue to innovate at a really great pace. So the work they're doing across embedded, across storage, across IoT, there's a ton of expansion. It's happening very, very quickly. There's new leadership, there's a bunch of new IP, and Pat, I think they're an exciting story. I think they're one of the kind of quiet, untold ones, and one to really keep an eye on.
Patrick Moorhead: They are. I think the entire semiconductor value chain benefits when the tools to do this and the IP are more readily available and you can take it all the way to the foundry. And we've seen companies, look at AWS, look at Apple, look at all the folks. I mean, they're using tools like this. They typically are using Synopsys and Cadence to do this. Synopsys got a first-mover advantage with Synopsys.ai, and I'm interested to see, I mean, the big swings too, like 35% efficiency when it comes to things like testing and design. But hey, thanks everybody for tuning in, episode 192. We appreciate you and we will see you next week. Check out all of our Six Five videos out there, including the ones that we did for Cloudera Evolve and, soon to be published, the ones for IBM AI and Research Day. We appreciate you. Give us feedback on social media. Dan, great to see you. Take care, everybody. Bye-bye.
Author Information
Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.
From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.
A 7x best-selling author, including his most recent book "Human/Machine," Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.
An MBA and Former Graduate Adjunct Faculty, Daniel is an Austin Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.