Dell Technologies AI Ecosystem – Six Five On the Road at Dell Technologies World

On this episode of the Six Five On the Road, hosts Patrick Moorhead and Daniel Newman are joined by Matt Baker, Dell’s Senior Vice President of AI Strategy, at Dell Technologies World 2024 for a conversation on Dell’s role and strategy within the AI ecosystem.

Their discussion covers:

  • Insights on Dell Tech World AI ecosystem news and partnerships
  • The significance of GenAI in the tech industry
  • Exploring the landscape beyond LLMs in AI models
  • Dell’s approach to AI enablement through internal deployment
  • Analysis of where AI workloads are most effectively run

Learn more at Dell.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is on the road at Dell Technologies World 2024 here in Las Vegas. We are analysts. We love Las Vegas and tech trade shows. Love Las Vegas too. It’s been an awesome event so far. And surprise, it’s all been about AI. AI infrastructure, AI PCs with the new Copilot+ PCs, AI software and AI services. Some great stuff happened up on stage today. A lot of partners. Michael was great just talking ServiceNow, Samsung, and of course NVIDIA with Jensen.

Daniel Newman: Yeah, it was a big morning, but it’s been a big event so far. And if you walk the halls, I always can get this feeling for the event. When you walk you look at the audience, you look at the attendees, you look at the energy of the employees, and Dell feels very rejuvenated. The AI pivot right now has seen a big acceleration in terms of market perception, market value, enterprise value, customer adoption, and now they’ve got a whole new set of solutions. And you can see all the things they’ve been talking to, culminating in a go-to-market strategy that we as analysts have to recognize. And so I give a lot of credit. I think it’s come a long way. And here’s the most interesting thing. We sat down with Michael Dell. He actually said something about this. It’s still pregame. It’s still the first inning. We’re on the third tee. We’re in the first half, first quarter. Whatever sports analogy you want to use, it’s early.

Patrick Moorhead: Yeah, and the other thing that Michael talked about was the power of the ecosystem. And I can’t imagine a better person to talk about the AI ecosystem related to Dell than Matt Baker. Matt, great to see you, buddy.

Matt Baker: It’s great to be here. I think we might be at batting practice.

Patrick Moorhead: Oh, I like that. I like that.

Matt Baker: I think so. This is a decade plus long ride that we’re on, so I’m really excited about it.

Patrick Moorhead: Yeah, we saw how long it took to build out the internet. I think it’s going to be even quicker. And I also think that unlike the internet that had the dark fiber and the bust, the benefits that both of our companies are seeing with enterprises and the true benefits and the paybacks they’re getting off of this are very… I’m not Babe Ruthing at this point, but it’s very compelling, right?

Matt Baker: Super compelling.

Patrick Moorhead: When you can have a company talk about how they decreased content creation costs by 50%, we’re talking about hundreds of millions of dollars for certain companies, and making customer service better.

Matt Baker: Yeah.

Daniel Newman: That could be just getting rid of one analyst.

Patrick Moorhead: I think so. Maybe your analysts, come on.

Daniel Newman: Anyway. No, I totally agree. I mean look, we are seeing the power of technology meeting business.

Matt Baker: Absolutely.

Daniel Newman: And we always said that. People don’t buy technology for technology’s sake. They buy it for business sake. This is one of those sort of shifts that you maybe only see once in your life. And you know that those of us that have come up, you’re a little older, but those that have come up over many, many years have seen these technological-

Patrick Moorhead: He’s about to call me a boomer. Just wait.

Daniel Newman: I would never do that. These technological trends, I don’t think we’ve seen one that’s been quite as powerful and quite as condensed. So you said 10 years, I think we are early. I think it’s hard to imagine where this is going, Matt. But it’s going really, really fast.

Matt Baker: I think you picked it apart really correctly, which is we have seen moments that have been this powerful. I’m not sure we’ve seen things moving at this velocity. The internet took time, and that’s the closest lived experience that I can think of that this is like. But it’s moving at a much faster pace. And I’ve heard Michael say it before: every successive wave of innovation seems to be accelerating and accelerating.

Patrick Moorhead: It’s like a shorter half-life.

Matt Baker: Totally. Well hopefully it’s not a half-life, but you get what I’m saying.

Patrick Moorhead: Well, it makes sense because we have newer technology working on newer technology and you have that compounding of that, so it makes sense. But we also have the ecosystem in place to I think enable these types of innovations as well.

Daniel Newman: And by the way, I just meant between the disruptions.

Matt Baker: Oh yeah. I was like, I don’t want us to disappear into-

Patrick Moorhead: Dark fiber, baby.

Matt Baker: Dark fiber.

Daniel Newman: I don’t know, I’m only 70% sure that that’ll be the case. Listen, partnerships have been a big focus here at Dell Technologies World. Pat pointed out on stage some of the biggest and most prolific CEOs on the planet basically coming together, bringing it left to right. How are you encapsulating Dell’s partnership strategy?

Matt Baker: Well, I think that it’s really important in the early stages of these big industry movements that you see innovation everywhere. And so you want to maximally partner with a broad ecosystem. So we’re partnering not just with the biggest, most established companies, but we’re also looking to find the smaller upstarts and help them. If you look at any AI solution today, it’s made up of scores of open source projects and small companies, in addition to being integrated by companies like ourselves or NVIDIA. Together, we’re pulling it all together.

And before we started the discussion, you had mentioned finding those patterns. And so we talked about the AI factory, within that AI factory, an instantiation of it I should say, is a whole host of smaller, larger partners that cross between ourselves, NVIDIA and others. So it’s really important to really, I think, cultivate a broad ecosystem early in these waves because you can’t discount away the great innovation of a given company. There are going to be many, many unicorns born out of this and some will stay around, some won’t. But it’s important for us to pull all of those together and help our customers not have to deal with all of that integration work. We can help share the load collectively.

Patrick Moorhead: I can imagine how challenging it is though in partnering because you could partner with an infinite number of people.

Matt Baker: You can.

Patrick Moorhead: You can go too shallow. You need to pick the partners that you can go deep with that can get you to where you and your customers need to go.

Matt Baker: Yeah. I mean, it could be unwieldy, but that’s why I think partnerships and ecosystems are a little bit different, right? Like a partnership is a one-to-one relationship. I think of an ecosystem as a dense vector of partnerships, right? It’s like we all can share the load differently and we’re pulling different elements of it together, right? So NVIDIA is working with tuning with PyTorch and other elements. And look, we’ve got this great partnership. Never would’ve thought we’d be partnering with Meta, but they’ve built something that can really power amazing innovation. And once you use that model, that’s your model. It’s yours to innovate with freely. And there are other examples like Mistral and others. But yeah, I mean, I think it’s building the ecosystem. There are a finite number of people you can have a relationship with. But when you build it into an ecosystem, it can become much more scalable because you’re sharing the load across many, many folks.

Patrick Moorhead: Yeah, Meta’s interesting. I mean 70 billion parameters, Llama and then Llama 2. There was a lot of initial talk about these gigantic models, but it seems like there’s been a whole lot of other talk about what I like to call slims; other people call them SLMs. But even vertical models that address areas like healthcare, the legal industry, and so on. What are you seeing out there?

Matt Baker: Well, I think we’re starting to see the world move away from the “My model’s bigger than your model.” That’s just not really serving anyone. In the consumer context, if I’m trying to do a project on Mesopotamia, awesome. But if I’m trying to put an AI system to work, it needs to be infused with my knowledge, my data.

Patrick Moorhead: Wait, are you telling me you don’t want a model that’s trained on Reddit doing your healthcare chatbot? What could go wrong?

Matt Baker: Precisely.

Patrick Moorhead: What could go wrong, Matt?

Matt Baker: What could possibly go wrong? No, but I think it’s important not just to make light of it. But if you have a model that’s infused with a bunch of superfluous knowledge, it’s just burning electrons processing through that. Slim the model way down, combine it with your data. And this is where again, you talked about what are the patterns. The pattern of the last year and probably going forward for some time is augmented generation of some type. Today, RAG is dominant. But we see agent-augmented generation coming down the road.

So those use LLMs in a very specific way. You combine knowledge using semantic search, graph search, other techniques to retrieve information, package it up into a prompt and dispatch it to an LLM saying, “Summarize this, or turn it into stepwise instructions or whatever.” So the large language model is working with other AI types, which we might, I hate to say traditional AI, you get what I’m saying? But every great enterprise use case we see today is an amalgam of traditional… I need to come up with something else, but AI in the before times. Before November.
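
The augmented-generation pattern Baker describes (retrieve knowledge, package it into a prompt, dispatch to an LLM for summarization) can be sketched roughly as below. Every function name here is a hypothetical placeholder rather than a Dell or NVIDIA API, and the naive word-overlap scoring stands in for real semantic or graph search:

```python
# Minimal sketch of the retrieval-augmented generation (RAG) pattern:
# retrieve relevant documents, package them into a prompt, and dispatch
# that prompt to an LLM. All names are illustrative placeholders; a real
# deployment would use a vector database and an actual model endpoint.

def retrieve(query: str, corpus: list[str], top_k: int = 2) -> list[str]:
    """Score documents by naive word overlap with the query (a stand-in
    for semantic or graph search) and return the best matches."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Package retrieved knowledge into a single prompt for the LLM."""
    context = "\n".join(f"- {d}" for d in docs)
    return (
        "Using only the context below, answer the question.\n"
        f"Context:\n{context}\nQuestion: {query}"
    )

def dispatch_to_llm(prompt: str) -> str:
    """Stub for the final LLM call ('summarize this, or turn it into
    stepwise instructions')."""
    return f"[model response to {len(prompt)} chars of prompt]"

if __name__ == "__main__":
    corpus = [
        "Chest X-ray summaries reduce cognitive load on radiologists.",
        "Supply chains can be simulated as digital twins.",
        "Smaller purpose-built models cut inference cost and power.",
    ]
    query = "Why use smaller models for inference?"
    docs = retrieve(query, corpus)
    print(dispatch_to_llm(build_prompt(query, docs)))
```

The structure also reflects Baker's later point that most of the work in these workflows is the retrieval pipeline, with the LLM call only the final step.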

Patrick Moorhead: Machine learning.

Matt Baker: Yeah, machine learning.

Patrick Moorhead: Or analytical-based AI.

Matt Baker: Yeah, exactly. So combining that together, that’s what we are seeing. And we’re seeing a great amount of success with models in the low billions or even below. And you mentioned healthcare. We’ve had a great partnership with Dr. Mozzi at Northwestern Medicine, and he’s developed quite a modest size model. It’s about I think less than 300 million parameters. Million, not billion. 300 million. And he’s using that to analyze and summarize the results of chest X-rays, and he’s reducing the cognitive load on radiologists.

I mean think about it, after the pandemic doctors have been under fire and it’s not getting any easier on the other side. So taking any of the cognitive load down is great for doctors. And think about it, going to the hospital is not a walk in the park. In fact, it’s the opposite. It’s super stressful and scary. And if I can get you your results faster, it’s a better patient outcome. And it’s a hospital, the cost of operating the hospital goes down because it’s more efficient. That is a great example of a built-to-purpose model that has been fine-tuned on very, very high quality data that’s leading to great results and is again, super modestly sized.

Daniel Newman: Yeah, we had the chance to talk to Feinberg at Northwestern this morning-

Matt Baker: Oh, great.

Daniel Newman: …along with John Siegel. Yeah, we had a nice conversation over in the broadcast booth, and it was very interesting to hear these real world. And by the way, one thing you didn’t even mention on top of all that by going to smaller models is power. And there’s a huge conflicting situation that’s growing because as AI becomes more performant, it’s also creating tons of stress on the grid.

Matt Baker: Yeah. So it’s sort of like, why go there in a C-5 Galaxy monster plane when you could take a moped? Take the moped. It’s way more sustainable.

Daniel Newman: Yeah. It goes back to when we talked about cloud traditionally, right? Workload, right? Where some of this is happening again.

Matt Baker: I’m glad you bring that up because I think people have a real world counter case of, “Well was cloud easier, cheaper?” It was maybe faster back in the day, but now not so much. So with that fresh in your mind, you’re like, “Why am I paying on a per-token basis when actually in those RAG workflows, 90 plus percent of the work is the pipeline?” The inferencing at the end with the LLM, small part of it. So it’s like if I built all this, why don’t I just run that? Trivial.

Daniel Newman: Right. We’ll give you a chance to actually talk about where that runs in a minute. Because we have a few minutes left, there are a couple of questions I know that I want to ask. Maybe another one. But I’ve been listening to your customer zero stories all day. It seems to me like one of the things Dell’s really talking about here at Dell Tech World is your own AI story. Talk a little bit more about that, about how you’re really customer zero and how much of a focus that is.

Matt Baker: Well, and that’s the great part about this new job of mine is that I’m not just working on what we’re putting out into the market to help our customers, but I’m helping drive the internal programs. I mean at some point, no surface will be untouched. But initially we focused on four areas around driving sales productivity, driving developer productivity, improving customer experience with services by adding more. And then you mentioned it, content. Content’s an obvious one. So those were the first four areas. We also needed a place to run it. So we have our own internal platform that we run it on. Since then, we’ve grown it out to add our supply chain. And interestingly in supply chain, it’s not a lot of generative use cases. It’s actually a lot of machine learning at scale.

So think of it as lots of forecasts and simulations all coming together to, in essence, create a digital twin of our supply chain so that we can push in a weather event here or push in a supply constraint there to understand how the whole system reacts and better optimize the whole versus optimizing the points. So it’s across those areas, supply chain, finance and online. I’m really excited about the potential to rethink how customers interact with their commerce experience. And frankly, you heard on stage about the AI PC and the role of the PC. I’m excited about how we’re going to just completely change the way that we interact with the digital world. Imagine machine vision, speech recognition, generative AI, clickety clackety. It’s going to lead to a whole different world.

Daniel Newman: You got little kids, Matt? Because- Click, Clack, Moo.

Matt Baker: I don’t have any little kids. Oh, clickety clackety?

Daniel Newman: Yeah, clickety clackety. That was a book I used to read my kids.

Matt Baker: Oh, there you go.

Patrick Moorhead: My kids are all up and out. I can’t relate. But hey, real quick, the speed round question.

Matt Baker: Speed round question.

Patrick Moorhead: GenX, I think you might be. Are you a GenX?

Matt Baker: I’m a GenX. Of course I am.

Patrick Moorhead: There we go, buddy.

Daniel Newman: Millennial.

Matt Baker: Millennial?

Daniel Newman: I lost my hair when I was like nine.

Matt Baker: I did too.

Patrick Moorhead: I love this.

Matt Baker: It was the stress, man.

Daniel Newman: Get back on topic.

Patrick Moorhead: Matt, speed round question here. I mean listen, the public cloud is 15 years old. 75% of the data is still on prem or on the enterprise edge.

Matt Baker: I think it’s more than that.

Patrick Moorhead: This notion of private clouds, these stacks are about five years old. We still see public clouds. Revenue is just going crazy, and there are AI workloads in this. Where is the best place to run an AI workload?

Matt Baker: Well, I think we’ve stated very clearly that the world’s hybrid. I’m not going to say that the public cloud doesn’t have a role to play, but I do think that the conditions are different. The computational intensity is such that executing that same cloud model and making it profitable by oversubscribing it gets real difficult when you’re consuming lots of computers. I just think that the logic is the data’s on-premises. This is all fueled by data, and data has gravity. It’s hard to move. It’s expensive to move. Why not move the compute side of it, which is much lighter weight? Think about it: even a 70 billion parameter model is still just a few tens of gigs, right? So it’s easier to move the model to the data and execute it. And all of the tools to do it are there. I mean, people are trying to separate the concepts of cloud and AI as if they’re something fundamentally different, but they’re not.

Do you need a bare-metal Kubernetes cloud to do it? Yeah, that’s probably the most effective way of doing it. But all of that orchestration to deploy a generative AI or AI workload exists. This is another class of workloads that has a different makeup of compute storage, networking than what came before it. And I think the conditions based on what I said and the intensity of the computation is going to follow the data, which means a lot of it’s going to occur on prem. I also think that the promise of the cloud, that it was going to be significantly more cost-effective, that didn’t play out. And so people are like, “Well, you tried to sell me on that before.” So let me find a balance.

And Dan, you said it before. It’s all about the right workload in the right place. If there’s a reason for it to be in a certain place, edge, you’re going to do your inferencing where the activity’s happening because you can’t do real-time operations over vast distance, right? It’s like there is this thing called the speed of light and it’s fast, but it’s not as fast as you think.
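
As a back-of-envelope check on Baker's "few tens of gigs" figure for a 70-billion-parameter model: on-disk size is roughly parameter count times bytes per parameter. The precisions below are illustrative assumptions, not anything Baker specifies, but they show the claim holds once weights are quantized:

```python
# Rough model-size arithmetic: parameters x bytes per parameter.
# A 70B-parameter model is ~140 GB at fp16, but "a few tens of gigs"
# once quantized to 8-bit or 4-bit weights.
PARAMS = 70e9  # 70 billion parameters

def model_size_gb(bytes_per_param: float, params: float = PARAMS) -> float:
    """Approximate on-disk size in gigabytes."""
    return params * bytes_per_param / 1e9

for name, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{name}: ~{model_size_gb(bpp):.0f} GB")
```

Either way, the model is several orders of magnitude smaller than the enterprise data stores it would otherwise have to be brought to, which is the "data has gravity" argument in numbers.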

Daniel Newman: Yeah. Well listen, I really appreciate you kind of breaking that down, Matt and just that you spent the time here with us. We know how busy Dell Technologies World can be for any executive and the mandate you have now. I expect to see you running around maybe a little gassed by the end of the week, but.

Matt Baker: I’m gassed already. It’s day one.

Daniel Newman: I won’t tell anybody. Just everybody out there.

Patrick Moorhead: It’s our little secret, yeah.

Daniel Newman: Let’s have you back soon.

Matt Baker: Yeah, I’d love it. It’s awesome.

Daniel Newman: All right. All right, thanks everyone for tuning in to this episode of The Six Five. We are here in Las Vegas at Dell Technologies World 2024. Join us for all of our coverage. Subscribe and become part of our community. We would appreciate that very much. But for this episode, we’ve got to say goodbye. See you all soon.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A seven-time best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
