
Dell’s Density Journey – Six Five On the Road

On this episode of Six Five On the Road, hosts Keith Townsend and Lisa Martin are joined by Dell Technologies’ Matt Baker, SVP of AI Enablement, for a conversation on Dell’s Density Journey, highlighting the company’s collaboration with Solidigm. During this thought-provoking discussion, Matt shares insights into how Dell Technologies is navigating the intricate landscape of AI enablement, its partnership with Solidigm, and the impact this journey is having on technology density and innovation.

Their discussion covers:

  • The strategic partnership between Dell Technologies and Solidigm and its significance.
  • Innovations in technology density and how it is reshaping the AI landscape.
  • Challenges and opportunities in AI enablement from an industry perspective.
  • Insights into future trends in technology density and AI capabilities.
  • Practical applications of these innovations in real-world scenarios.

Learn more at Dell Technologies. Also, discover more about this collaboration at Solidigm.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Lisa Martin: Hey everyone. Welcome back to Six Five On The Road, sponsored by Solidigm. I’m Lisa Martin here with one of my favorite co-hosts, Keith Townsend. How are you? Looking dapper, right?

Keith Townsend: I am doing well. I’m with one of my favorite people in the industry, Matt. We should be doing this in like an Airstream, a Teardrop, something. That was our brand during-

Matt Baker: That was our brand.

Keith Townsend: Yeah.

Matt Baker: The pandemic brand.

Keith Townsend: The pandemic brand.

Lisa Martin: Matt Baker joins us once again. SVP, AI Enablement at Dell Technologies. Welcome back.

Matt Baker: Hey, it’s great to be here. Thanks for having me.

Lisa Martin: So we’ve heard a lot about Dell’s AI ambitions and what it’s doing there: the AI PCs, what’s happening in storage and networking, the customers it’s helping to navigate this journey. But we want to take a peek into Dell’s internal journey, drinking your own champagne.

Matt Baker: Champagne.

Lisa Martin: What can you share there?

Matt Baker: There you go.

Lisa Martin: Yeah.

Matt Baker: We can’t say eat your own dog food anymore.

Lisa Martin: You can, but the champagne reference, I just feel it’s a little classier.

Keith Townsend: Lisa, I just feel like eating your own dog food would be more on brand for you.

Lisa Martin: You would think so. I’m a dog mom. Yeah, but I like champagne. Well, I like it all.

Matt Baker: Yeah, you can do both. Anyhow, so yes, the cool thing about this job that I stepped into last year is that I’m not just working on what we’re offering to our customers. That’s a big part of it, but I’m also driving our internal programs. And those internal programs are tied to, in essence, a broad modernization effort for the company overall. And that modernization effort is in service of solidifying the company for the next 40 years, if you will. So we’re not doing AI for AI’s sake, but certainly AI influenced what we believed we could do to modernize the company for the next 40 years.

And so we’ve been working tirelessly over the last, in essence, almost a year now, to identify the key areas that we want to attack first. We focused on sales, in essence optimizing the day in the life of a SalesMaker and speeding up CPQ, the configure-price-quote process. We are also looking at services. As you can imagine, we have tons of products, and unfortunately products do have challenges and problems sometimes that you have to resolve. People call in. We’ve created solutions that help our agents get to an answer much more quickly by reasoning over vast amounts of data about issues and challenges that our products might have. So it improves the day in the life of an agent, but it also makes our customers much happier because we solve their problems much more quickly.

Content, you all are in the content business. AI is an amazing tool to automate the creation of content and really turbocharge that process. And then last, we have a ton of software developers, and the opportunity to improve the productivity of software developers is huge. Those were the four initial areas. Obviously, all of that has to run somewhere, so we have had a fifth area that sat below all of those, which is to develop the internal platform on which we run all of our AI and generative AI workloads. That was born out of another capability that we call DSX, or the Data Science Experience at Dell, which is a part of Dell Digital, our IT organization.

It’s been great working across the business. And now we’ve added additional areas with supply chain, digital twin development, finance, just doing forecasting at scale. And then last one I’m really excited about is our online efforts. So obviously dell.com operates at an insane scale. We have billions of visits every year, and that represents therefore one of the largest potential workloads that we will develop. But I’m more excited not just by the scope and scale of it, but by reimagining the commerce experience for our customers from an experience that is driven by the internet paradigm of the last few decades, which is point, click, point, click, click, click, tap, tap, tap.

We want to move to one that is more of a choose-your-own-adventure journey that you can interact with more seamlessly. And so those are the rough buckets that we’re operating in. And the cool thing about this job is I see both sides. I’m influencing what we’re doing in CSG and ISG, but that’s informed by what we’re learning in real time with our own efforts. So it becomes this fantastic flywheel of innovation, and our goal is to accelerate the adoption of AI for everyone. So all of those learnings we just freely send out to our customers and say, hey, this is the way you should probably proceed. Why skin your knee? We already did. Don’t make the same mistakes that we did.

Keith Townsend: So Matt, you started out with a lot of areas that seem centered around internal productivity. And the thing that most people are excited about is that customer-facing experience. Is that by design? When we hear about a trillion tokens and all the investment in AI and GPUs, CPUs, et cetera, it seems like the justification is that new market, but that’s not where you folks started your journey. Why not start there?

Matt Baker: No, no. We started the journey understanding that this is a truly transformational period in the industry. And people are like, what is this like? Is it like virtualization? Is it like cloud? And I’m like, no, no, no. This is like the internet. This is fundamental change. And if we want to be successful in the new market, we need to be experts. And what better way to become experts than to, as you say, drink your own champagne, right? Do it ourselves. So our focus has been on that internal innovation because, look, this is new, daunting to some people. All the tokens, what’s a token? It’s a fraction of a word. It’s like, oh, come on. You get what I’m saying?

Keith Townsend: Can’t we just say it’s a fraction of word?

Matt Baker: Can’t we just say it’s a fraction of a word? And it’s not even that; it’s a numerical representation of a fraction of a word, but I digress. We started this way because we believe that those learnings will pay dividends for us to then inform the broader market. And I’m very proud of the fact that it feels like we are definitely pioneering a lot of this. The notion of doing on-prem with open source models is not necessarily the norm; from the start, the norm was to call the big API in the sky, a model hosted elsewhere. And while that completely has a role, we’re not denying it, there is an opportunity to really innovate and become a true practitioner. But in order to help people understand how to do that, you yourself have to become an expert practitioner.

And so that’s why we worked it this way. I mean, ultimately we are benefiting from a big market move. And that’s not to discount the fantastic growth we’re seeing with our accelerated servers, AI servers if you want to call them that. But now we have new opportunities with the AI PC, and you’ve seen all of the announcements this week of the infrastructure and broader solutions landscape that we’re bringing to market. Again, they’re informed by all of the work that we’ve done internally.

Lisa Martin: One of the things that always impresses me is the history behind organizations. And Dell just celebrated its fortieth birthday with Michael at the helm since he was 19. And it’s always interesting to me to see storied companies and how they transform digitally and culturally. But what I really got the sense of this week is everyone really being on the same page.

Matt Baker: For sure.

Lisa Martin: Culturally, to help Dell transform, to your point, to inform what you need to be doing for customers and how they can learn from Dell. That was a theme that I thought was woven very strongly throughout the conference.

Matt Baker: Well, and I think that’s informed by the fact that we’re doing this at the same time. So it’s not just like we have internal operations and then we have products. Literally, we are going through a major transformation ourselves because we believe in this technology and the power that it represents, and therefore everyone is naturally on the same page. The thing that I get worried about, and I’ve heard this from press and analysts and customers, is people saying, “Hey, I need you to help me do AI.” And it’s like… Remember big data? I need to do Hadoop. It’s like, well, why?

So it’s like, slow down, I’m happy to sell you stuff, that’s what we do. But I’m more interested in helping you understand how you’re going to use the technology. But that should come at the end. It’s people and process that you need to think your way through. So we also have an opportunity to provide services to do that. And in fact, the interesting thing is we used our own services organization to help us work through the prioritization of our own internal use cases. So again, another example of drinking our own champagne.

Lisa Martin: Yeah, I like it.

Keith Townsend: So Matt, we’re infrastructure people. We love infrastructure. And if we use your example of the internet, we can even go further back to the invention of the automobile. How did people first use the automobile? The same way that they used a horse. How did e-commerce initially come about? People used it the same way they did brick and mortar. It took some time to figure out. I get the sense we don’t have that same cycle of time. I don’t think I’ve ever experienced a technology change such as AI. One of the biggest problems that we’re experiencing around AI has been the lack of power and cooling to run these systems. Talk to me about the importance of the individual components. You folks are, honestly, several months ahead of most Fortune 100s on the AI journey. What have been those learnings?

Matt Baker: Well, it’s interesting, there’s a lot to unpack there. And I think you’re right. You saw Jeff Clarke’s keynote. He actually hearkened back to the beginnings of the Industrial Revolution, because this is so fundamental. And by so fundamental, I mean, when you had the automobile, we had to build roads, right? We’re in a place where we need to build new infrastructure because the today times are not the before times, right? It’s very different. And so you saw at the keynote yesterday a lot of focus on liquid cooling and other technologies to help deal with the thermal load associated with AI. The thing that I find really interesting is that this is truly a marked change in the nature of an application, where the compute intensity is unbalanced versus where it was before.

So we had years of balance where we understood how much storage, networking, and compute was required for any given application. But this application is so compute intensive that we are stuffing an insane amount of computing power into a very small place. And what that leads to is energy and thermal density, and how do you deal with that? You have to extract all of that heat out of the systems. And people asked me yesterday, is this really the time when we’re finally going to see liquid cooling become not exotic, but more mainstream? And I think for people who are developing foundation models, which is not a large number of people, but it is a big, big part of the industry, certainly we’re going to see liquid cooling technologies there.

But you also saw that we’re an endlessly pragmatic company. If you’re doing inferencing, the density of your compute might not be as high. And so you saw that the same model family has a variant that is liquid cooled and one that is air cooled. So it is different. It’s fundamentally different. And the components are different. Networking: if you are fine-tuning models or training models, there’s a whole new fabric that needs to be built that interconnects the GPUs. When you’re operating an LLM for inferencing, we found that one of the best ways to do it is you don’t run one instance of a model. We run hundreds of instances of models on individual GPUs, so you don’t need to inter-network them.

But if you’re fine-tuning a model or developing a model from the ground up, you have to interconnect all of those GPUs. And that requires a lot of networking expertise that is being developed in real time. So how do I create a massive high-bandwidth, low-latency network to feed the beast, if you will, to feed the model all of the data? Which brings us to storage: if you’re developing, you hear about how many tokens, how many parameters, all of this stuff that’s a representation of storage, right? So there is a need for new storage architectures to feed, through the network, to the GPU, all of the data that is being integrated into a model, either in a pre-training environment or in a fine-tuning environment. So every element is changing, and there’s a new approach to how you do it. I got some great questions the other day about whether we’re ready for this.

And I reminded folks that we were a big player in supplying infrastructure to the folks who have now become what we call hyperscalers. And we learned lessons there about how to build data centers. We were a big innovator in the concept of modular data centers. People thought of them as shipping containers. They were not shipping containers. They were purpose-built modules. So that is another opportunity where we are going to dust off that playbook for certain classes of customers, where we’re literally going to help them develop from soil to system, from the ground up, designing new data centers and data center types that can accommodate the intensity of the computing that’s occurring with this technology.

Keith Townsend: Yeah, I love Jeff Clarke’s TLDR, which was that it reminds you of the human body: the GPU and CPU are the brain, the network is the heart, and storage is the lungs. I think that’s a really great shortcut to think of it.

Matt Baker: Oh, for sure.

Lisa Martin: I thought that was too. Last question for you. You said you’ve been in this role for about a year?

Matt Baker: I have, yes.

Lisa Martin: You sound like you have one of the coolest jobs at Dell, by the way.

Matt Baker: I think I have the coolest job.

Lisa Martin: You just might. What is on the horizon for you as you see the prioritization, the accomplishments, the impact made? What does next 12 months look like from your lens?

Matt Baker: Well, I think what we’re looking to do is obviously scale up our internal ambitions and get those where they need to be. And as I said to somebody the other day, there’s not a surface area within the company that AI will not touch. So there’s so much opportunity to innovate with this technology across the board. It’s not going to just be those eight areas that I highlighted. It’s going to be everything.

Lisa Martin: Pervasive.

Matt Baker: So internally that’s exciting. And setting the course and just watching it go. I hope not to be as busy a year from now as I am today, but I have a feeling I will be. But externally, I mean, the industry’s learning in real time. What became the AI factory started with a conversation between myself and a couple of my colleagues and the NVIDIA team. And we said to ourselves, how do we get technology into the hands of our customers that takes the guesswork out of deploying it? And what we discovered over the months leading up to roughly October was that almost every enterprise use case was a RAG implementation, a retrieval-augmented generation approach to this technology.

And so the thought was, and this is where the marketing people are sort of like, Matt, stick to your day job, it’s just like a box of rags. They’re like… But we learned more through time. And ultimately the AI factory is our attempt at creating the easy button for enterprises to adopt this technology. So over the next 12 months, I think we’re seeing RAG move from sort of your basic vector-search-oriented stuff to combinations of graph search and vector search. And then I’m excited about what we’re seeing with agent-augmented generation. So through time, I’m just excited to see how, not the technology, but the patterns of usage develop, and how we capture those into solutions that just make it super simple, the easy button to innovate with AI.

Lisa Martin: I’ll take that easy button and it’ll bring me a glass of that Dell Champagne, and I’ll have a sip.

Matt Baker: That sounds great. We’ll toast.

Lisa Martin: Exactly. Matt, keep innovating. Great work. Thank you for joining Keith and me on the program today.

Matt Baker: Sure thing.

Lisa Martin: Sharing the impact that internal journey is making across the globe. We appreciate your insights.

Matt Baker: Yeah, well, it’s fun and it’s one of those things when you can share with our customers, they are so grateful.

Lisa Martin: Absolutely. It’ll give some confidence.

Matt Baker: It’s great. It does. It does.

Lisa Martin: And that’s currency these days.

Matt Baker: It sure is.

Lisa Martin: Awesome. Matt, thank you.

Matt Baker: Sure. Thank you.

Lisa Martin: For Matt Baker and Keith Townsend, I’m Lisa Martin. Thank you for tuning in to Six Five On the Road, sponsored by Solidigm.

Author Information

Keith Townsend is a technology management consultant with more than 20 years of related experience in designing, implementing, and managing data center technologies. His areas of expertise include virtualization, networking, and storage solutions for Fortune 500 organizations. He holds a BA in computing and an MS in information technology from DePaul University. He is the President of the CTO Advisor, part of The Futurum Group.

Lisa Martin is a Silicon Valley-based technology correspondent who has been covering technologies like enterprise iPaaS, integration, automation, infrastructure, cloud, storage, and more for nearly 20 years. She has interviewed nearly 1,000 tech executives, like Michael Dell and Pat Gelsinger, on camera over the years.
