Analysts: Alex Smith, Olivier Blanchard, Krista Case
Publication Date: May 28, 2025
What is Covered in this Article:
- Dell’s advancements and partnerships in the AI space, including the Dell AI Factory and collaborations with companies like NVIDIA, Google, and Meta.
- The discussion of AI’s impact on infrastructure and data centers, emphasizing the resurgence of on-premise computing and the development of new technologies like liquid cooling.
- The introduction of AI-enabled PCs and devices, including Dell’s new Pro Max line, designed for both general productivity and specialized AI model training at the edge.
The Event: Another Dell Tech World conference in Las Vegas welcomed thousands of customers, partners, and industry enthusiasts. The event is Dell's main stage for showcasing innovations, announcements, and newly formed partnerships. This year's announcements spanned the Dell AI Factory, launches across the infrastructure portfolio, new services offerings, device updates, and a continued expansion of Dell's relationship with NVIDIA.
Dell AI Factory has been a major push since its initial launch, with Dell now having around 3,000 customers who have deployed the solution. Advancements were made around power and energy efficiency, expansion of ecosystem partnerships, new professional services, and new offerings across the software stack. Dell can now directly offer the NVIDIA AI Enterprise Suite. Dell is one of the only infrastructure companies that can OEM and resell NVIDIA software.
Other AI-related partnerships were either announced or expanded upon, including those with Cohere, Glean, Google, Meta, Mistral AI, and Red Hat. This includes Gemini on Google Distributed Cloud for Dell on-premise environments, and Dell and Glean On-Premises Solution to help improve enterprise AI search.
On the device side, Dell claims its new Pro Max Plus laptop is the first mobile workstation with an enterprise-grade discrete NPU. The goal is to enable secure on-device inference at the edge, even for large 109-billion-parameter AI models.
Dell Tech World 2025 – Is It a Good Time To Be in Hardware Again?
Analyst Take: Dell has undergone various iterations in its long history. Evolution has been constant. It has been both a public company and a private one. It has grown organically as well as through acquisitions. In the years following its acquisition of EMC, it became the controlling owner of a stable of software companies, including Pivotal, RSA, Virtustream, and, most critically, VMware. Even from a go-to-market standpoint, Dell was once the poster child for the direct model but has evolved into a company where around 50% of its business flows through channel partners.
But throughout its evolution, what has remained consistent is its dedication to its hardware portfolio, where it continues to innovate and compete on the breadth of its global supply chain. Those tenets were very much on display at the latest Dell Tech World, which occurred during a very busy event week for the technology industry (e.g., Computex, Google I/O, Microsoft Build, SAP Sapphire).
AI Galvanizes the Infrastructure Opportunities
AI is a central theme that touches all parts of the technology stack, and especially the underlying infrastructure and devices needed to run large language models (LLMs) and agentic workflows. Dell showcased the breadth of its AI portfolio, from device to data center. It also showcased new and expanding relationships across the silicon landscape as well as in the software layer, serving as another important reminder that relevance in today’s technology landscape is not just about the portfolio in your catalog, but the strength of the partnerships that you develop.
AI has galvanized the infrastructure industry to discuss the benefits of on-premise computing again, much more so than in recent years. The arguments are the same, just magnified for the AI use case. Do you want to send your valuable enterprise data to the cloud, or bring AI to that data? Do you want your data to be used to train public LLMs? Is the cloud affordable at the scale that some of these projects require? AI Factories are the proposed solutions for enterprises wanting to deploy AI projects – Dell has amassed 3,000 customers to date.
It is not just the enterprise on-premise opportunity that Dell is targeting, but also the next generation of “cloud” providers looking to establish themselves in this next generation of compute. CoreWeave made an on-stage appearance on the back of its recent IPO and surge in revenue growth, and Dell is clearly happy to highlight its position as a core technology provider and partner to the company. The appearance underscored the sheer demand for infrastructure across traditional hyperscalers, new compute providers, enterprises, and, increasingly, sovereign initiatives such as those seen in Saudi Arabia over the past few weeks.
AI is redefining the data center. Next-generation servers are allowing for a significant reduction in the footprint of traditional workloads to make space for the much more “hungry” AI workloads. Liquid cooling is becoming a critical technology and may even become mainstream as enterprises push their AI needs further. But given the complexities of deploying liquid cooling solutions, new air-cooling solutions were also unveiled as data centers need these new technologies to support their AI deployments.
Part of the appeal of on-premise going forward has to be the availability of many of the software solutions that are typically found in the cloud. This is where key partnerships come into play. Consider some of the announcements from cloud and SaaS companies. Dell will collaborate with Glean to deliver the first on-premises deployment architecture for Glean’s Work AI platform. Likewise, Dell will enable the first on-premises deployment of Cohere North. Google Gemini and Google Distributed Cloud will be available on Dell PowerEdge XE9680 and XE9780 servers. Meta’s Llama Stack will feature within Dell AI Solutions.
Of course, one of the most important partners to Dell right now is NVIDIA, and its relationship there continued to expand. Critically, this extends to the software part of NVIDIA’s portfolio, putting Dell in a unique position compared with some of its competitors. NVIDIA is clearly keen to diversify its data center business, and infrastructure partners like Dell will play a critical role in achieving that. Expanding the relationship to software only serves to help NVIDIA extend its moat as more customers develop applications and workflows using NVIDIA technologies. Dell’s relationship with NVIDIA now spans compute, storage, networking, software, and devices. On top of that, Dell also offers Dell Managed Services for the Dell AI Factory with NVIDIA.
It was also fascinating to understand how Dell itself is leveraging AI. It has been very purposeful, establishing a centralized AI office and approving only specific projects and use cases that align with core processes and demonstrate clear ROI. Dell has not allowed rampant experimentation amongst its employee base, and it has been especially careful not to let Dell data land in public LLMs. But we saw examples of AI usage within the partner organization, where it is now driving AI-assisted selling. One of the most significant use cases was within its Global Services organization. By integrating data from various sources (case history, knowledge base, product telemetry, parts dispatch, repair data), Dell created a system where service team members have AI assistance to resolve issues more effectively. This AI implementation led to agents closing cases faster, with increased resolution rates, reduced parts dispatch rates, and higher customer satisfaction. The system was deployed on-premises, leveraging existing data and infrastructure, with a rapid return on investment. Dell, like many technology companies, is eating its own dog food with respect to AI.
Cybersecurity’s Role in Dell’s AI Vision Must Move to the Forefront
Enabling real-time AI processing at the edge for industries such as healthcare, manufacturing, and retail creates massive opportunities and expands the potential attack surface. Inference at the edge demands more than low latency and compute density; it requires robust, distributed security built for highly dynamic environments.
In fact, in Futurum’s most recent CIO Decision Maker survey, Cybersecurity (90%) led in IT investment priorities, edging out driving digital transformation (89%) and surpassing adopting emerging technologies (80%).
Security was visibly acknowledged as foundational to the Dell AI Factory architecture on the Dell Tech World mainstage. However, from a messaging standpoint, it remains secondary in emphasis and largely coalesces around specific products. For example, Dell Tech World 2025 saw Index Engines CyberSense-fueled ransomware detection capabilities, previously exclusive within the Dell portfolio to PowerProtect Cyber Vault environments, extended to PowerStore. Futurum will be looking for Dell to construct a more cohesive, cross-portfolio narrative around cybersecurity and cyber-resilience, such as the value of ransomware detection across production and protection environments, in 2H25 and beyond.
Leading with clear business outcomes, including reduced risk, regulatory assurance, and operational continuity, will be foundational to successful positioning around cybersecurity and cyber-resilience. The Dell sales organization must elevate the conversation beyond “speeds and feeds” to engage line-of-business stakeholders and security leaders. Dell must also invest in enabling channel partners to articulate outcome-driven value and navigate increasingly complex AI and security buying centers.
This is especially true as organizations navigate the new world of securely integrating AI as a component of their business workflows and addressing new security threats that emerge due to AI, with vulnerabilities spanning AI models, agents, and data pipelines. A more services-supported approach will also be required. At the show, Dell launched its new AI Security and Resilience Services to help organizations identify AI-specific threats, build resilient environments, and operationalize security across the AI lifecycle.
In today’s environment, where AI adoption is accelerating and cyber incidents are pervasive, security isn’t just foundational; it is a critical differentiator. Going beyond risk mitigation to meaningful business value will materially support the success of Dell’s Modern Data Center initiatives.
Momentum for AI-enabled PCs and Touchpoints
On the devices side of Dell’s business, three distinct themes were highlighted during the event: The role that edge devices will increasingly play in Dell’s hybrid AI ecosystem strategy, training AI models at the edge thanks to a new class of powerful enterprise-grade AI PCs, and the way that displays and other peripherals help modernize the workspace for the era of AI-enabled productivity.
Cloud computing has revolutionized both the economics and efficiency of IT by simultaneously democratizing access to compute power and enabling businesses, regardless of size, to scale their compute capabilities as needed. And yet, one operational hurdle that hasn’t entirely been solved by Cloud solutions is that the vast majority of the data that enterprises and SMBs (small-to-medium-sized businesses) depend on exists at the edge. Despite the convenience of Cloud computing, the fact that data has to be moved from the edge to the cloud to be processed, and then back to the edge to be consumed, is burdensome. This operational model monopolizes resources, eats up bandwidth, drives up costs, and creates lag between data inputs and outputs.
Dell’s overall approach to solving the problem is simple: Rather than push all of that data to processing centers, which can be inefficient and costly, could the processing itself be moved closer to the data instead? With the advent of AI-enabled devices like AI PCs, Dell sees an opportunity to do just that. While moving all processing to the edge remains impractical, being strategic about moving some of it closer to the edge to minimize latency, security threats, and costs while maximizing productivity and operational efficiency makes sense.
This is essentially the strategy behind Dell’s Hybrid AI ecosystem play: Delivering distributed, federated compute capabilities that combine data center solutions, on-prem solutions, and endpoint solutions, with seamless integration and orchestration between all layers to maximize their ROI. As it did at last year’s Dell Tech World event, Dell provided not only a vision for what IT departments will look like in 5 years’ time, but highlighted its full complement of software, services, and hardware solutions that constitute that vision’s real-world building blocks, including AI PCs.
At first glance, NPU (Neural Processing Unit)-equipped AI PCs are already far more power-efficient than their predecessors, delivering all-day and, in some cases, multi-day battery life. They also bring significantly improved levels of performance to the PC segment with new generations of silicon. But more broadly, they also promise to enable a new generation of agentic features that will be capable of operating with or without cloud services.
Examples of agentic-enabled tasks range from asking an agent to create presentations, summarize notes, and sift through email thread clutter to drafting reports, analyzing data, and editing videos. Tasks that traditionally took a human worker hours or days to complete could be shortened to minutes or hours, respectively, and without necessarily relying on network connectivity: With AI-enabled PCs capable of handling agentic workloads on-device, interruptions in connectivity to cloud services don’t have to impact productivity or delivery deadlines. Agentic workloads that can be managed directly on-device are also more secure and tend to have lower latency.
To be fair, the on-device agentic piece of that equation has been disappointingly slow to catch up to the hardware ready to enable it. Having said that, enterprises have been quick to see the potential and are already investing in AI PCs to future-proof their workforces for AI. Dell touched on this consistently throughout the event, connecting its intelligent edge business to its overall Hybrid AI strategy.
Moving beyond AI PCs as the natural evolution of PCs and integral pieces of a broad hybrid AI ecosystem puzzle, Dell also introduced enterprise-grade AI PCs designed to enable developers to build, train, and test AI models locally: Dell’s new NVIDIA-powered Pro Max GB10 and GB300 aim to bring server-grade AI workload capabilities to organizations in desktop and deskside formats, respectively.
- The GB10 version is powered by an NVIDIA GB10 Grace Blackwell Superchip and features 128GB of LPDDR5x unified memory, enabling it to support up to a 200Bn-parameter model with 1 Petaflop (1,000 TFLOPS) of FP4 computing power. Like its deskside counterpart, it also features dual ConnectX-7 SmartNICs and runs the Linux-based NVIDIA DGX OS and NVIDIA’s AI Enterprise software stack.
- The GB300 version is powered by NVIDIA’s GB300 Grace Blackwell Ultra Desktop Superchip with 496GB of LPDDR5X CPU memory and 288GB of HBM3e GPU memory. This supercomputer in a box can support up to ~460Bn-parameter models with 20 Petaflops (20,000 TFLOPS) of FP4 computing power.
For use cases in which a notebook form factor is preferable to a desktop or deskside configuration, Dell also introduced a Dell Pro Max Plus laptop powered by a discrete NPU (Qualcomm’s AI 100 data center chip). This professional enterprise-grade PC can handle models up to 109 billion parameters and delivers roughly 450 total system TOPS, thanks in part to its 32 AI cores and 64 GB of dedicated LPDDR4x memory. Note that this system ditches the GPU in favor of an NPU for better performance per watt and thermal efficiency.
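The quoted model-size ceilings for all three systems line up with a simple back-of-envelope check: at 4-bit (FP4) precision, model weights occupy roughly half a byte per parameter, so the weights alone fit within each device's stated memory with headroom left for activations and context. The sketch below is our own rough arithmetic, assuming weights-only 4-bit quantization and ignoring runtime overhead; it is illustrative, not a Dell-published sizing formula.

```latex
% Rough FP4 (4-bit) weight-footprint check; assumes ~0.5 bytes per parameter,
% weights only, with KV cache and activations consuming the remaining headroom.
\[ M_{\text{weights}} \approx N_{\text{params}} \times 0.5\ \text{bytes} \]
\[ 109\,\text{B} \times 0.5 \approx 55\ \text{GB} \;\le\; 64\ \text{GB (Pro Max Plus dedicated NPU memory)} \]
\[ 200\,\text{B} \times 0.5 \approx 100\ \text{GB} \;\le\; 128\ \text{GB (Pro Max GB10 unified memory)} \]
\[ 460\,\text{B} \times 0.5 \approx 230\ \text{GB} \;\le\; 288\ \text{GB (Pro Max GB300 HBM3e)} \]
```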
Dell’s new Pro AI Studio toolkit easily runs on all three devices and enables data scientists, developers, and AI engineers to work securely on their critical AI projects without having to worry about data leaks, connectivity bottlenecks, or unnecessary data center costs.
Lastly, Dell highlighted some of its other essential technologies, which unfortunately don’t always get as much attention as their more explicitly AI-forward counterparts. For starters, the company is delivering some noteworthy innovations in display technologies, with smart, adaptive displays that don’t just look crisper and richer than their predecessors and improve the user experience, but are also shown to simultaneously reduce eye fatigue and significantly boost productivity. In other words, Dell’s new displays aren’t just prettier and more impressive; they also aim to boost productivity independently of the productivity improvements promised by AI PCs. I was also impressed with Dell’s renewed design attention to peripherals like docking stations (such as its new streamlined WD19S 180W) and hearables. Speaking of hearables, Dell’s new Pro Premium Wireless ANC Headset (WL7024) delivers up to an impressive 80 hours of listening time on a full charge and comes with a 3-year warranty, which helps derisk the investment for IT buyers.
With so much bandwidth dedicated to the company’s ambitious hybrid AI ecosystem strategy, it was refreshing to see the extent to which Dell’s Displays and Peripherals teams are also approaching the PC refresh cycle’s attach opportunities with compelling innovations, design improvements, and device management and support offerings.
What to Watch:
- How will infrastructure competitors start to differentiate their respective AI Factory offerings? It was notable that Hewlett Packard Enterprise announced updates to its own NVIDIA AI Factory offering from Computex during the same week as Dell Tech World.
- Will more AI workloads start to be deployed in non-hyperscaler environments? Cloud companies will increasingly lean on long-term commit engagements to try and allay any cost concerns that enterprises may raise. Sovereign clouds will also become more prevalent strategies to try and win government contracts.
- The continued expansion of AI PC SKUs encompasses Intel, AMD, Qualcomm, and NVIDIA chips. The primary question, given Dell’s market share leadership in the PC segment, is whether Dell will exercise SKU discipline (minimizing its number of AI PC configurations by steering enterprise PC buyers to existing SKUs) or instead give way to a degree of SKU creep driven by IT buyers looking to optimize their new systems for an ever-evolving on-ramp of new AI PC use cases.
- Dell is making more explicit “attach” motions in its overall device-to-cloud hybrid AI strategy (Cloud/On-Prem + Software + services + PCs) and in its device/Workspace ecosystem (PC + Displays + Peripherals).
- Market reactions to Dell’s advanced new Pro Max GB10, Pro Max GB300, and Pro Max Plus AI PCs: Will enterprises, SMEs, and SMBs invest in these systems to accelerate their own AI enablement journeys, or will they instead continue to rely on Cloud-based options? While initial reactions at Dell Tech World were overwhelmingly positive, shipments of those three products will be the first real indicator of their market viability in the commercial segment.
Disclosure: Futurum is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.
Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of Futurum as a whole.
Other insights from Futurum:
Can AMD’s EPYC 4005 Series Disrupt the Entry-Level Enterprise Server Market?
Direct Connect: Signs That Intel Foundry “Gets It”
Why Google’s Chromebook Plus Strategy May Turn Into A Case Study In Quiet Disruption
Author Information
Alex is Vice President & Practice Lead, Channels & Go-to-Market at the Futurum Group. He is responsible for establishing and maintaining the Channels Research program as part of the overall Futurum GTM and Channels Practice. This includes overseeing the channel data rollout in the Futurum Intelligence Platform, primary research activities such as research boards and surveys, delivering thought-leading research reports, and advising clients on their indirect go-to-market strategies. Alex also supports the overall operations of the Futurum Research Business Unit, including P&L segmentation, sales and marketing alignment, and budget planning.
Prior to joining Futurum, Alex was VP of Channels & Enterprise Research at Canalys where he led a multi-million dollar research organization with more than 20 analysts. He played an integral role in helping the Canalys research organization migrate into Omdia after having been acquired in 2023. He is an accomplished research leader, as well as an expert in indirect go-to-market strategies. He has delivered numerous keynotes at partner-facing conferences.
Alex is based in Portland, Oregon, but has lived in numerous places, including California, Canada, Saudi Arabia, Thailand, and the UK. He holds a Bachelor of Commerce with a major in Finance from Dalhousie University in Halifax, Canada.
Research Director Olivier Blanchard covers edge semiconductors and intelligent AI-capable devices for Futurum. In addition to having co-authored several books about digital transformation and AI with Futurum Group CEO Daniel Newman, Blanchard brings considerable experience demystifying new and emerging technologies, advising clients on how best to future-proof their organizations, and helping maximize the positive impacts of technology disruption while mitigating their potentially negative effects. Follow his extended analysis on X and LinkedIn.
Krista covers data security, protection, and management, with a particular focus on how these strategies play out in multi-cloud environments. She brings approximately 15 years of experience providing research and advisory services and creating thought leadership content. Her vantage point spans technology and vendor portfolio developments; customer buying behavior trends; and vendor ecosystems, go-to-market positioning, and business models. Her work has appeared in major publications including eWeek, TechTarget, and The Register.
Prior to joining The Futurum Group, Krista led the data protection practice for Evaluator Group and the data center practice of analyst firm Technology Business Research. She also created articles, product analyses, and blogs on all things storage and data protection and management for analyst firm Storage Switzerland and led market intelligence initiatives for media company TechTarget.