Are NVIDIA’s DGX Mini PCs the Start of Desktop-Class AI Supercomputers?

Analyst(s): Olivier Blanchard
Publication Date: March 27, 2025

NVIDIA’s DGX Spark, powered by the GB10 Grace Blackwell Superchip, is now available for pre-order. The system, originally called Project Digits, delivers up to 1,000 TOPS of AI compute, supports models with up to 200 billion parameters, and is priced at $3,000 with a 4TB SSD. Asus, Dell, and HP have also introduced mini PCs with the GB10 chip, such as the Ascent GX10, Dell Pro Max with GB10, and HP ZGX Nano AI Station.

What is Covered in this Article:

  • NVIDIA launches DGX Spark, its first GB10-based mini PC, priced at $3,000
  • Asus, Dell, and HP announce similar systems with NVIDIA’s GB10 Grace Blackwell chip
  • DGX Spark includes 128GB LPDDR5X memory, up to 4TB SSD, and a proprietary AI stack
  • NVIDIA touts 1,000 TOPS of AI performance and 200 billion parameter model support
  • DGX Base OS (Ubuntu-based) preinstalled on all OEM and NVIDIA systems

The News: NVIDIA has officially introduced the DGX Spark mini PC, powered by the GB10 Grace Blackwell Superchip. Previously codenamed Project Digits, the DGX Spark is being promoted as the smallest AI supercomputer ever made and is now up for pre-order, starting at $3,000 for the 4TB SSD version. At the heart of the system, the GB10 combines high-performance ARM CPU cores with Blackwell GPU architecture and a dedicated AI accelerator.

Major OEMs like Asus, Dell, and HP have also revealed their own compact systems built on the same GB10 platform. Among the first to appear is Asus’ Ascent GX10, priced competitively against the DGX Spark, with 1TB of storage. Pricing and launch details for Dell and HP units are still under wraps.

DGX Spark is outfitted with four USB4 ports, 10 Gbit/s Ethernet, Wi-Fi 7, HDMI 2.1, and draws up to 170W of power. These devices are built to act as standalone AI desktops replicating many datacenter-level features in a far smaller form factor. Alongside Spark, NVIDIA also introduced the DGX Station, a high-end desktop using the GB300 Grace Blackwell Ultra Superchip, which will be available later this year via partners including Asus, Dell, HP, Lambda, and Supermicro.

Analyst Take: With the rollout of DGX Spark and GB10-based mini PCs from major OEMs, NVIDIA is clearly aiming to shift AI development workflows toward more accessible desktop environments. These systems, powered by the GB10 Superchip, are intended to support tasks like model fine-tuning, on-device inferencing, and rapid prototyping – thanks to a unified memory design and robust AI compute throughput. The newly revealed DGX Station further extends NVIDIA’s AI platform to cater to heavier, more complex workloads.

The GB10 Superchip Brings Compute Density to the Desktop

The GB10 packs ten Cortex-X925 and ten Cortex-A725 ARM cores alongside a Blackwell GPU built on fifth-generation Tensor Cores with FP4 support. The chip delivers up to 1,000 trillion operations per second (1,000 TOPS) of AI compute – enough to handle models with up to 200 billion parameters. It comes with 128GB of unified LPDDR5X memory delivering 273GB/s of bandwidth over a 256-bit interface – comparable to Apple’s M4 Pro in memory bandwidth, but with twice the memory capacity. NVLink-C2C enables high-speed communication between the CPU and GPU, offering up to 5x the bandwidth of PCIe Gen5.
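To put those figures in perspective, here is a rough, back-of-the-envelope estimate (our illustration, not an NVIDIA figure) of why a 200-billion-parameter model fits: at FP4 precision each weight occupies half a byte, so the weights alone take roughly 100GB of the 128GB unified memory pool, leaving headroom for activations and the KV cache.

```python
# Rough estimate: do the FP4 weights of a 200B-parameter model fit in 128 GB of unified memory?
# Illustrative only; real footprints also include activations, KV cache, and runtime overhead.
params = 200e9           # 200 billion parameters
bytes_per_param = 0.5    # FP4 = 4 bits per weight
unified_memory_gb = 128  # DGX Spark's LPDDR5X pool

weights_gb = params * bytes_per_param / 1e9
print(f"FP4 weights: ~{weights_gb:.0f} GB of {unified_memory_gb} GB unified memory")
# -> FP4 weights: ~100 GB of 128 GB unified memory
```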

OEM Ecosystem Expands Access to GB10 AI Capability

Asus, Dell, and HP have each unveiled their own GB10-powered mini PCs – the Ascent GX10, Dell Pro Max with GB10, and the ZGX Nano AI Station. Asus’ version is the most affordable so far, coming in at €2,760 with a 1TB SSD. While these systems differ in design and configuration, their core specs remain consistent. Lenovo is also expected to enter the scene soon with a GB10-based device. NVIDIA and Asus are also bundling in the ConnectX-7 network chip for enhanced multi-device connectivity. These options give buyers more form and pricing flexibility while tapping into NVIDIA’s AI computing core.

Preconfigured AI Stack Supports Immediate Deployment

Each of these machines ships with NVIDIA’s DGX Base OS, built on Ubuntu and preinstalled with essential drivers and the NVIDIA AI software suite. This allows developers to get started right away with inferencing and fine-tuning models. According to NVIDIA, workloads can easily shift to DGX Cloud or similar platforms with minimal code adjustments, ensuring smooth portability and workflow continuity.
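As an illustration of what that out-of-the-box workflow might look like, the sketch below runs a language model locally with the open-source Hugging Face Transformers library. This is a generic example under assumptions: the model ID is a hypothetical placeholder, and NVIDIA’s preinstalled stack provides its own containers and tooling rather than this exact script.

```python
# Generic local-inference sketch using Hugging Face Transformers on a CUDA GPU.
# Assumptions: PyTorch with CUDA support is available on the system, and the
# model ID below is a hypothetical placeholder, not part of NVIDIA's stack.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "example-org/example-7b-instruct"  # placeholder model for illustration

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 4-bit quantized config could stretch to larger models
    device_map="auto",          # let Accelerate place layers on the GPU
)

prompt = "Explain why unified CPU-GPU memory helps local LLM inference."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because nothing in the script is tied to the desktop itself, the same code would run on a cloud GPU instance – the kind of desktop-to-DGX Cloud portability NVIDIA is promising.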

What to Watch:

  • All systems run on the Linux-based DGX Base OS; Windows on ARM support is uncertain as the Qualcomm-Microsoft exclusivity agreement nears expiry.
  • DGX Spark’s specifications suggest a shift toward compact AI workstations, but broader adoption may depend on software compatibility and workload integration.
  • Dell plans to launch its GB10-based mini PCs in early summer, while Asus and HP have not confirmed shipping timelines.
  • Lenovo is expected to release a GB10 mini PC, but no product design or specs have been revealed so far.

See the complete press release on the DGX Spark and Grace Blackwell personal AI systems on the NVIDIA website.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

NVIDIA Q4 FY 2025: AI Momentum Strengthens Despite Margin Pressures

Unpacking the Benefits of AI PCs – Six Five On The Road

Dell’s Innovative Approach with AI PCs – Six Five On The Road

Author Information

Olivier Blanchard

Research Director Olivier Blanchard covers edge semiconductors and intelligent AI-capable devices for Futurum. In addition to having co-authored several books about digital transformation and AI with Futurum Group CEO Daniel Newman, Blanchard brings considerable experience demystifying new and emerging technologies, advising clients on how best to future-proof their organizations, and helping maximize the positive impacts of technology disruption while mitigating its potentially negative effects. Follow his extended analysis on X and LinkedIn.
