AI Field Day: Solidigm Storage Options for AI Training

At AI Field Day in Santa Clara, Solidigm explained the value of NVMe SSD storage for AI workloads, with multiple options particularly suited to training. Supermicro took the stage to show Solidigm storage options for AI spanning a range of use cases, from rack-scale data center deployments to IoT devices that might be mounted on a power pole.

Solidigm (the SSD business unit that Intel sold to SK Hynix at the end of 2021) continues to make the case that its NVMe SSDs are the best option for storing data you expect to access with any urgency. Data for AI training and inference gains the same benefits as many other data types, as we saw in the Solidigm storage options for AI presented at AI Field Day in Santa Clara. There was nothing particularly new in the Solidigm presentation: NVMe SSDs are still extremely fast, and Solidigm offers them in a range of performance, capacity, and physical form factor options. We revisited the TCO comparison of physically smaller, more power-efficient all-flash storage against the higher whole-of-life cost of getting the same transactional performance from hard drives. The conclusions were familiar: all-flash storage lowers storage latency, improving application performance without excessive cost.
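To illustrate the shape of that TCO argument, here is a minimal sketch in Python. All of the per-device figures (IOPS, watts, capacity) are my own illustrative assumptions, not numbers from Solidigm's presentation; the point is simply that matching a transactional performance target with hard drives takes orders of magnitude more devices, power, and rack space than with NVMe SSDs.

```python
# Illustrative TCO-style comparison (all figures are assumptions, not
# Solidigm's published numbers): how many drives are needed to hit a
# random-read IOPS target, and roughly what that costs in power.

IOPS_TARGET = 1_000_000  # workload requirement (assumed)

# Assumed per-device characteristics, for illustration only.
HDD = {"iops": 200, "watts": 8, "capacity_tb": 20}
NVME = {"iops": 800_000, "watts": 20, "capacity_tb": 15.36}

def drives_needed(dev, iops_target):
    """Drives required to satisfy the IOPS target (capacity ignored)."""
    return -(-iops_target // dev["iops"])  # ceiling division

for name, dev in (("HDD", HDD), ("NVMe SSD", NVME)):
    n = drives_needed(dev, IOPS_TARGET)
    print(f"{name}: {n} drives, ~{n * dev['watts']} W, "
          f"{n * dev['capacity_tb']:.0f} TB raw capacity")
```

With these assumed numbers, the HDD build needs thousands of spindles (and tens of kilowatts) to reach the same IOPS that a couple of NVMe SSDs deliver, which is the essence of the whole-of-life cost comparison.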

Supermicro’s Portfolio for Solidigm

Supermicro joined Solidigm’s presentation to showcase their joint products, ranging from rugged industrial PCs to rack-scale deployments of multi-petabyte storage. One interesting piece of news was Supermicro’s ability to package a server with a Solidigm NVMe drive and a GPU into a sealed unit mounted on a utility pole. This rugged hardware delivers Solidigm storage options and AI at the far edge. There were more Solidigm storage options for AI in Supermicro storage server chassis, including dense, high-performance configurations using NGSFF (ruler) format drives. The NGSFF chassis is likely to support CXL, the next generation of storage virtualization, when it is fully standardized. One of the surprises from my previous work with Supermicro was the extensive networking hardware lineup. There was even Supermicro 400Gb InfiniBand switching in one of the rack-scale architectures, which are delivered already built, integrated, and tested.
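As a rough feel for why ruler-format drives enable multi-petabyte chassis, here is a back-of-the-envelope sketch. The drives-per-rack-unit count, drive capacity, and usable rack units are assumptions I have chosen for illustration, not figures quoted in the presentation.

```python
# Rough rack-density sketch (assumed figures, for illustration only):
# petabytes per rack when a 1U chassis holds 32 "ruler" format drives.

DRIVES_PER_1U = 32   # typical ruler-drive chassis density (assumed)
DRIVE_TB = 61.44     # high-capacity QLC ruler drive (assumed)
USABLE_RU = 40       # rack units left for storage after networking etc.

pb_per_ru = DRIVES_PER_1U * DRIVE_TB / 1000
print(f"~{pb_per_ru:.1f} PB per rack unit, "
      f"~{pb_per_ru * USABLE_RU:.0f} PB per rack")
```

Under these assumptions, a single rack unit of ruler drives approaches two petabytes of raw capacity, which is how rack-scale deployments reach multi-petabyte scale without resorting to hard drives.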

Solidigm Storage Options for AI

While the message from Solidigm and Supermicro is very much the same as it always has been, that does not detract from its value. Like every other high-performance workload, AI benefits from well-designed storage that keeps the processing fed with data. As with any other infrastructure, no single design decision suits every situation, so having cost, performance, capacity, and form factor options is vital. There are plenty of Solidigm storage options for AI.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

Dell Enhances PowerScale NAS to Aid Generative AI

HPE Enhances GreenLake for File Storage for AI Workloads

VAST Data Unveils New Data Center Architecture to Accelerate AI

Author Information

Alastair has made a twenty-year career out of helping people understand complex IT infrastructure and how to build solutions that fulfil business needs. Much of his career has included teaching official training courses for vendors, including HPE, VMware, and AWS. Alastair has written hundreds of analyst articles and papers exploring products and topics around on-premises infrastructure and virtualization and getting the most out of public cloud and hybrid infrastructure. Alastair has also been involved in community-driven, practitioner-led education through the vBrownBag podcast and the vBrownBag TechTalks.
