Weka Announces WEKApod Solution for Streamlined AI Deployments

The News: Weka has announced WEKApod, a new integrated appliance offering for its Weka Data Platform. The WEKApod solution is certified for NVIDIA DGX SuperPOD systems and is designed to streamline the deployment of storage for AI. More about the release of WEKApod can be found in Weka’s blog post.

Analyst Take: Weka has introduced WEKApod, a new integrated appliance for deploying its Weka Data Platform. The turnkey offering sits alongside Weka’s existing software-only option, giving customers greater flexibility and ultimately streamlining the deployment of data storage for AI.

WEKApod is designed specifically to support NVIDIA-based AI deployments, and the solution is certified for NVIDIA DGX SuperPOD. Notably, WEKApod also integrates with NVIDIA Base Command Manager for management and observability.

Configurations of WEKApod begin at 8 nodes and can expand in 4-node increments up to hundreds of nodes. Weka states that an 8-node WEKApod configuration can provide the performance required to support 128 NVIDIA DGX H100 systems. Weka has stated the following specifications for an initial 8-node deployment (a rough scaling sketch follows the list):

  • 1 PB storage capacity
  • 18.3 million IOPS
  • 720 GB/s read bandwidth
  • 186 GB/s write bandwidth
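
For a rough sense of scale, the sketch below extrapolates these published 8-node figures to larger WEKApod clusters under an assumed linear-scaling model; the node counts, scaling assumption, and function names are illustrative rather than Weka-provided specifications.

```python
# Back-of-the-envelope sketch (illustrative, not from Weka's documentation):
# extrapolates the published 8-node WEKApod figures to larger clusters,
# assuming performance and capacity scale linearly with node count.

BASE_NODES = 8
BASE_SPECS = {
    "capacity_pb": 1.0,      # 1 PB storage capacity
    "iops_millions": 18.3,   # 18.3 million IOPS
    "read_gb_s": 720,        # 720 GB/s read bandwidth
    "write_gb_s": 186,       # 186 GB/s write bandwidth
}

def estimate_cluster(nodes: int) -> dict:
    """Scale the published 8-node figures linearly to a larger node count.

    Linear scaling is a simplifying assumption for illustration only,
    not a vendor guarantee of real-world behavior.
    """
    if nodes < BASE_NODES or (nodes - BASE_NODES) % 4 != 0:
        raise ValueError("WEKApod starts at 8 nodes and grows in 4-node increments")
    factor = nodes / BASE_NODES
    return {metric: round(value * factor, 1) for metric, value in BASE_SPECS.items()}

if __name__ == "__main__":
    for n in (8, 12, 24):
        print(f"{n} nodes: {estimate_cluster(n)}")
```

Under this linear assumption, for example, a 12-node cluster would land around 1.5 PB of capacity and roughly 1,080 GB/s of read bandwidth, though actual scaling behavior will depend on the deployment.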

WEKApod offers the same software and features as the standard Weka Data Platform software, but packaged as an easily deployable hardware appliance. This provides greater flexibility for Weka customers and a streamlined approach for organizations to deploy infrastructure for AI workloads.

Traditionally, the HPC market has gravitated toward a do-it-yourself approach to parallel file systems, opting for software-only offerings. This trend stems in part from a desire to deploy these solutions flexibly, as well as from having the knowledge base and dedicated staff required to do so. In the race to deploy AI applications, however, organizations appear to be taking a different approach, prioritizing more streamlined methods and opting for integrated appliances that require less configuration and deploy more quickly. With the announcement of WEKApod, Weka can provide this streamlined approach, simplifying the deployment of AI data infrastructure for organizations running AI workloads on NVIDIA systems.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

Pure Storage and NVIDIA Announce New Reference Architectures for AI

MinIO Announces Enterprise Object Store

VAST Data Announces New Data Center Architecture to Accelerate AI

Image Credit: Weka

Author Information

Mitch comes to The Futurum Group through the acquisition of the Evaluator Group and is focused on the fast-paced and rapidly evolving areas of cloud computing and data storage. Mitch joined Evaluator Group in 2019 as a Research Associate covering numerous storage technologies and emerging IT trends.

With a passion for all things tech, Mitch brings deep technical knowledge and insight to The Futurum Group’s research by highlighting the latest in data center and information management solutions. Mitch’s coverage has spanned topics including primary and secondary storage, private and public clouds, networking fabrics, and more. With ever-changing data technologies and rapidly emerging trends in today’s digital world, Mitch provides valuable insights into the IT landscape for enterprises, IT professionals, and technology enthusiasts alike.
