Hammerspace Adds AWS SVP and LLM Training Architecture

The News: Global file system vendor Hammerspace named former Amazon Web Services (AWS) general manager Marc Cree as its senior vice president of strategic partnerships. In this role, he will build out Hammerspace’s strategic partner ecosystem through alliances, integration collaboration, and OEM relationships. You can see the announcement on the Hammerspace website.

Analyst Take: Cree brings decades of storage, networking, and cloud experience to Hammerspace, whose business intersects all three areas. The Hammerspace distributed file system spans data and applications across data centers and public cloud services such as AWS, Google Cloud, Microsoft Azure, and Seagate Lyve Cloud.

At AWS, Cree was the GM for AWS Storage Gateway, responsible for product and business strategy for the service that allows on-premises workloads to use the AWS cloud. He has also been the CEO and founder of InfiniteIO, StorSpeed, and NuSpeed, an iSCSI storage company that Cisco acquired in 2000. Following that acquisition, Cree joined Cisco as president and GM of its storage router business unit.

The Futurum Group classifies Hammerspace as a global file system, putting it in the same category as CTERA, Nasuni, and Panzura. These vendors use the cloud to provide access to data anywhere. Hammerspace also works with and competes with traditional NAS products, which makes its partner ecosystem crucial.

Future Steps

Hammerspace describes itself as “orchestrating the Next Data Cycle,” and you cannot do that without a strategy for AI and large language models (LLMs). Hammerspace last week released a data architecture for LLM training and inference within hyperscale environments. The goal of the reference architecture is to let organizations building AI infrastructure design a unified data architecture that delivers a supercomputing-class parallel file system while remaining as easy to access as standard NFS.

The reference architecture includes client servers, graphics processing units (GPUs), data storage nodes, and networking. Hammerspace software decouples the file system layer from the storage layer, enabling independent scaling of throughput and IOPS at the data layer. Integrated machine learning (ML) capabilities within the Hammerspace architecture place a related data set in high-performance, local NVMe storage when the first file from that data set is accessed.
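The access-triggered placement behavior described above can be sketched as follows. This is a minimal illustration of the general technique (promoting an entire related data set to a fast local tier on first access to any of its members), not Hammerspace's actual API; the class and method names are hypothetical.

```python
class TieredOrchestrator:
    """Illustrative sketch: promote a whole data set to a local NVMe
    tier when the first file from that data set is read."""

    def __init__(self, datasets):
        # datasets: mapping of data set name -> list of file paths
        self.datasets = datasets
        self.nvme_tier = set()  # files currently placed on local NVMe

    def read(self, path):
        # On first access to any member file, place all related files
        # from the same data set onto the NVMe tier together.
        if path not in self.nvme_tier:
            for files in self.datasets.values():
                if path in files:
                    self.nvme_tier.update(files)
                    break
        tier = "nvme" if path in self.nvme_tier else "capacity"
        return f"read {path} from {tier}"


# Reading one shard promotes its siblings as well.
orch = TieredOrchestrator({"train-set-0": ["a.bin", "b.bin", "c.bin"]})
orch.read("a.bin")
print(sorted(orch.nvme_tier))  # all three files now on the fast tier
```

The point of the pattern is that subsequent reads of "b.bin" and "c.bin" hit local NVMe without waiting for a cold fetch, which matters when GPUs would otherwise stall on storage I/O during training.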

By bolstering its strategic alliances and its AI technology, Hammerspace is checking two of the boxes it needs to compete with a plethora of unstructured data vendors, including some of the largest in the industry.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Hammerspace Raises $56.7M in Funding to Unlock Business Opportunities

Hammerspace Introduces Data Orchestration Solution at NAB 2023

Key Trends in Generative AI – The AI Moment, Episode 1

Author Information

Dave focuses on the rapidly evolving integrated infrastructure and cloud storage markets.
