HPE Invests in LLM Aleph Alpha to Fuel On-Premises AI Strategy

The News: In November, Hewlett Packard Enterprise (HPE) announced it joined a consortium of investors in large language model (LLM) player Aleph Alpha’s Series B funding round of more than $500 million. The investment follows HPE’s selection of Aleph Alpha’s LLM Luminous as the foundation for the company’s first AI private cloud service, HPE GreenLake for Large Language Models. Here are the key details of the growing relationship:

  • HPE sees the relationship as a strategic partnership that allows both companies to “further integrate our complementary technologies” – HPE’s leadership in supercomputer technology and Aleph Alpha’s focus on the development of generative AI for data sensitive industries such as healthcare, finance, law, government, and security.
  • Luminous is trained on an HPE supercomputer and leverages the HPE Machine Learning Development Environment, which is designed to efficiently scale AI model training.
  • The partners have several joint generative AI projects in process, including a project for a US federal government agency “which includes the analysis, summary and generation of documents critical for national security.”
  • The federal agency project is an example of a “private on-premises LLM environment based on HPE supercomputing and AI technology, which the agency uses for training, tuning and inferencing based on its own documents and databases.”

Read the press release from HPE on the Aleph Alpha partnership here.

Read the Aleph Alpha blog post on the Series B funding here.

Analyst Take: For many enterprises, the risk of running operations in the public cloud is too high because of the sensitive nature of their data or of their work overall. The foundation models that fuel generative AI, particularly LLMs, typically train on massive amounts of publicly available data, so the use of most LLMs can introduce a level of risk these enterprises are not willing to take. HPE is thinking about how to solve that dilemma with a vision for an on-premises AI stack, and the company’s partnership with Aleph Alpha is part of that equation. Here is why the partnership is important to HPE’s on-premises AI approach.

Partners Are Philosophically Aligned

HPE has long been a vendor to enterprises that deploy private IT stacks, so it understands well the drivers and requirements those enterprises have. Aleph Alpha was founded on a similar premise. From the company’s landing page: “A new generation of AI is re-shaping knowledge work. In the most complex and critical environments there are no simple answers. Taking responsibility requires a human-machine paradigm beyond chatbots designed around data security, technology transparency and result explainability.” Note that HPE, in its announcement of the Aleph Alpha investment, chose to specifically detail a joint project for a US federal agency working on national security.

Synergy to Drive Efficiencies

Compute costs are a concern for all enterprises seeking to leverage generative AI. In Aleph Alpha, HPE chose an LLM that is by design more efficient than many other available models. Luminous, the model HPE is deploying for its customers, has 70 billion parameters, less than half the size of OpenAI’s GPT-3. Aleph Alpha claims Luminous is “twice as efficient which translates to a better scaling and lower resource consumption when in use.” You can read more about the Luminous LLM in Aleph Alpha’s blog post on the Aleph Alpha website.

Conclusion

It will be interesting to see how the HPE-Aleph Alpha relationship develops. Other players vying for on-prem AI business have taken a broader approach to partnerships, focusing on open-source options. However, it has been argued that open-source software in general is not secure enough for many enterprise customers and applications, which would bode well for HPE’s current approach with Aleph Alpha. A focused partnership could lead to the further refinement and innovation required to make an on-prem AI stack a reality for security-minded enterprises.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Supercomputing 2023

Empowering AI Innovation with HPE’s Advanced Supercomputing Solution

Powering Your Future Business with AI Inference – Futurum Tech Webcast

Author Information

Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting the technology business. He holds a Bachelor of Science from the University of Florida.
