AWS, Microsoft, and Google Cloud: Tying Up LLMs

The News: On September 25, Amazon announced what it is calling a strategic collaboration with foundation model player Anthropic. Under the agreement, Anthropic will use Amazon Web Services (AWS) as its primary cloud provider, will use AWS Trainium and Inferentia chips to build, train, and deploy future foundation models, and will provide Amazon Bedrock customers with access to future generations of its foundation models. Amazon will invest up to $4 billion in Anthropic, take a minority ownership position in the company, and gain access to Anthropic models to incorporate into Amazon projects. Read the full details of the Amazon and Anthropic announcement on the Anthropic website.

Amazon’s $4 billion investment in Anthropic follows Microsoft’s $10 billion investment in OpenAI announced in January 2023.

Analyst Take: It would be logical to think there is some strategic positioning going on among hyperscalers, particularly the three big cloud providers – AWS, Microsoft, and Google Cloud – when it comes to partnering with emerging large language models (LLMs), but it might not be for the reasons you think.

Are the Cloud Providers Trying To Pick the LLM Winner? No

Since OpenAI launched ChatGPT in November 2022, dozens of LLMs and other foundation model players have emerged. Most of these LLMs are rapid derivatives of the biggest models, offering a variety of value propositions: some are narrower, trained on smaller datasets; some are built to be used with private domains; some are proprietary; and some are open source.

The short list of LLM/foundation model players not called OpenAI or Anthropic includes, but is not limited to, Cohere; Inflection; Llama (Meta); PaLM, BERT, etc. (Google); Aleph Alpha; Mistral AI; BLOOM (BigScience); NVIDIA; Stability AI; Falcon (Technology Innovation Institute); AI21 Labs; Dolly and MPT (Databricks, MosaicML); Granite (IBM); and Midjourney. The field spans both proprietary and open source models.

AI models are proliferating, and for now, enterprises prefer to have a range of choices, including open source. With that in mind, and given that no one LLM fits all uses, it is unlikely we will see a “winner” in the LLM space for a couple of years, or even necessarily a consolidation of players offering AI foundation models. So, although two of the big three cloud providers are starting to pick LLM partners to feed their own IP needs (Microsoft with Copilot, Amazon with Anthropic), it is likely they will partner with more LLMs. More on that in the next section.

Are the Cloud Providers Looking for the Biggest AI Compute Customers? Yes

AI workloads based on foundation models are among the most complex and expensive computational workloads. Consider that Netflix is widely considered one of the top customers of cloud computing. Business intelligence firm Intricately estimates Netflix spends about $9.6 million per month on cloud compute with AWS. According to a report by Analytics India Magazine discussed in Technext, OpenAI spends about $700,000 per day to operate ChatGPT, or roughly $21 million per month in compute costs.
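
The scale of that gap can be checked with simple arithmetic, using the third-party estimates cited above (Intricately for Netflix, Analytics India Magazine for OpenAI). A minimal sketch, treating a month as 30 days; these are rough estimates, not audited figures:

```python
# Third-party estimates cited in the article (not audited figures).
OPENAI_DAILY_USD = 700_000        # estimated daily cost to operate ChatGPT
NETFLIX_MONTHLY_USD = 9_600_000   # estimated monthly AWS spend by Netflix

# Annualize OpenAI's daily figure to a 30-day month for comparison.
openai_monthly = OPENAI_DAILY_USD * 30

print(f"OpenAI  ~ ${openai_monthly / 1e6:.1f}M per month")
print(f"Netflix ~ ${NETFLIX_MONTHLY_USD / 1e6:.1f}M per month")
print(f"OpenAI spends ~{openai_monthly / NETFLIX_MONTHLY_USD:.1f}x Netflix")
```

By these estimates, a single foundation model operator spends more than twice what one of the cloud's flagship streaming customers does each month, which is the economic point of the section.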

So, it would seem AI model players are potentially the biggest cloud compute customers, and it makes sense that cloud providers are interested in handling their cloud compute needs. It does not matter if, in practice, the AI model player is not paying directly for the compute: the cloud providers still benefit if they hold stakes in those companies and share in the profits, or if they get the opportunity to handle the cloud compute needs of the AI model players’ customers.

Regardless, these AI model players are buying cloud compute in some form, and they will be chased by cloud players looking to become their exclusive cloud compute providers.

Conclusions

Look for more partnerships and investments by cloud providers in AI model players. Next is likely Cohere, which currently has an exclusive cloud compute deal with Google Cloud. Google could increase its investment and exclusive tie to Cohere. Other big tie-ups could come for Inflection AI and AI21 Labs. It would seem to make sense that most AI model players will gain some leverage with cloud providers because of their potential as customers.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Cohere Launches Coral, a New AI-Powered Knowledge Assistant

Generative AI Investment Accelerating: $1.3 Billion for LLM Inflection

Generative AI War? ChatGPT Rival Anthropic Gains Allies, Investors

Author Information

Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting the technology business. He holds a Bachelor of Science from the University of Florida.

