
AWS, Microsoft, and Google Cloud: Tying Up LLMs


The News: On September 25, Amazon announced what it is calling a strategic collaboration with foundation model player Anthropic. Under the agreement, Anthropic will use Amazon Web Services (AWS) as its primary cloud provider, will use AWS Trainium and Inferentia chips to build, train, and deploy future foundation models, and will provide Amazon Bedrock customers with access to future generations of its foundation models. Amazon will invest up to $4 billion in Anthropic, take a minority ownership position in the company, and gain access to Anthropic models to incorporate into Amazon projects. Read the full details of the Amazon and Anthropic announcement on the Anthropic website.

Amazon’s $4 billion investment in Anthropic follows Microsoft’s $10 billion investment in OpenAI announced in January 2023.


Analyst Take: It would be logical to think there is some strategic positioning going on among hyperscalers, particularly the three big cloud providers – AWS, Microsoft, and Google Cloud – when it comes to partnering with emerging large language models (LLMs), but it might not be for the reasons you think.

Are the Cloud Providers Trying To Pick the LLM Winner? No

Since OpenAI launched ChatGPT in November 2022, dozens of LLMs and other foundation model players have emerged. Most of these LLMs are rapid derivatives of the biggest LLMs, offering a variety of value propositions. Some are narrower, trained on smaller datasets; some are built to be used with private domains; some are proprietary; and some are open source.

The short list of LLM/foundation model players not called OpenAI or Anthropic includes, but is not limited to: Cohere; Inflection; Llama (Meta); PaLM, BERT, and others (Google); Aleph Alpha; Mistral AI; BLOOM (BigScience); NVIDIA; Stability AI; Falcon (Technology Innovation Institute); AI21 Labs; Dolly and MPT (Databricks/MosaicML); Granite (IBM); and Midjourney. The list spans both proprietary and open source models.

AI models are proliferating, and for now, enterprises prefer to have a range of choices, including open source. With that in mind, and given that no one LLM fits all uses, it is unlikely we will see a “winner” in the LLM space for a couple of years, or necessarily even a consolidation of players offering AI foundation models. So, although two of the big three cloud providers are starting to pick LLM partners to feed their own IP needs (Microsoft with OpenAI for Copilot, Amazon with Anthropic), it is likely they will partner with more LLMs. More on that in the next section.

Are the Cloud Providers Looking for the Biggest AI Compute Customers? Yes

AI workloads based on foundation models are among the most complex and expensive computational workloads. Consider that Netflix is widely considered one of the top customers of cloud computing. Business intelligence firm Intricately estimates Netflix spends about $9.6 million per month on cloud compute with AWS. According to a report by Analytics India Magazine discussed in Technext, OpenAI spends about $700,000 per day to operate ChatGPT, which works out to roughly $21 million per month in compute costs.
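The scale of that gap is easy to sanity-check with simple arithmetic. The sketch below uses the figures cited above (Intricately's Netflix estimate and the Analytics India Magazine ChatGPT estimate); the 30-day month used for the daily-to-monthly conversion is an assumption.

```python
# Comparing estimated monthly cloud compute spend, using the figures cited above.
netflix_monthly = 9_600_000          # Intricately estimate: ~$9.6M/month with AWS
openai_daily = 700_000               # Analytics India Magazine estimate: ~$700K/day

# Assumption: a 30-day month for the daily-to-monthly conversion.
openai_monthly = openai_daily * 30   # ~$21M/month

ratio = openai_monthly / netflix_monthly
print(f"OpenAI monthly: ${openai_monthly:,} (~{ratio:.1f}x Netflix's ${netflix_monthly:,})")
```

On these estimates, a single AI model player's compute bill is more than double that of one of the cloud's best-known flagship customers.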

So, it would seem AI model players are potentially the biggest cloud compute customers, and it makes sense that cloud providers are interested in handling their cloud compute needs. It does not matter whether, in practice, the AI model player pays directly for the compute: the cloud providers either hold stakes in these companies and share in their profits, or gain the opportunity to handle the cloud compute needs of the AI model players’ customers.

Regardless, these AI model players are buying cloud compute in some form, and they will be chased by cloud players that are looking to be exclusive cloud compute providers.

Conclusions

Look for more partnerships and investments by cloud providers in AI model players. Next is likely Cohere, which currently has an exclusive cloud compute deal with Google Cloud. Google could increase its investment and exclusive tie to Cohere. Other big tie-ups could come for Inflection AI and AI21 Labs. It would seem to make sense that most AI model players will gain some leverage with cloud providers because of their potential as customers.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Cohere Launches Coral, a New AI-Powered Knowledge Assistant

Generative AI Investment Accelerating: $1.3 Billion for LLM Inflection

Generative AI War? ChatGPT Rival Anthropic Gains Allies, Investors

Author Information

Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting technology business and holds a Bachelor of Science from the University of Florida.

