AWS, Microsoft, and Google Cloud: Tying Up LLMs

The News: On September 25, Amazon announced what it is calling a strategic collaboration with foundation model player Anthropic. Anthropic will use Amazon Web Services (AWS) as its primary cloud provider, will use AWS Trainium and Inferentia chips to build, train, and deploy future foundation models, and will provide Amazon Bedrock customers with access to future generations of its foundation models. Amazon will invest up to $4 billion in Anthropic, take a minority ownership position in the company, and gain access to Anthropic models to incorporate into Amazon projects. Read the full details of the Amazon and Anthropic announcement on the Anthropic website.

Amazon’s $4 billion investment in Anthropic follows Microsoft’s $10 billion investment in OpenAI announced in January 2023.

Analyst Take: It would be logical to think there is some strategic positioning going on among hyperscalers, particularly the three big cloud providers – AWS, Microsoft, and Google Cloud – when it comes to partnering with emerging large language model (LLM) providers, but it might not be for the reasons you think.

Are the Cloud Providers Trying To Pick the LLM Winner? No

Since OpenAI launched ChatGPT in November 2022, dozens of LLMs and other foundation model players have emerged. Most of these models are rapid derivatives of the biggest LLMs, each offering a different value proposition: some are narrower, trained on smaller datasets; some are built for use with private domains; some are proprietary; and some are open source.

The short list of LLM/foundation model players not called OpenAI or Anthropic includes, but is not limited to: Cohere; Inflection; Llama (Meta); PaLM and BERT (Google); Aleph Alpha; Mistral AI; BLOOM (BigScience); NVIDIA; Stability AI; Falcon (Technology Innovation Institute); AI21 Labs; Dolly and MPT (Databricks/MosaicML); Granite (IBM); and Midjourney. The field spans both proprietary and open source models.

AI models are proliferating, and for now, enterprises prefer to have a range of choices, including open source. With that in mind, and given that no one LLM fits all uses, it is unlikely we will see a “winner” in the LLM space for a couple of years, or necessarily a consolidation of players offering AI foundation models. So, although two of the big three cloud providers are starting to pick LLM partners to feed their own IP needs (Microsoft with OpenAI for Copilot, Amazon with Anthropic), it is likely they will partner with more LLM players. More on that in the next section.

Are the Cloud Providers Looking for the Biggest AI Compute Customers? Yes

AI workloads based on foundation models are among the most complex and expensive computational workloads. Consider that Netflix is widely considered one of the top customers of cloud computing: business intelligence firm Intricately estimates Netflix spends about $9.6 million per month on cloud compute with AWS. According to a report by Analytics India Magazine discussed in Technext, OpenAI spends about $700,000 per day to operate ChatGPT – about $21 million each month in compute costs.
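As a quick sanity check on those figures, the daily-to-monthly conversion and the comparison to Netflix work out as follows (a rough sketch assuming a 30-day month; the dollar amounts are the third-party estimates cited above, not audited numbers):

```python
# Rough monthly cloud compute cost comparison, assuming a 30-day month.
# Inputs are the estimates cited above (Intricately; Analytics India Magazine).

OPENAI_DAILY_COST = 700_000        # estimated ChatGPT operating cost per day (USD)
NETFLIX_MONTHLY_COST = 9_600_000   # estimated Netflix AWS spend per month (USD)

# Scale the daily estimate to a month.
openai_monthly_cost = OPENAI_DAILY_COST * 30

print(f"OpenAI monthly compute: ${openai_monthly_cost:,}")
print(f"Multiple of Netflix's monthly spend: "
      f"{openai_monthly_cost / NETFLIX_MONTHLY_COST:.1f}x")
```

On these estimates, a single AI model player's monthly compute bill comes to roughly twice that of one of the cloud's marquee streaming customers.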

So, it would seem AI model players are potentially the biggest cloud compute customers, and it makes sense that cloud providers are interested in handling their cloud compute needs. In practice, it does not matter whether the AI model player pays directly for the compute: the cloud providers either hold stakes in these companies and share in their profits, or gain the opportunity to handle the cloud compute needs of the AI model players’ customers.

Regardless, these AI model players are buying cloud compute in some form, and they will be chased by cloud players that are looking to be exclusive cloud compute providers.

Conclusions

Look for more partnerships and investments by cloud providers in AI model players. Next is likely Cohere, which currently has an exclusive cloud compute deal with Google Cloud. Google could increase its investment and exclusive tie to Cohere. Other big tie-ups could come for Inflection AI and AI21 Labs. It would seem to make sense that most AI model players will gain some leverage with cloud providers because of their potential as customers.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually, based on data and other information that might have been provided for validation, and are not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Cohere Launches Coral, a New AI-Powered Knowledge Assistant

Generative AI Investment Accelerating: $1.3 Billion for LLM Inflection

Generative AI War? ChatGPT Rival Anthropic Gains Allies, Investors

Author Information

Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting technology business and holds a Bachelor of Science from the University of Florida.
