
AWS, Microsoft, and Google Cloud: Tying Up LLMs

The News: On September 25, Amazon announced what it is calling a strategic collaboration with foundation model player Anthropic. Under the deal, Anthropic will use Amazon Web Services (AWS) as its primary cloud provider, use AWS Trainium and Inferentia chips to build, train, and deploy future foundation models, and provide Amazon Bedrock customers with access to future generations of its foundation models. Amazon will invest up to $4 billion in Anthropic, take a minority ownership position in the company, and gain access to Anthropic models to incorporate into Amazon projects. Read the full details of the Amazon and Anthropic announcement on the Anthropic website.

Amazon’s $4 billion investment in Anthropic follows Microsoft’s $10 billion investment in OpenAI announced in January 2023.

Analyst Take: It would be logical to think there is some strategic positioning going on among hyperscalers, particularly the three big cloud providers – AWS, Microsoft, and Google Cloud – when it comes to partnering with emerging large language models (LLMs), but it might not be for the reasons you think.

Are the Cloud Providers Trying To Pick the LLM Winner? No

Since OpenAI launched ChatGPT in November 2022, dozens of LLMs and other foundation model players have emerged. Most of these models are rapid derivatives of the largest LLMs, each offering a different value proposition. Some are narrower, trained on smaller datasets; some are built for use with private domains; some are proprietary; and some are open source.

The short list of LLM/foundation model players not called OpenAI or Anthropic includes, but is not limited to, Cohere; Inflection; Llama (Meta); PaLM, BERT, and others (Google); Aleph Alpha; Mistral AI; BLOOM (BigScience); NVIDIA; Stability AI; Falcon (Technology Innovation Institute); AI21 Labs; Dolly and MPT (Databricks/MosaicML); Granite (IBM); and Midjourney. The list spans both proprietary and open source models.

AI models are proliferating, and for now, enterprises prefer to have a range of choices, including open source. With that in mind, and given that no one LLM fits all uses, it is unlikely we will see a “winner” in the LLM space for a couple of years, or necessarily a consolidation of players offering AI foundation models. So, although two of the big three cloud providers are starting to pick LLM partners to feed their own IP needs (Microsoft with OpenAI for Copilot, Amazon with Anthropic), it is likely they will partner with more LLM players. More on that in the next section.

Are the Cloud Providers Looking for the Biggest AI Compute Customers? Yes

AI workloads based on foundation models are among the most complex and expensive computational workloads. Consider that Netflix is widely considered one of the top customers of cloud computing. Business intelligence firm Intricately estimates Netflix spends about $9.6 million per month on cloud compute with AWS. According to a report by Analytics India Magazine discussed in Technext, OpenAI spends about $700,000 per day to operate ChatGPT, or roughly $21 million per month in compute costs.
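As a back-of-envelope check of that comparison, here is a short Python sketch using the article's cited estimates (the dollar figures are the third-party estimates quoted above, not measured data, and a flat 30-day month is assumed):

```python
# Third-party spend estimates cited in the article (USD).
NETFLIX_MONTHLY_AWS_SPEND = 9_600_000    # Intricately estimate, per month
OPENAI_DAILY_CHATGPT_COST = 700_000      # Analytics India Magazine estimate, per day

# Scale OpenAI's daily figure to a 30-day month.
openai_monthly = OPENAI_DAILY_CHATGPT_COST * 30

print(f"OpenAI monthly compute: ${openai_monthly:,}")  # $21,000,000
print(f"Multiple of Netflix's estimated AWS spend: "
      f"{openai_monthly / NETFLIX_MONTHLY_AWS_SPEND:.1f}x")  # 2.2x
```

On these estimates, a single foundation model operator outspends one of the cloud's flagship customers on compute by more than 2x, which is the crux of the argument that follows.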

So, it would seem AI model players are potentially the biggest cloud compute customers, and it makes sense that cloud providers are interested in handling their compute needs. Whether a given AI model player pays directly for that compute matters less if the cloud provider holds a stake in the company and shares in its profits, or gets the opportunity to serve the compute needs of the model player's own customers.

Regardless, these AI model players are buying cloud compute in some form, and they will be chased by cloud providers looking to become their exclusive compute suppliers.

Conclusions

Look for more partnerships and investments by cloud providers in AI model players. Next is likely Cohere, which currently has an exclusive cloud compute deal with Google Cloud. Google could increase its investment and exclusive tie to Cohere. Other big tie-ups could come for Inflection AI and AI21 Labs. It would seem to make sense that most AI model players will gain some leverage with cloud providers because of their potential as customers.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Cohere Launches Coral, a New AI-Powered Knowledge Assistant

Generative AI Investment Accelerating: $1.3 Billion for LLM Inflection

Generative AI War? ChatGPT Rival Anthropic Gains Allies, Investors

Author Information

Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting technology business and holds a Bachelor of Science from the University of Florida.

