AWS, Microsoft, and Google Cloud: Tying Up LLMs

The News: On September 25, Amazon announced what it is calling a strategic collaboration with foundation model player Anthropic. Under the agreement, Anthropic will use Amazon Web Services (AWS) as its primary cloud provider; will use AWS Trainium and Inferentia chips to build, train, and deploy future foundation models; and will provide Amazon Bedrock customers with access to future generations of its foundation models. Amazon will invest up to $4 billion in Anthropic, take a minority ownership position in the company, and gain access to Anthropic models to incorporate into Amazon projects. Read the full details of the Amazon and Anthropic announcement on the Anthropic website.

Amazon’s $4 billion investment in Anthropic follows Microsoft’s $10 billion investment in OpenAI announced in January 2023.

Analyst Take: It would be logical to think there is some strategic positioning going on among hyperscalers, particularly the three big cloud providers – AWS, Microsoft, and Google Cloud – when it comes to partnering with emerging large language models (LLMs), but it might not be for the reasons you think.

Are the Cloud Providers Trying To Pick the LLM Winner? No

Since OpenAI launched ChatGPT in November 2022, dozens of LLMs and other foundation model players have emerged. Most of these LLMs are rapid variations on the biggest models, each offering a distinct value proposition. Some are narrower, trained on smaller datasets; some are built to be used with private domains; some are proprietary; and some are open source.

The short list of LLM/foundation model players not called OpenAI or Anthropic includes, but is not limited to, Cohere; Inflection; Llama (Meta); PaLM, BERT, and others (Google); Aleph Alpha; Mistral.AI; BLOOM (BigScience); NVIDIA; Stability; Falcon (Technology Innovation Institute); AI21 Labs; DOLLY and MPT (Databricks, MosaicML); Granite (IBM); and Midjourney. The field spans both proprietary and open source models.

AI models are proliferating, and for now, enterprises prefer to have a range of choices, including open source. With that in mind, and given that no one LLM fits all uses, it is unlikely we will see a “winner” in the LLM space for at least a couple of years, or even a consolidation of players offering AI foundation models. So, although two of the big three cloud providers are starting to pick LLM partners to feed their own IP needs (Microsoft Copilot, Amazon with Anthropic), it is likely they will partner with more LLMs. More on that in the next section.

Are the Cloud Providers Looking for the Biggest AI Compute Customers? Yes

AI workloads based on foundation models are among the most complex and expensive computational workloads. Consider that Netflix is widely considered one of the top customers of cloud computing. Business intelligence firm Intricately estimates Netflix spends about $9.6 million per month on cloud compute with AWS. According to a report by Analytics India Magazine discussed in Technext, OpenAI spends about $700,000 per day to operate ChatGPT, or roughly $21 million per month in compute costs.
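The back-of-envelope math behind those figures can be sketched as follows (a simple illustration using the spend estimates cited above; the 30-day month is an assumption for the conversion):

```python
# Rough comparison of monthly cloud compute spend, using figures cited
# in the article. The 30-day month is an assumption for the conversion.
OPENAI_DAILY_SPEND = 700_000       # reported ChatGPT operating cost per day (USD)
NETFLIX_MONTHLY_SPEND = 9_600_000  # Intricately's estimate of Netflix's monthly AWS spend (USD)

openai_monthly = OPENAI_DAILY_SPEND * 30
ratio = openai_monthly / NETFLIX_MONTHLY_SPEND

print(f"OpenAI monthly compute: ${openai_monthly:,}")   # $21,000,000
print(f"Multiple of Netflix's spend: {ratio:.1f}x")      # 2.2x
```

By this rough estimate, a single AI model player's compute bill is more than double that of one of the cloud's best-known customers, which underlines why hyperscalers are courting these companies.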

So, AI model players are potentially the biggest cloud compute customers, and it makes sense that cloud providers are interested in handling their compute needs. Even if, in practice, an AI model player is not paying directly for the compute, the economics still work for the cloud provider when it holds a stake in the company and shares in its profits, or when the partnership brings it the compute business of the AI model player's customers.

Regardless, these AI model players are buying cloud compute in some form, and they will be chased by cloud players that are looking to be exclusive cloud compute providers.

Conclusions

Look for more partnerships and investments by cloud providers in AI model players. Next is likely Cohere, which currently has an exclusive cloud compute deal with Google Cloud. Google could increase its investment and exclusive tie to Cohere. Other big tie-ups could come for Inflection AI and AI21 Labs. It would seem to make sense that most AI model players will gain some leverage with cloud providers because of their potential as customers.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Cohere Launches Coral, a New AI-Powered Knowledge Assistant

Generative AI Investment Accelerating: $1.3 Billion for LLM Inflection

Generative AI War? ChatGPT Rival Anthropic Gains Allies, Investors

Author Information

Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting technology business and holds a Bachelor of Science from the University of Florida.
