Google Cloud Announces Hugging Face Partnership

The News: On January 25, Google Cloud announced a new partnership with Hugging Face that enables Hugging Face developers to choose Google Cloud infrastructure for all Hugging Face services.

Here are the key details:

  • Developers can train, tune, and serve Hugging Face models with Google Cloud’s Vertex AI, using Google Cloud’s end-to-end MLOps services to build generative AI applications
  • Access to Google Kubernetes Engine (GKE) deployments means Hugging Face developers can train, tune, and serve their workloads with “do it yourself” infrastructure and scale models using Hugging Face-specific Deep Learning Containers on GKE
  • Access to Google’s Cloud TPU v5e AI accelerators
  • Google Cloud joins Amazon Web Services (AWS) and Microsoft Azure as an AI model training, tuning, and serving option for Hugging Face developers

Read the Google Cloud Hugging Face partnership press release here.

Analyst Take: Google Cloud becomes the latest hyperscaler on which Hugging Face developers can train, tune, and serve their AI models. Developers could already do so with AWS or Microsoft Azure. Hugging Face, which has been called “the GitHub of open source AI,” hosts more than 300,000 models and is used by more than 50,000 organizations. What will the impact of the Google Cloud-Hugging Face partnership be? Here are my thoughts.

Hugging Face Is the Competitive Marketplace for AI Platforms and Compute

Google Cloud, AWS, and Microsoft Azure each offer competitive, comprehensive AI platforms and AI compute options, but enterprise customers choose among these players for a variety of reasons, many of which may have nothing to do with the AI capabilities of a given platform. With the addition of Google Cloud, Hugging Face has become the de facto marketplace for AI platforms and compute: developers can easily switch providers to test and experiment without a significant commitment. Developers and enterprises benefit, and the hyperscalers sharpen their AI offerings.

Hugging Face Is the Hottest Testing Ground for AI Compute

Google Cloud will join not only AWS and Microsoft Azure but also Intel, AMD, and NVIDIA in the hottest testing ground for AI compute. The advantage for all of these vendors is the chance to rapidly see how their hardware performs on real AI workloads. Note that Google Cloud said Hugging Face developers will have access to its Cloud TPU v5e AI accelerators. Given the lack of friction, Google Cloud now has the opportunity to see how its latest AI accelerators perform across a wide range of use cases and applications, yielding insights it can use to iterate on its AI platform and compute offerings.

Google Is Increasingly Embracing Open Source

While Google offers some proprietary AI models (such as Gemini), it, along with AWS, Microsoft, and IBM, is increasingly embracing open source AI models as well. For the major hyperscalers, it is not a choice between proprietary models and open source models; the expectation is that enterprises want, and will use, both when it makes sense to do so.

Conclusion

By adding Google Cloud as a partner, Hugging Face now brings most of the major cloud providers (IBM being the exception) to developers from a single platform, effectively accelerating AI development and learning. Enterprises will iterate on their AI learnings, applications, and workloads more rapidly. AI compute vendors will iterate on their AI development and compute services more quickly. This is a win-win for enterprise AI adoption.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

Why the Launch of LLM Gemini Will Underpin Google Revenue

With Azure AI Studio, Microsoft Contends for Top Dev Platform

Google Cloud Next: Vertex AI Heats Up Developer Platform Competition

Author Information

Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting the technology business. He holds a Bachelor of Science from the University of Florida.

