
Oracle, Microsoft, and OpenAI Form GenAI Power Trio

The News: Oracle, Microsoft, and OpenAI are partnering to extend the Microsoft Azure AI platform to Oracle Cloud Infrastructure (OCI) to provide additional capacity for OpenAI. Read the full press release on the Oracle website.

Analyst Take: Oracle Cloud Infrastructure (OCI) has emerged as the fourth hyperscaler, significantly expanding its footprint in the cloud market. The global surge in AI adoption and the growing demand for large language models (LLMs) provide substantial tailwinds for OCI’s growth. Oracle’s deep integration of AI capabilities, such as generative AI and computer vision, into its Gen2 AI infrastructure highlights its readiness to meet the evolving needs of businesses. Moreover, Oracle’s stronghold in enterprise data, coupled with its multi-cloud agility, positions the company advantageously to capitalize on the burgeoning AI ecosystem.

OpenAI selected OCI to extend the Microsoft Azure AI platform, enlarging OpenAI’s overall cloud capacity. OpenAI is the AI research and development company behind ChatGPT, which provides generative AI (GenAI) to more than 100 million users every month. OpenAI’s high-profile endorsement aligns with the AI ecosystem’s growing demand for ever-larger LLMs, directly invigorating demand for Oracle Gen2 AI infrastructure and its price-performance advantages.
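
From a developer’s standpoint, that capacity expansion is largely transparent: requests to OpenAI’s hosted models look the same regardless of which cloud supplies the underlying compute. Below is a minimal sketch using the OpenAI Python client; the model name and prompt are illustrative placeholders, not details from the announcement.

```python
# Minimal sketch of a chat completion request with the OpenAI Python client.
# The model name and prompt are illustrative placeholders; the request pattern
# does not change based on which cloud supplies the underlying capacity.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # example model name, not taken from the announcement
    messages=[
        {
            "role": "user",
            "content": "Summarize the benefits of multi-cloud AI infrastructure.",
        }
    ],
)

print(response.choices[0].message.content)
```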

As such, OpenAI will join thousands of players across industries globally that run their AI workloads on OCI AI infrastructure. Adept, Modal, MosaicML, NVIDIA, Reka, Suno, Together AI, Twelve Labs, xAI, and others use OCI Supercluster to train and run inference on next-generation AI models.

Oracle Gen2 AI infrastructure includes GenAI, computer vision, and predictive analytics throughout its distributed cloud. Gen2 OCI uses high-speed remote direct memory access (RDMA) networking to connect NVIDIA GPUs in huge superclusters that can efficiently train LLMs. As such, OCI Supercluster can scale up to 64K NVIDIA Blackwell GPUs or GB200 Grace Blackwell Superchips connected by ultra-low-latency RDMA cluster networking and a choice of HPC storage.
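
For context on how that capacity is consumed programmatically, the sketch below uses the OCI Python SDK to launch a single GPU bare-metal node. The shape name, image, and OCIDs are placeholders, and production-scale superclusters are normally provisioned through cluster networks and instance pools rather than one-off instance launches.

```python
# Hedged sketch: launching a single GPU bare-metal node with the OCI Python SDK.
# The shape name, image, and OCIDs are placeholders; RDMA-connected superclusters
# are normally provisioned via cluster networks and instance pools, not one-off calls.
import oci

config = oci.config.from_file()  # defaults to ~/.oci/config
compute = oci.core.ComputeClient(config)

details = oci.core.models.LaunchInstanceDetails(
    availability_domain="Uocm:PHX-AD-1",              # placeholder availability domain
    compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    display_name="genai-training-node",
    shape="BM.GPU.H100.8",                            # example GPU bare-metal shape
    source_details=oci.core.models.InstanceSourceViaImageDetails(
        image_id="ocid1.image.oc1.phx.example",       # placeholder GPU image OCID
    ),
    create_vnic_details=oci.core.models.CreateVnicDetails(
        subnet_id="ocid1.subnet.oc1.phx.example",     # placeholder subnet OCID
    ),
)

instance = compute.launch_instance(details).data
print(instance.id, instance.lifecycle_state)
```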

RDMA networking allows one computer in the network to access the memory of another without involving the remote computer’s CPU. As a result, massive amounts of data can be moved between machines extremely fast, especially relative to conventional networks. OCI’s speed and cost advantages are why vendors such as Cohere, NVIDIA, xAI, and now OpenAI use it to train their LLMs.

Oracle has already demonstrated its GenAI acumen with the general availability of its OCI Generative AI offering as well as OCI Generative AI Agents and OCI Data Science AI Quick Actions. The OCI Generative AI service is a fully managed service that integrates LLMs (e.g., OpenAI, Cohere, and Meta Llama 2) to address a wide range of business use cases. It includes multilingual capabilities that support over 100 languages, an improved GPU cluster management experience, and flexible fine-tuning options. Customers can use the OCI Generative AI service in the Oracle Cloud and on-premises through OCI Dedicated Region.
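
For illustration, a text-generation call against the OCI Generative AI service through the OCI Python SDK might look like the sketch below; the service endpoint, model ID, and compartment OCID are placeholders, and the exact request classes can vary by SDK version.

```python
# Hedged sketch: text generation against the OCI Generative AI service using
# the OCI Python SDK. The endpoint, model ID, and compartment OCID are
# placeholders, and class names may differ across SDK versions.
import oci

config = oci.config.from_file()
client = oci.generative_ai_inference.GenerativeAiInferenceClient(
    config=config,
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",  # example region
)

inference_request = oci.generative_ai_inference.models.CohereLlmInferenceRequest(
    prompt="List three considerations for training LLMs on RDMA-connected GPU clusters.",
    max_tokens=300,
    temperature=0.7,
)

details = oci.generative_ai_inference.models.GenerateTextDetails(
    compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    serving_mode=oci.generative_ai_inference.models.OnDemandServingMode(
        model_id="cohere.command",  # example hosted model ID
    ),
    inference_request=inference_request,
)

print(client.generate_text(details).data)
```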

Oracle and Microsoft Reinforce Multi-cloud and GenAI Credentials

From our view, Oracle’s strategic focus on enabling multi-cloud agility strengthens the company’s competitive position in the unfolding multi-cloud era, especially as enterprise customers increase their demand for streamlined multi-cloud interconnectedness and capabilities. Oracle is committed to driving interconnect latencies down to the point where two data centers effectively act as one, minimizing latency issues and providing a one-cloud experience to customers.
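
As a rough way to sanity-check that kind of claim from the application side, round-trip connection latency between paired endpoints can be probed with nothing beyond the Python standard library; the hostnames below are hypothetical placeholders, not actual Oracle or Azure endpoints.

```python
# Toy latency probe: averages TCP connect time to two endpoints, one per cloud,
# as a crude proxy for cross-cloud interconnect latency. Hostnames and ports are
# hypothetical placeholders, not real Oracle or Azure endpoints.
import socket
import time

ENDPOINTS = {
    "oci-region": ("db.example-oci.internal", 1521),      # placeholder
    "azure-region": ("app.example-azure.internal", 443),  # placeholder
}


def connect_latency_ms(host: str, port: int, attempts: int = 5) -> float:
    """Average TCP connect time in milliseconds over several attempts."""
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass
        total += (time.perf_counter() - start) * 1000
    return total / attempts


if __name__ == "__main__":
    for name, (host, port) in ENDPOINTS.items():
        print(f"{name}: ~{connect_latency_ms(host, port):.1f} ms TCP connect")
```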

Oracle’s multi-cloud interconnect approach is one that all hyperscalers and most cloud service providers should adopt. Business customers will continue to drive demand to use whatever service they want in the cloud best suited for any given workload. As such, AWS, Google Cloud, and Azure should all connect to other clouds in the same manner as interstate highways interconnect motor vehicle traffic.

Fully validating Oracle’s multi-cloud philosophy is the coinciding announcement that Oracle and Google Cloud entered a partnership that gives customers the choice to combine OCI and Google Cloud technologies to help accelerate their application migrations and modernization. Specifically, Google Cloud’s Cross-Cloud Interconnect will be initially available for customer onboarding in 11 global regions, allowing customers to deploy general-purpose workloads with no cross-cloud data transfer charges.

Both companies will jointly go to market with Oracle Database@Google Cloud, with the prime objective of benefiting enterprises worldwide and across multiple industries, including financial services, healthcare, retail, manufacturing, and more. Oracle’s Real Application Clusters (RAC) and Autonomous Database can be paired with services from Google Cloud, Azure, and AWS according to customer needs, including interconnect assurances with Azure and now Google Cloud.
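
From the application’s perspective, that pairing is straightforward: code running in Google Cloud, Azure, or AWS connects to Autonomous Database much as it would anywhere else. A minimal sketch using the python-oracledb driver follows; the credentials and connect string are placeholders, and Autonomous Database typically also requires TLS or wallet configuration.

```python
# Hedged sketch: an application tier in Google Cloud, Azure, or AWS connecting
# to Oracle Autonomous Database with the python-oracledb driver. The user,
# password, and DSN are placeholders; Autonomous Database typically requires
# TLS or wallet configuration in addition to what is shown here.
import oracledb

connection = oracledb.connect(
    user="app_user",                      # placeholder
    password="example-password",          # placeholder; use a secrets manager in practice
    dsn="mydb_high.adb.oraclecloud.com",  # placeholder connect string
)

with connection.cursor() as cursor:
    cursor.execute("SELECT sysdate FROM dual")
    print(cursor.fetchone())

connection.close()
```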

Of interest is the question of OCI supporting Google’s Bard/Gemini offering, emulating its new OpenAI/Azure relationship. For now, it is not a specific component of the Google Cloud alliance, although OCI and Google Cloud indicated the eventual possibility of adding Google’s chatbot and related portfolio capabilities akin to the OpenAI/Azure implementation. Ultimately, customer demand will determine such an outcome.

Key Takeaways and Looking Ahead: Oracle, Microsoft, OpenAI Make GenAI Multi-Cloud Friendly

Oracle’s strategic positioning in the AI landscape is reinforced by its robust Gen2 AI infrastructure and key partnerships with leading AI entities such as OpenAI. These collaborations enhance Oracle’s capabilities to support and scale advanced AI workloads, leveraging its ultra-low-latency RDMA networking and extensive GPU superclusters. By integrating multi-cloud interoperability, Oracle ensures that enterprises can seamlessly deploy AI solutions across various cloud environments, maximizing flexibility and performance.

The ability to combine Oracle’s enterprise data management expertise with cutting-edge AI technologies makes it a formidable player in the hybrid multi-cloud world. As businesses increasingly adopt AI-driven solutions, Oracle’s comprehensive offerings and strategic alliances position it to lead in delivering innovative and efficient AI services.

Overall, we believe the collaboration with OpenAI to extend the Microsoft Azure AI platform to OCI validates that Oracle Gen2 AI infrastructure, underscored by RDMA-fueled innovation, can support and scale the most demanding GenAI/LLM workloads with immediacy. OCI’s multi-cloud advances underline how combining services from multiple clouds can enable customers to optimize cost, performance, and functionality as they modernize their databases and applications.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

Oracle Database 23ai: Taking Enterprise AI to the Next Level

Oracle Fiscal 2024 Q3: Results Fueled by Cloud and Generative AI Surge

Oracle Generative AI: Advancing the Frontier of Enterprise Innovation

Image Credit: Oracle

Author Information

Ron is an experienced, customer-focused research expert and analyst, with over 20 years of experience in the digital and IT transformation markets, working with businesses to drive consistent revenue and sales growth.

Ron holds a Master of Arts in Public Policy from University of Nevada — Las Vegas and a Bachelor of Arts in political science/government from William and Mary.

Steven engages with the world’s largest technology brands to explore new operating models and how they drive innovation and competitive edge.
