Oracle Pioneers the Integration of Generative AI in Cloud Computing

The News: At Oracle CloudWorld, the company announced limited availability of the new Oracle Cloud Infrastructure (OCI) Generative AI service, supporting the use of large language models (LLMs), built in collaboration with Cohere. Workloads in the service run on dedicated infrastructure within Oracle’s new NVIDIA Supercluster architecture. With the OCI Generative AI service, users can add AI to their applications via application programming interface (API) and, leveraging new features of the just-released Oracle Database 23c, securely include proprietary data with pre-trained LLMs for greater accuracy. See the complete Press Release covering the OCI Generative AI service at Oracle.com.

Analyst Take: Oracle has been busy this year infusing generative AI into everything it does, from database software to services and applications and industry solutions to Oracle’s own operations. Central to every element of this effort is the implementation of LLMs on OCI, the company’s global public cloud service.

The partnership with Cohere, a leader in the LLM space, brought the necessary data science, models, and machine learning (ML) development processes to Oracle. Oracle has the data, infrastructure, and cloud service delivery capabilities needed to implement AI at scale. Oracle customers with access to the OCI Generative AI service now have a much easier on-ramp to LLMs:

  • API access to the service enables familiar, straightforward addition of AI to existing applications.
  • The AI system is dedicated, ensuring that usage and data remain proprietary to the customer.
  • Privacy and security features are built into the infrastructure and Oracle Database, alleviating the need for additional security measures or error-prone human processes.

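To make the API on-ramp concrete, here is a minimal sketch of how an application might assemble a text-generation request for a service of this kind. Oracle has not published the contract reproduced here: the endpoint URL, field names (`model`, `prompt`, `maxTokens`, `temperature`), and model name are illustrative assumptions, not the OCI Generative AI service's actual API.

```python
import json

# Hypothetical endpoint for a hosted text-generation service; the real
# OCI Generative AI endpoint and path will differ.
GENERATE_ENDPOINT = "https://inference.generativeai.example.oci.example.com/generate"

def build_generation_request(prompt: str,
                             model: str = "cohere-command",
                             max_tokens: int = 256,
                             temperature: float = 0.3) -> str:
    """Serialize a generation request body as JSON.

    All field names and the model identifier are illustrative
    assumptions, not Oracle's published schema.
    """
    body = {
        "model": model,            # hosted Cohere model (name is illustrative)
        "prompt": prompt,          # the instruction or question to the LLM
        "maxTokens": max_tokens,   # cap on the length of the generated reply
        "temperature": temperature # lower values favor more deterministic output
    }
    return json.dumps(body)

# An application would POST this body to the service endpoint with its
# OCI credentials and read the generated text from the JSON response.
payload = build_generation_request("Summarize last quarter's support tickets.")
```

The point of the sketch is the shape of the integration: a plain HTTPS call with a JSON body, which is why adding generative AI to an existing application through such a service is closer to wiring in any other REST dependency than to standing up ML infrastructure.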
Though currently only in limited availability—and we note, also limited in global rollout by the same supply shortages experienced everywhere in AI deployments—this service promises to be a major factor in AI adoption in the enterprise. OCI has a large footprint, with considerable infrastructure deployed in regions worldwide as well as a deep service catalog extending from infrastructure as a service (IaaS) to development and integration platforms to software as a service (SaaS). Thanks to the OCI Generative AI service, all of these are rapidly gaining AI capabilities in both their management and feature set, and Oracle customers will be able to connect existing applications to it as well as build new ones around it as quickly as the company can manage.

Go Ahead, Light Your Hair on Fire

In his Oracle CloudWorld keynote address, Oracle Founder, Executive Chairman, and CTO Larry Ellison called generative AI the biggest movement in the history of information technology (IT). That is no small statement from a person who has seen, if not himself been part of, almost every such movement. But he is right. As he noted, only 10 months have passed since the release of ChatGPT, built on GPT-3.5. For the past 9 months, AI has been the dominant topic of every IT conversation and strategy session. Cloud computing, the last great movement, took 5 years to become dominant.

Ellison also noted that last year's Oracle CloudWorld made almost no mention of AI: almost no keynotes, roadmaps, breakout sessions, or booth messages incorporated AI, let alone made it a cornerstone. What Oracle has achieved in the past 9 months in completely reorienting its corporate, product, and customer strategy around generative AI is remarkable and singular, and a flare for the industry. One may already be tempted to be weary of AI, AI, AI everywhere, but we are here to tell you that if a $312 billion company can light its hair on fire and reorient everything, you can, too.

Alongside some comparatively small but critical updates in Oracle Database 23c, OCI is central and foundational to this transformation. It delivers the base infrastructure with the necessary architecture, performance, and efficiency profile; hosting for the LLMs from Cohere; integration with Oracle's services, customer applications, and, most importantly, masses of customer data; and the inherent, managed security required to maintain privacy and accountability, a justifiable customer concern when it comes to AI. This transformation is a model for enabling AI everywhere, limited only by classic market friction and the escalating challenges of data ingestion and migration.

Looking Ahead

Oracle’s recent announcement about the limited availability of its new OCI Generative AI service is a significant development that warrants comparative analysis. Oracle’s partnership with Cohere, a recognized player in the LLM space, indicates a well-thought-out strategic alignment, leveraging both Oracle’s robust cloud infrastructure and Cohere’s specialized ML capabilities. The move is indicative of Oracle’s shift toward becoming a more AI-centric organization, which parallels the broader industry’s focus on implementing AI in versatile applications.

Although Google Vertex AI offers end-to-end ML for deploying and scaling models easily, it is typically geared toward data scientists and ML engineers. ChatGPT, from Microsoft-backed OpenAI, is in contrast renowned for its language-generation capabilities but is often implemented as a piece of a larger service architecture. Oracle's OCI Generative AI service aims to create a more seamless on-ramp for enterprises to incorporate AI into existing ecosystems. The dedicated infrastructure and API access offer a lower barrier to entry, while data and application security features are tightly woven into the fabric of Oracle Database 23c. This positions Oracle's service as a compelling option for businesses looking for turnkey AI solutions that can scale and integrate easily into existing IT architectures.

In summary, Oracle is making a calibrated move to infuse AI across its product and service offerings, a strategy that not only aligns the company with current trends but also places Oracle in direct competition with other tech giants such as Amazon Web Services (AWS), IBM, Google, and Microsoft. The emphasis on easier adoption, dedicated resources, and embedded security could give Oracle a differentiated edge in a marketplace that is becoming increasingly crowded with AI-based solutions.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

Oracle Database Analyst Summit: Powering the Multi-Cloud Era and Liberating Developers

Oracle Fiscal Q4 and FY 2023 Results: Oracle Showcases Cloud and AI Mettle in Delivering Record Full-Year Revenue

Oracle FY 2024 Q1: Solid Results Bolstered by IaaS Gain

Author Information

Steven engages with the world’s largest technology brands to explore new operating models and how they drive innovation and competitive edge.

Guy is the CTO at Visible Impact, responsible for positioning, GTM, and sales guidance across technologies and markets. He has decades of field experience describing technologies, their business and community value, and how they are evaluated and acquired. Guy’s specialty areas include cloud, DevOps/cloud-native/12-factor, enterprise applications, Big Data, governance-risk-compliance, containerization, virtualization, HPC, CPUs-GPUs, and systems lifecycle management.

Guy started his technology career as a research director for technology media company Ziff Davis, with stints at PC Magazine, eWeek, and CIO Insight. Prior to joining Visible Impact, he worked at Dell, including postings in marketing, product, and technical marketing groups for a wide range of products, including engineered systems, cloud infrastructure, enterprise software, and mission-critical cloud services. He lives and works in Austin, TX.
