Cohesity Introduces Turing, Highlighting Data Protection’s Role in AI

The News: Cohesity introduces Turing, a product unit dedicated to integrating with AI models. Currently, Google Cloud Vertex AI and Microsoft Azure OpenAI are supported. See the full Press Release from Cohesity here.


Analyst Take: Cohesity’s introduction of Turing, its product unit focused on integrating with AI tools and organizing its capabilities for facilitating a secure, controlled, and quality data pipeline for AI, is well-timed given that artificial intelligence (AI) is all the rage these days. That said, it is important to remember that any result generated by AI is only as good as the data that underpins it: that data must be complete and of high quality, protected, and compliant with regional and industry-specific data privacy legislation.

The Role of Data Protection is Critical

With this in mind, data protection has an important role to play, not only in facilitating the availability of data (especially given the rampant threat of cyberattacks such as ransomware), but also in allowing data to be indexed and classified for collation across hybrid and multicloud networks of applications and infrastructure. There has never been more opportunity for dark, redundant, and obsolete data to accumulate, and threat landscapes have never been wider or more diverse.

However, working across this fragmented and heterogeneous landscape is difficult for IT Operations. This is especially true when we factor in the tall order of gaining visibility into, and controlling access to, the full range of cloud resources in use across today’s typical enterprise.

What Cohesity is Targeting with Turing

This is a challenge that data protection solutions can help with, and it is the one Cohesity is targeting with Turing, its new product unit focused on driving how Cohesity works with AI tools and organizing its capabilities for facilitating a secure, controlled, and quality data pipeline for AI. Specifically, Cohesity is using its distributed file system to provide a workflow for creating a unified pipeline of data that is clean, high quality, and ready for AI, no matter where it is stored. Its global indexing and rapid search capabilities help when it comes to identifying, tracking, and quickly locating applicable data. Additionally, Cohesity is building integration points with customers’ desired AI tools via APIs.
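To make the workflow described above concrete, the following is a minimal, purely illustrative sketch of the general pattern: data copies from many sources are ingested into a global index, searched, and the matches are shaped into a payload for a downstream AI tool. All names here (Record, GlobalIndex, build_ai_payload) are hypothetical stand-ins, not Cohesity or cloud-provider APIs.

```python
# Illustrative sketch only: an in-memory "global index" over data copies,
# with search results handed off to an AI integration point.
from dataclasses import dataclass, field

@dataclass
class Record:
    path: str    # where the copy lives (any cloud or on-prem source)
    tags: set    # classification labels, e.g. {"finance", "pii"}
    text: str    # indexed content

@dataclass
class GlobalIndex:
    records: list = field(default_factory=list)

    def ingest(self, record: Record) -> None:
        self.records.append(record)

    def search(self, keyword: str) -> list:
        # Rapid-search stand-in: keyword match against indexed content.
        return [r for r in self.records if keyword.lower() in r.text.lower()]

def build_ai_payload(records: list) -> list:
    # Integration point: shape clean, located data for a downstream AI tool
    # (a real pipeline would call e.g. Vertex AI or Azure OpenAI here).
    return [{"source": r.path, "labels": sorted(r.tags), "content": r.text}
            for r in records]

index = GlobalIndex()
index.ingest(Record("s3://backups/q3-report.txt", {"finance"}, "Q3 revenue summary"))
index.ingest(Record("azure://vm-42/notes.txt", {"ops"}, "patching schedule"))
payload = build_ai_payload(index.search("revenue"))
```

The point of the sketch is the separation of concerns: indexing and search operate over all sources uniformly, while the AI integration is a thin layer at the end, which is broadly the shape Cohesity describes.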

At the end of the day, it is important to remember that AI for AI’s sake does not add value. As an industry, we are investing in generative AI capabilities to move the needle for the business. Over the long term, this will yield use cases such as taking ransomware detection to the next level: AI could be used to more quickly deliver insights that incorporate business context, for example flagging when sensitive finance data was impacted so it can be addressed as an immediate priority. It can also yield use cases such as cognitive search for eDiscovery, which factors in the user’s intent to deliver more intuitive, insightful, and accurate responses.
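The ransomware example above, prioritizing impacted data by business context, can be sketched as a toy ranking function. The sensitivity labels and scores below are hypothetical, chosen only to show the idea of surfacing sensitive finance data first.

```python
# Illustrative only: rank impacted datasets by business sensitivity so the
# most critical data is addressed first. Labels and weights are hypothetical.
SENSITIVITY = {"finance": 3, "pii": 3, "ops": 1, "public": 0}

def prioritize(impacted: list) -> list:
    """Sort impacted datasets so the most sensitive come first."""
    return sorted(
        impacted,
        key=lambda d: max((SENSITIVITY.get(t, 0) for t in d["tags"]), default=0),
        reverse=True,
    )

alerts = [
    {"path": "/share/wiki", "tags": ["public"]},
    {"path": "/finance/ledger.db", "tags": ["finance"]},
    {"path": "/ops/runbooks", "tags": ["ops"]},
]
ordered = prioritize(alerts)
# ordered[0]["path"] == "/finance/ledger.db"
```

In a real deployment, the classification tags would come from the indexing and classification work described earlier, and an AI layer might infer context that static labels miss; this sketch only captures the prioritization step.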

As we move toward this new era and begin to implement tools that will ultimately make these use cases a reality, an important part of IT Operations’ role, and the role of data protection, will be establishing oversight and strong access controls to maintain least-privilege data access and to ensure that compliance requirements are not violated.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Cohesity DataProtect Product Review

The Cost of The Next Big Thing – Artificial Intelligence

NIST Launches the Trustworthy & Responsible Artificial Intelligence Resource Center

Author Information

Krista Case

Krista Case brings over 15 years of experience providing research and advisory services and creating thought leadership content. Her vantage point spans technology and vendor portfolio developments; customer buying behavior trends; and vendor ecosystems, go-to-market positioning, and business models. Her work has appeared in major publications including eWeek, TechTarget and The Register.
