Embracing Data Protection-as-a-Service Without Sacrificing Cyber-Resiliency

Previously, in the second blog of our four-part series on the key tradeoffs IT Operations should weigh when comparing on-premises and cloud-based data protection operations, we covered ease of use and IT simplicity – arguably the biggest driver to the cloud. In this blog, we cover the overarching topics of security and cyber-resiliency – arguably the biggest barriers to the cloud.

The first issue to address is data placement and localization: giving up, or at least limiting, control over where data is stored and on what infrastructure it is stored (e.g., a dedicated or multi-tenant system) when moving off-premises. Especially when it comes to sensitive data, organizations have concerns – and, in many cases, legislation they must comply with – regarding data privacy and sovereignty. Simply put, data might need to remain on premises for compliance purposes. That being said, service providers are working to address these challenges, with many certifying their solutions against compliance standards such as the Federal Risk and Authorization Management Program (FedRAMP) and the Health Insurance Portability and Accountability Act (HIPAA). In addition to looking for pertinent compliance certifications, IT Operations should look for the ability to dictate where their data is stored (e.g., a specific region) and for tenant isolation.

Alongside concerns about lack of control over the data environment, moving off-premises raises concerns about reduced visibility and potentially insecure methods of data access. Especially as environments become multi- and hybrid-cloud by default, the threat landscape broadens greatly. On-premises, IT teams have full autonomy to tailor their security implementations – including access control measures – to their specific needs. Carrying the best practice of least-privilege access through to the cloud or other off-premises implementations with the same diligence and expertise is important.

On the flip side, as previously alluded to, cloud service providers have built robust security capabilities and security teams. Especially given the staffing and expertise limitations touched upon in the previous blog, this is potentially a significant value-add for IT Operations teams: it can help avoid misconfigurations, and it can help implement, maintain, and update best-practice security measures such as encryption, firewalls, intrusion detection, and environment monitoring. As threats continue to evolve, this is no small feat.

A cyber-resiliency-focused consideration for public cloud-hosted data protection infrastructure is the cross-region replication that providers offer, which effectively builds in high levels of service availability. Additionally, replication, failover, and failback for disaster recovery become easier; while people and expertise are still required to make decisions regarding the environment’s architecture, the need to purchase, deploy, and manage infrastructure is eliminated.

In a similar vein, cloud-hosted “data vault” solutions are coming into vogue – meaning they are being actively developed and promoted by vendors, and are increasingly being evaluated and used by enterprises. They offer a potential alternative to traditional tape-based solutions for physical air gapping but naturally warrant close scrutiny by IT Operations for true isolation and cyber-resiliency. I personally audited two such solutions – Cohesity FortKnox and Dell PowerProtect Cyber Recovery (which has the option to be deployed on- or off-premises) – and my resulting reports offer key criteria for IT Operations to consider when evaluating these types of solutions specifically.

Security and cyber-resiliency are top-of-mind considerations across the board for IT Operations teams, especially given the onslaught of ransomware and data extortion. In addition to weighing these tradeoffs between on- and off-premises solutions, data immutability, scanning for anomalous activity that could indicate an attack, and the ability to monitor and audit data usage for suspicious behavior are table-stakes capabilities to consider regardless of the specific implementation.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

How to Migrate Data Protection to the Cloud and Not Regret It

Streamlining Operations with Cloud Data Protection

Spectrum Enterprise and Cisco Give Business Cybersecurity Protection Ease and SASE Appeal
