The News: Datadobi StorageMAP enables customers to manage unstructured data across enterprise-scale distributed storage, from scale-out NAS to public cloud platforms.
Manage Unstructured Data: Cost, Risk, and Value with Datadobi
Analyst Take: One of the evergreen challenges in IT infrastructure is the inevitable growth of unstructured and semi-structured data. A vast amount of data is stored as files, usually on server shares and various NAS platforms but increasingly on object storage. Often, the data is of questionable or unknown value, yet deletion is never the IT team’s prerogative because they do not own the data. IT simply provides the storage location and pays the bills; consequently, optimizing unstructured data storage is vital for cost management. Cost is not the only issue: unstructured data represents a risk if IT does not know what is stored in these files, and huge, potentially invisible liability, compliance, and e-discovery risks can lurk throughout these file systems. There is also tremendous potential value in these files; they often hold much of the organization’s intellectual property (IP). However, without knowing what data is where, that potential value is impossible to realize. This intersection of cost, risk, and value is where Datadobi’s StorageMAP can help customers manage unstructured data.
Datadobi’s earlier products, DobiMigrate and DobiReplicate, addressed data movement between NAS devices, often scale-out NAS systems. StorageMAP adds orchestration capabilities, where the movement or replication of files is driven by their metadata to better manage unstructured data. Files are scanned and indexed to produce rich metadata, going well beyond the usual date information in the file system; this metadata feeds both analytics and a workflow engine that acts on the files. The workflow engine drives movement between cost and performance tiers, including offline tiers such as Amazon Web Services (AWS) Glacier for compliance retention. As part of the analytics, StorageMAP can generate chargeback or showback information, expressed either as financial cost or as environmental cost in terms of CO2 generated. The analytics also help identify redundant or obsolete data that delivers no value to the business.
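To make the idea of metadata-driven tiering concrete, here is a minimal sketch of the kind of policy a workflow engine automates; it is not Datadobi’s API or implementation, and the paths, threshold, and function names are all hypothetical. The pattern is simply: walk a share, collect per-file metadata, and relocate anything untouched beyond a threshold to a cheaper tier.

```python
from datetime import datetime, timedelta, timezone
from pathlib import Path
import shutil

# Hypothetical policy: files untouched for a year move to a cheaper tier.
ARCHIVE_AFTER = timedelta(days=365)
ARCHIVE_ROOT = Path("/mnt/archive-tier")  # assumed mount point, not a Datadobi path


def scan_share(share_root: Path):
    """Walk a file share and collect basic metadata for each file."""
    for path in share_root.rglob("*"):
        if path.is_file():
            stat = path.stat()
            yield {
                "path": path,
                "size_bytes": stat.st_size,
                "last_access": datetime.fromtimestamp(stat.st_atime, tz=timezone.utc),
            }


def tier_cold_files(share_root: Path) -> None:
    """Move files whose last access is older than the threshold to the archive tier."""
    cutoff = datetime.now(tz=timezone.utc) - ARCHIVE_AFTER
    for record in list(scan_share(share_root)):  # snapshot before moving anything
        if record["last_access"] < cutoff:
            target = ARCHIVE_ROOT / record["path"].relative_to(share_root)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(record["path"]), str(target))


if __name__ == "__main__":
    tier_cold_files(Path("/mnt/nas-share"))  # assumed source share
```

A product-grade mover also preserves permissions, verifies data integrity after the copy, and leaves a stub or index entry behind; the sketch only shows the metadata-driven decision itself.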
StorageMAP’s metadata is also vital from a risk perspective; knowing where financial or customer-identifying information is held is central to managing that risk. The workflow engine might help enforce data retention and deletion policies and log data lifecycle activities for later audits. The metadata and history information make e-discovery tasks significantly less onerous, and the extensive reporting simplifies proof of compliance.
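As a rough illustration of retention enforcement with an audit trail, the sketch below deletes files past an assumed seven-year retention period and appends a JSON record of each action. None of the paths or names come from StorageMAP, and real compliance tooling would add safeguards such as legal-hold checks before deleting anything.

```python
import json
from datetime import datetime, timedelta, timezone
from pathlib import Path

RETENTION = timedelta(days=7 * 365)          # assumed seven-year retention policy
AUDIT_LOG = Path("retention-audit.jsonl")    # assumed audit-trail location


def enforce_retention(share_root: Path) -> None:
    """Delete files past their retention period and record each action for audit."""
    cutoff = datetime.now(tz=timezone.utc) - RETENTION
    with AUDIT_LOG.open("a") as log:
        for path in list(share_root.rglob("*")):  # snapshot before deleting
            if path.is_file():
                modified = datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc)
                if modified < cutoff:
                    path.unlink()
                    log.write(json.dumps({
                        "action": "deleted",
                        "path": str(path),
                        "last_modified": modified.isoformat(),
                        "deleted_at": datetime.now(tz=timezone.utc).isoformat(),
                    }) + "\n")


if __name__ == "__main__":
    enforce_retention(Path("/mnt/nas-share"))  # assumed share path
```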
While the optimization and risk management aspects are critical, I find the application enablement side of StorageMAP more interesting. Once the scanner element has indexed an organization’s unstructured data, the workflow engine can bring newly discovered, valuable data into an artificial intelligence/machine learning (AI/ML) pipeline to extract further value.
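The sketch below illustrates that crawler-and-catalogue pattern in miniature, assuming a simple SQLite catalogue and hypothetical paths: index the files on a share, then query the catalogue for the document types a downstream pipeline should ingest. StorageMAP performs this discovery and selection at enterprise scale; the code is only meant to show the pattern.

```python
import sqlite3
from datetime import datetime, timezone
from pathlib import Path


def build_catalogue(share_root: Path, db_path: Path) -> None:
    """Crawl a file share and record each file's location, type, size, and timestamp."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS files (
               path TEXT PRIMARY KEY,
               extension TEXT,
               size_bytes INTEGER,
               modified_utc TEXT
           )"""
    )
    for path in share_root.rglob("*"):
        if path.is_file():
            stat = path.stat()
            conn.execute(
                "INSERT OR REPLACE INTO files VALUES (?, ?, ?, ?)",
                (
                    str(path),
                    path.suffix.lower(),
                    stat.st_size,
                    datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
                ),
            )
    conn.commit()
    conn.close()


def select_for_pipeline(db_path: Path, extensions: tuple[str, ...]) -> list[str]:
    """Query the catalogue for file types worth feeding into a downstream AI/ML pipeline."""
    conn = sqlite3.connect(db_path)
    placeholders = ",".join("?" for _ in extensions)
    rows = conn.execute(
        f"SELECT path FROM files WHERE extension IN ({placeholders})", extensions
    ).fetchall()
    conn.close()
    return [row[0] for row in rows]


if __name__ == "__main__":
    build_catalogue(Path("/mnt/nas-share"), Path("catalogue.db"))  # assumed paths
    documents = select_for_pipeline(Path("catalogue.db"), (".pdf", ".docx"))
    print(f"{len(documents)} documents ready for the pipeline")
```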
Very commonly, large organizations have difficulty discovering and cataloguing the information they hold, and StorageMAP simplifies this challenge. It reminds me of how the AWS Glue crawler is used to discover and index files on AWS S3 storage, only with StorageMAP you keep your data on-premises. You get a data catalogue and a workflow engine to make that data available to your applications. Controlling cost and risk is essential; maximizing the value you receive for that cost and risk is even more critical.
Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.
Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.
Other Insights from The Futurum Group:
Hammerspace Unveils Hyperscale NAS Addressing the AI/HPC Workloads
Snowflake and the AI Landscape Amid Databricks Competition
Starburst Launches Managed Icehouse Implementation
Author Information
Alastair has made a twenty-year career out of helping people understand complex IT infrastructure and how to build solutions that fulfil business needs. Much of his career has included teaching official training courses for vendors, including HPE, VMware, and AWS. Alastair has written hundreds of analyst articles and papers exploring products and topics around on-premises infrastructure and virtualization and getting the most out of public cloud and hybrid infrastructure. Alastair has also been involved in community-driven, practitioner-led education through the vBrownBag podcast and the vBrownBag TechTalks.