Segment launches Data Lakes To Simplify CDP Complexity

The News: Segment, the world’s leading customer data platform (CDP), announced the launch of Data Lakes, a new data architecture product built specifically to help companies create cutting-edge customer experiences with their customer data. Flexible, affordable and easy to use, Segment Data Lakes provides companies with the foundation needed to produce advanced analytics, uncover rich customer insights, and power machine learning and AI initiatives. Read the full news on MarTechSeries.

Analyst Take: As I’ve been watching the data conversation rapidly shift more toward CDP, Segment is proving to be a company with serious ambition.

The announcement of its new Data Lakes architecture focuses on the complexities most companies face when trying to do more with their data. Essentially, the fragmented, multi-schema, distributed nature of data means that a lot of it goes untapped. The idea of Data Lakes appears to be all about simplifying those complexities.


Initially, Segment Data Lakes will be built on Amazon Web Services (AWS), but the company intends to expand support to Microsoft Azure in the coming months. I envision other clouds will follow based on customer demand and popularity.

Perhaps more interesting is the way Data Lakes works: it provides a “data warehouse” layer for users, eliminating much of the manual work for data engineers. Segment Data Lakes does this inherently, delivering one of the key benefits of a data warehouse. It breaks down what is typically unstructured data and creates a schema that is accessible via a metadata store, in this case AWS Glue Data Catalog.
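To make the metadata-store idea concrete, here is a minimal sketch of the kind of table definition a catalog such as AWS Glue Data Catalog holds. The database, table, column names, and S3 location are hypothetical illustrations, not Segment's actual schema; in practice Segment manages these entries automatically.

```python
# Hypothetical table definition for a Glue Data Catalog entry.
# Column names, event type, and S3 path are assumptions for illustration.
table_input = {
    "Name": "page_viewed",  # e.g. one table per event type (assumption)
    "StorageDescriptor": {
        "Columns": [
            {"Name": "user_id", "Type": "string"},
            {"Name": "received_at", "Type": "timestamp"},
        ],
        # Hypothetical customer-owned bucket where the raw data lives:
        "Location": "s3://my-data-lake/segment-data/page_viewed/",
    },
    # Partitioning by day and hour narrows how much data a query scans:
    "PartitionKeys": [
        {"Name": "day", "Type": "string"},
        {"Name": "hour", "Type": "string"},
    ],
}

# With boto3, such a definition could be registered via:
#   boto3.client("glue").create_table(DatabaseName="segment",
#                                     TableInput=table_input)
# (not executed here, since it requires AWS credentials)
print(table_input["Name"])
```

The key design point is that the catalog stores only metadata (schema, location, partition keys); query engines like Athena consult it to find and prune the underlying files.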

The system continuously monitors event logs to infer the schema, then uses that schema to create new tables and columns in the Glue Data Catalog. From there, data is further partitioned by day and hour to significantly reduce the amount of data that needs to be scanned, returning query results even faster. This is an important capability when working with larger, more complex data sets.
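The two steps described above, inferring a schema from raw events and deriving day/hour partition values, can be sketched as follows. This is a simplified illustration of the general technique, not Segment's implementation; the field names and type-widening rule are assumptions.

```python
from datetime import datetime, timezone

def infer_schema(events):
    """Infer a flat column -> type mapping from JSON-like event records.

    Simplified sketch: real systems handle nesting, nulls, and conflicts.
    """
    schema = {}
    for event in events:
        for key, value in event.items():
            inferred = type(value).__name__  # 'str', 'int', 'float', ...
            if key not in schema:
                schema[key] = inferred
            elif schema[key] == "int" and inferred == "float":
                schema[key] = "float"  # widen int -> float on conflict
    return schema

def partition_key(timestamp_ms):
    """Derive day/hour partition values from an epoch-millisecond timestamp."""
    ts = datetime.fromtimestamp(timestamp_ms / 1000, tz=timezone.utc)
    return {"day": ts.strftime("%Y-%m-%d"), "hour": ts.strftime("%H")}

# Hypothetical events for illustration:
events = [
    {"user_id": "u1", "event": "page_view", "value": 1},
    {"user_id": "u2", "event": "purchase", "value": 19.99},
]
print(infer_schema(events))
print(partition_key(1598486400000))
```

Pruning by the day/hour partitions is what lets a query engine skip most of the files in the lake, which is why partitioning matters so much at scale.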

Overall Impressions of Segment Data Lakes 

In the end, it is about making better use of data with less effort. Segment Data Lakes addresses both needs, allowing data engineers, teams, and organizations to:

  • Unlock richer customer insights with less effort
  • Reduce time building and maintaining their data lake
  • Optimize data storage and compute costs
  • Future-proof architecture to build the fundamentals for machine learning investments

I firmly believe that customer data is essential to delivering exceptional products and experiences, but because data lakes are difficult to build and maintain, few businesses have the architecture in place to truly make the most of it. As a data lake built specifically for customer data, I feel that what Segment plans to deliver with Data Lakes provides the foundation to unlock complex use cases like machine learning and advanced analytics, so businesses can do more with their data and power deeper personalization of the customer journey.

Futurum Research provides industry research and analysis. These columns are for educational purposes only and should not be considered investment advice in any way.

Read more analysis from Futurum Research:

Amazon Set to Invest $18 Billion in Small and Medium Business in 2020

Microsoft Wins Department of Defense JEDI Contract Award Again – For Now Anyway

Zoom Follows Blowout Quarter with Blowout Quarter

Image: Segment

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people, and tech that are required for companies to benefit most from their technology investments. Daniel is a top-five globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, the Wall Street Journal, and hundreds of other sites around the world.

A seven-time best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA holder and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.

