
Why Has Kubernetes Taken So Long to Cross the Chasm?

Understanding the Market Adoption of Kubernetes and Future Investments


The News: Container native technology and Kubernetes have been in the market for 10 years, since 2014. For many, widespread enterprise adoption has been slower than expected. In this article, we look back and forward at the dynamics bringing Kubernetes to the early majority market.


Analyst Take: I am often asked about enterprise IT adoption of container native systems. From the infrastructure perspective, some expected that by now we would have seen widespread deployments requiring persistent data infrastructure and data protection. And while my earlier blog post stated that we have arrived at the early majority phase, it will still take time. Why?

I have likened the transformation to container native environments to what we experienced in the shift from the mainframe to open systems and three-tier architecture. Migration required a rewrite of the applications and, in the process, a revision of the user experience.

Container Native vs. Virtualization

Container native adoption cannot be likened to the adoption of VMware for virtualizing the environment. Virtualization is an abstraction of the hardware and was implemented by IT operations. It did not require redesigning or reprogramming applications, though supporting technologies, such as data storage and data protection, did need to adapt to the new architecture. In the process, we saw a rush to solve I/O performance bottlenecks and management complexities and to create strategies for availability and protection. That progression was rapid, and with that recent memory, many assumed containers were on the same trajectory.

Where We Are Today

Consider where we are today: current applications require a rewrite to become container native and adapt to the Kubernetes world. We have heard numerous times that “lifting and shifting” to the cloud did not make life easier for IT. Just because an application ran in the cloud did not make it cloud native or containerized. As such, IT enterprises are working through a rationalization of their applications: what to rearchitect, what to maintain, and where to place each service for the most effective deployment. For those that have lifted and shifted, there is some talk of moving back, but for the most part, the exercise is a review of how the application can be serviced and maintained most cost-effectively.

Am I conflating container native adoption with cloud native adoption? Yes and no. Containers have been the domain of the cloud (thank you, Google, for Kubernetes). Today, we see large-scale deployments on-premises, and the number of such deployments is increasing. IT enterprises are seeing success with supported open source systems (Red Hat OpenShift, SUSE Rancher), and there is a growing set of offerings built specifically to stand up these environments (Dell APEX Cloud Platform, Hewlett Packard Enterprise (HPE) GreenLake, Nutanix, and IBM Fusion, to name a few).

Explosion of AI and Generative AI

Now we are into the explosion of AI and generative AI. These applications are built on container native platforms, so container native applications just got a huge tailwind for adoption. Still, this growth is driven by the applications, not by IT operations.

Thus, what will we see in 2024 as IT operations and developers pick up the pace for container native applications? We will see the need for:

  1. More databases and database as a service. This need is not SaaS for Oracle; rather, it is management of databases that the developer can select, with a separate IT group managing versions and software upgrades and creating an environment for easy provisioning.
  2. Better observability. This functionality enables a better understanding of how a container application executes and operates, so teams can troubleshoot and triage a poorly functioning application.
  3. Availability and resilience. Although containers are ephemeral by nature, the need for resilience remains. It will not go away, especially as complexity grows and data becomes more prevalent.
  4. Security. Need I say more? Open source can become an exposure, and cyber threats will continue their march.
  5. Data storage. As with database adoption, storage comes with the need for persistence and management (see the sketch after this list).
  6. Data protection. Once data is established as a durable asset (not just scratch or training data), preservation becomes important.
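
To make the persistence point concrete, below is a minimal sketch of the kind of visibility a platform team needs into the persistent storage that container native applications request. It assumes the official `kubernetes` Python client and a kubeconfig that can reach a cluster; it simply lists persistent volume claims and their status, and it is an illustration of the concept rather than a recommended tool.

```python
# Minimal sketch, assuming the official `kubernetes` Python client (pip install kubernetes)
# and a kubeconfig pointing at a reachable cluster. Illustrative only.
from kubernetes import client, config


def summarize_persistent_claims() -> None:
    """Print the persistent volume claims that container native apps have requested."""
    config.load_kube_config()  # reads ~/.kube/config; use load_incluster_config() inside a pod
    core = client.CoreV1Api()

    pvcs = core.list_persistent_volume_claim_for_all_namespaces()
    for pvc in pvcs.items:
        requested = (pvc.spec.resources.requests or {}).get("storage", "n/a")
        print(
            f"{pvc.metadata.namespace}/{pvc.metadata.name}: "
            f"phase={pvc.status.phase}, requested={requested}, "
            f"storage_class={pvc.spec.storage_class_name}"
        )


if __name__ == "__main__":
    summarize_persistent_claims()
```

Even a simple inventory like this highlights the operational questions behind item 5: who owns these claims, how they are sized and upgraded, and how the data behind them is protected.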

As with any new technology, an entire ecosystem is built around it as gaps and issues appear. For containers, we are still in the early stages of determining what is needed and important. It is just slower than the previous transition to virtualized environments. Maybe AI will propel us into this next dimension.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

Unlocking the Gate to Digital Transformation – Research Study

Kubecon 2023 Chicago – Kubernetes Has Crossed the Chasm

Kubecon 2023 Live from Chicago – Infrastructure Matters, Episode 20

Author Information

Camberley Bates

Camberley brings over 25 years of executive experience leading sales and marketing teams at Fortune 500 firms. Before joining The Futurum Group, she led the Evaluator Group, an information technology analyst firm, as Managing Director.

Her career has spanned all elements of sales and marketing, and she gained a 360-degree view of addressing challenges and delivering solutions by crossing the boundary between sales and channel engagement at large enterprise vendors and at her own 100-person IT services firm.

Camberley has provided Global 250 startups with go-to-market strategies, created a new market category, “MAID,” as Vice President of Marketing at COPAN, and led a worldwide marketing team, including channels, as a VP at VERITAS. At GE Access, a $2B distribution company, she served as VP of a new division and succeeded in growing the business from $14 million to $500 million, and she built a successful 100-person IT services firm. Camberley began her career at IBM in sales and management.

She holds a Bachelor of Science in International Business from California State University – Long Beach and executive certificates from Wellesley and Wharton School of Business.
