KubeCon and the Cloud Native Computing Foundation (CNCF) were all about AI and how the community will support and drive this next technology wave. Priyanka Sharma, Executive Director of the CNCF, led off the keynote by highlighting how end users developed a sense of technical capability during the digital transformation phase, and that experience is now spilling into the AI age. The challenge, she accurately observed, will be going from concept to production. In her view, bringing in opinionated solutions (defined stacks and AI offerings) risks building a walled garden.
In a sense, Priyanka is accurate. New approaches, tools, models, compilers, frameworks, and more are spreading rapidly in the drive to capture the business value of AI. But she also acknowledged the complexity of the environment: organizations need the freedom to switch to the latest technology without worrying about protecting their here-and-now investments. The premise is that an open market will push the boundaries and lead to more innovation. One of the resources CNCF just released is its cloud native AI reference architecture. Expect more resources from CNCF, including training.
Platform Engineering
The second major theme was platform engineering (PE), which we have written and spoken about for the past 24 months. PE is a necessary function for scaling container environments, and it becomes even more critical with AI. One of the major sponsors, NVIDIA, highlighted the requirements for optimizing the use of GPUs, discussing newer approaches such as Dynamic Resource Allocation (DRA) for managing scheduling and hop distance, especially in scale-out environments. Fault tolerance and resiliency, meaning repair at scale, remain the domain of the platform engineer, even with automation and observability, and all of these areas require tools and investment. Despite this discussion, there is still a debate about whether platform engineers and developers are the same thing. This analyst group thinks not, especially at scale.
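To make the DRA idea concrete, here is a minimal sketch of what requesting a GPU as a first-class resource claim looks like, rather than as a simple device count. This assumes the beta DRA API group (`resource.k8s.io/v1beta1`); the device class `gpu.example.com` and the image name are placeholders for illustration, not actual driver or registry names:

```yaml
# A ResourceClaim describes a device request the scheduler can reason
# about (placement, sharing, topology), unlike an opaque GPU counter.
apiVersion: resource.k8s.io/v1beta1
kind: ResourceClaim
metadata:
  name: single-gpu
spec:
  devices:
    requests:
    - name: gpu
      deviceClassName: gpu.example.com   # placeholder device class
---
apiVersion: v1
kind: Pod
metadata:
  name: inference
spec:
  resourceClaims:
  - name: gpu
    resourceClaimName: single-gpu        # bind the claim to this pod
  containers:
  - name: app
    image: registry.example.com/inference:latest  # placeholder image
    resources:
      claims:
      - name: gpu                        # container uses the claimed device
```

Because the claim is a separate API object, the scheduler and the vendor's driver can negotiate which physical device satisfies it, which is what enables the kind of topology- and hop-distance-aware placement NVIDIA discussed.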
Container Native
Continuing on this general theme, there was a discussion of AI and container native: what are the differences and similarities, and how do container-native platforms need to grow to accommodate AI? One interesting analogy was that if inferencing is the new web app, then Kubernetes is the new web server, the key distinction being the requirements for managing data (collection, cleaning, and so on). Why the analogy? Most AI environments we see are built on Kubernetes, and just as an ecosystem grew up around web apps, the same will happen as AI grows around Kubernetes. But Kubernetes and cloud-native apps were built for CPUs, not GPUs, and the platform was not designed with data scientists in mind. They do not want to bother with the YAML and other machinery presented to them. Time for more innovation.
For AI, the platform engineer is still needed, but in this case the role is called the AI engineer or, more commonly, MLOps. Basically, the role is tasked with designing and managing scale-out, distributed, heterogeneous systems, complete with the massive data pipelines to support them. Hold on, everyone: things are about to get even more complex.
All in all, we are going for a very fun ride with AI and Kubernetes, and CNCF plans to be there every step of the way.
Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.
Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.
Other Insights from The Futurum Group:
The Growth of Kubernetes and the Platform Engineering Evolution
KubeCon 2022: From DevOps to PlatformOps
Exploring NetApp Astra and Kubernetes: Innovations and Insights – Infrastructure Matters
Author Information
Camberley brings over 25 years of executive experience leading sales and marketing teams at Fortune 500 firms. Before joining The Futurum Group, she led the Evaluator Group, an information technology analyst firm, as Managing Director.
Her career has spanned all elements of sales and marketing, and she gained a 360-degree view of addressing challenges and delivering solutions by crossing the boundary between sales and channel engagement at large enterprise vendors and at her own 100-person IT services firm.
Camberley has provided Global 250 startups with go-to-market strategies, created the new market category “MAID” as Vice President of Marketing at COPAN, and led a worldwide marketing team, including channels, as a VP at VERITAS. At GE Access, a $2B distribution company, she served as VP of a new division, growing it from $14 million to $500 million, and she later built a successful 100-person IT services firm. Camberley began her career at IBM in sales and management.
She holds a Bachelor of Science in International Business from California State University, Long Beach, and executive certificates from Wellesley and the Wharton School of Business.