Red Hat Virtualization and AI Impacts on DevOps | DevOps Dialogues: Insights & Innovations

On this episode of DevOps Dialogues: Insights & Innovations, I am joined by Senior Director of Market Insights, Hybrid Platforms at Red Hat, Stuart Miniman, for a discussion on Red Hat Virtualization and AI Impacts on DevOps.

Our conversation covers:

  • Highlights of Red Hat Summit
  • Impacts of Virtualization and AI on the market
  • The addition of Lightspeed to RHEL and OpenShift, expanding on Ansible

These topics reflect ongoing discussions, challenges, and innovations within the DevOps community.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Listen to the audio here:

Or grab the audio on your favorite audio platform below:

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this webcast. The author does not hold any equity positions with any company mentioned in this webcast.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Transcript:

Paul Nashawaty: Hello and welcome to this edition of DevOps Dialogues: Insights & Innovations. My name is Paul Nashawaty. I am the Practice Lead for the App Dev practice at The Futurum Group. And today, I am joined by Senior Director of Market Insights, Hybrid Platforms at Red Hat, Stuart Miniman, for a discussion on Red Hat AI and virtualization impacts on DevOps. Stu, how are you?

Stuart Miniman: I’m doing great, Paul. Nice to chat with you.

Paul Nashawaty: Yeah. Great. It’s good to have you on the program today. We’ve done a number of sessions in the past, and it’s great to have you on this one as well. But when we look at everything that’s been going on, Red Hat Summit just happened a little bit ago. Lots of excitement. Lots of activity going on. Why don’t we start there and talk a little bit about Red Hat Summit?

Stuart Miniman: Sure. Thanks, Paul. Yeah. It’s hard to believe Red Hat Summit is about a month behind us already. So in AI time, that means everything has changed since then, right? Paul, for us, if you go back a year ago to where we were with the overall discussion of AI, we were six months into the kind of post-ChatGPT discussion. So we’ve been talking about hybrid cloud for over a decade. AI has been one of those workloads that runs on Linux, runs on containers and Kubernetes that we’ve been talking about for years and years, but really, Red Hat Summit was a way for us to fill out our portfolio from an AI standpoint, as well as really put a stake in the ground as to how we should think about open source in the AI world.

We’ve gotten feedback from many of the analysts like yourself, saying, “Hey, I’d expect Red Hat to be a little bit more vocal on this stuff.” And the feedback in general we’ve gotten is that we have a strong voice. Obviously, we’re doing a partnership with IBM Research and the like, working with all of our customers and the ecosystem out there, and that’s what Red Hat does. It’s open source. We help build the community. We help foster the collaboration and discussion of where we should be taking this as an industry.

Paul Nashawaty: Absolutely, Stu, and when I think about everything that you talked about and everything I learned when I was in Denver, talking with you and the product teams at Red Hat, I look at our research and I see where is the market going? What’s happening? Right. About nine months ago, I ran a survey that asked about running AI in production workloads, and we found about 18% of respondents said that they’re running AI in production. Then I re-ran that survey recently, within the last month or so, and found that that number jumped to 54%. So when we look at that and we look at the adoption of AI in production workloads and AI as an accelerator for businesses, it really is clearly an area where organizations are taking a vested interest. And it’s also very nice to see that vendors are in alignment with where organizations are going.

So when I look at the impacts of AI and virtualization on the market, there are a lot of changes happening. Right? There are a lot of changes across virtualization. There are a lot of changes across licensing models and deployments. When I talk to CIOs, their number one challenge and concern is application modernization. Right. They’re trying to grow and build toward their next generation of where they’re going, but they want to work with a partner that’s going to be there. You mentioned the ecosystem too, so let’s talk a little bit about that. There’s this growth. There’s this desire to modernize. There are skill gap issues. There’s AI. There’s the tool stack and all of that, but there’s also an ecosystem. So what are your thoughts and what’s Red Hat’s position around that?

Stuart Miniman: Yeah. Obviously, a pretty big topic that you’re talking about, Paul. Let me start with some of our customers and what they’re really doing out there. I had the opportunity to host a couple of panels with some of our customers, and one of them specifically talked about ecosystems. One of our partners is AI Sweden, which is a really interesting kind of government-funded collaborative in the Nordics. They have a platform, and they allow different groups from all sorts of industries in Sweden and the surrounding countries to participate and leverage the AI tooling. They started on this years ago, so before generative AI, but absolutely, they’re diving in with generative AI.

And that platform, they’ve been using OpenShift. They adopted OpenShift AI. They were really interested in some of the announcements that we made at the conference, including InstructLab and how they would be able to contribute to it. And they’ve got everything from agriculture, to education, to startups working on autonomous vehicles, and they’re actually doing collaboration between them. And what was interesting about this panel is they are a partner of ours, and we had a number of our other ecosystem partners on stage talking with them. From the model side, Stability AI, a startup, was there, and from the infrastructure side, you had Intel with their Gaudi 2 and Gaudi 3 chips, as well as Dell with the hardware to be able to provide that platform, and it kind of showed the full end to end, from the hardware, all the way through the models, to here is a collaborative that is allowing people to get their hands on and use these environments. And AI Sweden was a great one.

I had another panel in one of the AI breakfasts that we did with AGESIC, which is an agency down in Uruguay, and they also have a government-funded collaborative where you get research, public and private sector, all coming together, and in that case, it’s some of the Latin American companies coming together. So I was super excited to see just these hotbeds of innovation, where they’re lowering the bar to entry, allowing small companies and some government agencies that you might not think of as taking advantage of AI to get access to that and be able to accelerate their journey going forward.

Paul Nashawaty: Yeah. Stu, what I liked about what you were just saying is that you were talking about these case studies. You were talking about the ecosystem. And it wasn’t just the big, huge conglomerates that are out there. It’s the emerging companies too. And working with the emerging companies and kind of bringing it all together really helps with innovation. Right? It helps with filling those gaps where organizations are challenged with some of those things that we talked about, the complexity issues, the skill gap issues. What about service delivery partners? Do you have anything to add around that? I know that obviously Red Hat has a large partner ecosystem, but with service delivery partners, usually, when organizations say, “Oh, we have skill gap issues. We can’t move forward,” I advise them to work with service delivery partners. What are your thoughts there?

Stuart Miniman: Yeah. It’s interesting, Paul. One of the big announcements we made at the show was InstructLab, which is how can I actually contribute and participate? And over half of the people that attended Red Hat Summit actually got hands-on with it, did a quick 15-minute “How do I do it?” So on the one hand, we want to lower the bar to entry so much that if I have a decent laptop with a GPU in it, I can actually get my hands on it. I know you saw RHEL AI. It’s one of the new offerings that we have, and that’s basically a RHEL operating environment that includes the Granite models in it. So from a delivery standpoint, we want to be able to bring in as many people as we can that might not have the skill set, and lower that bar to entry.

But you’re absolutely right. We had a huge focus at the show on how we do enablement. AI is still a little bit emerging as to who has the skill sets and who can drive us along. One of the places we’re having a large discussion with our service delivery partners is around the virtualization opportunities. You mentioned it in the warmup here. Everybody wants to do AI. They might not have new budget or new headcount for it, but one of the top priorities from the C-suite is make sure you can take advantage of AI, because if you don’t, our competition will, and there are those opportunities. But from a virtualization standpoint, there’s a large percentage of our customer base that is having to reevaluate what they’re doing based on some of the things that have happened recently in the industry. And the channel partners, the GSIs, and consulting partners absolutely are fully engaged on that opportunity.

Part of it, I don’t like to look at those as two separate things, Paul. It’s many of the things that you were doing, and you’re saying, “Hey, if I thought my job was being a virtualization admin, I’ve probably been looking at what the future of that was going to be anyway.” And taking advantage of automation, taking advantage of learning new skills like AI, is something that many in that space have been looking at for a whole number of years. You and I have a lot of background in that community and have heard that message for at least the last five years, Paul. So that’s where I would say, today, the huge service opportunity is that things like migration aren’t simple, and I need skills, hands, and places where I can actually engage with the customers, and we need bodies to be able to help with that.

Paul Nashawaty: Yeah. I want to touch on a couple of things you mentioned. One, I got into the InstructLab and actually, fingers on keyboard, so I actually did it, which was kind of cool and I really enjoyed it.

Stuart Miniman: Wait, wait, analyst, you’re allowed to touch that stuff, Paul? I thought we just talked about it.

Paul Nashawaty: I flipped my badge around and I went in. Then I grabbed your badge and I went in. Anyways, we kind of went in and we did it, but it was incognito. But it was fun. It was fun to go in and see hands-on what the user experience looks like and how you reduce complexity. And it’s important, because what we see in our research is that 67% of organizations are actually looking to hire generalists over specialists. So that was one of the reasons why I kind of went in and I said, “Well, okay, I will consider myself a generalist. I’m technical by nature and I kind of can go in and see if I can figure some stuff out.” And I really could. It was kind of easy to use. Now, granted, I think my skill is kind of far below what a lot of these organizations have when it comes to developers, but it was definitely interesting to see how you were doing the deployments and reducing the complexity.

With that said, I also think of… You mentioned you need the hands, right? And you need the bodies to put at the projects. And when I think of that, and I think of organizations, where they’re going, I think of Stonehenge and I think of these big huge rocks, and I have no idea how, but somehow those rocks were put up on top of other rocks. Right? They were pushed up there. And I’m thinking, if you put as many hands as you can underneath one of those rocks, I still bet that you couldn’t lift that rock. So when I think about where we are in AI, in application development and such, I think about that example. The more hands you throw at projects, yeah, that’s important, but it’s also equally important to look at the tool and the tech stack in order to achieve your goals.

We’re seeing that applications are being developed far more rapidly. More applications are being developed than just a few years ago. So throwing more hands at it is going to be one approach. Another approach is to use automation. And when we look at Lightspeed and how it helps, with announcements like bringing Lightspeed into OpenShift, expanding on what was already part of Ansible and the automation piece there. Can you talk a little bit about that announcement?

Stuart Miniman: Yep, absolutely. Right. If we look at AI, there are a couple of things that everyone’s doing. Number one is where are we with models, some of the deliverables to help customers use AI? And then the second piece is how is every product being infused with AI? So I don’t know about you, Paul, but every time I do an update on my phone, I feel like the app is like, “Hey, I’ve got this new AI thing that’s going to help me along.” And some of them are useful. Some of them have a lot of hallucinations and are a little bit weird, but it’s a journey we’ve actually been on for a while. You mentioned we started with what is now called Ansible Lightspeed. It was originally called Project Wisdom. We were working closely with IBM Research on that, and it was how do we help get greater productivity out of our people? And from an Ansible standpoint, it was let’s help generate playbooks. Through natural language, let me type something in and generate a playbook. It really has gotten to the point now where we can almost automate that full process.
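To make the natural-language-to-playbook idea concrete, here is a minimal sketch of the kind of result that workflow produces. The prompt, inventory group, and tasks are invented for illustration, not taken from Red Hat’s materials; only standard `ansible.builtin` modules are used:

```yaml
---
# Hypothetical prompt typed into Ansible Lightspeed (natural language):
#   "Install nginx on the web servers and make sure it starts on boot."
# A generated playbook would look something like this:
- name: Install and enable nginx on web servers
  hosts: webservers        # hypothetical inventory group
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure nginx is running and enabled at boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

The productivity gain Stu describes is exactly this step: the operator writes the intent in plain English, and the tooling fills in the module names, parameters, and idempotent state handling.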

From an OpenShift standpoint, it’s a little bit different. So when we looked at this and said, “Okay, how do we infuse AI into what we’re doing,” really what it is is a chatbot. It’s something that would be in our portal, where you go in and you should be able to ask it questions. There’s general knowledge there, but I should actually be able to have a trained model based on my environment. So the case study that we showed in the keynote was an insurance company, and I should be able to build some things, and the Lightspeed piece can help accelerate what I’m doing with OpenShift. For example, if I need to worry about scaling, it can give me a couple of options there, point me to the right tools, and help me take advantage of the proper operators. So it’s absolutely something we’re rolling out.

The OpenShift Lightspeed is something that… I believe we call it… It’s like an alpha at this point, so customers can get it. We really want their feedback, and it’s something that over time will be baked into the product. What’s nice is that these Lightspeed products really, I should say, are features. They are not something that we’re necessarily looking to upcharge for in the market, which I’m sure is something you look at, and something we all look at is, “Okay, wait, is this something that’s just going to improve my overall experience and make my product stickier, or is it something where we’re looking to increase the price and, per seat, I’m going to charge an extra bit for whatever that extra piece is?” So that’s where we have products like OpenShift AI and now RHEL AI, that are products. And then we have the pieces that are going to enhance our offering, like Lightspeed, which are just part of the overall portfolio itself.

Paul Nashawaty: No. There’s definitely a lot there to unpack. There’s definitely a lot to talk about, and unfortunately, we don’t have a whole lot more time. But if somebody were to get started… This is a DevOps series, and the people in the DevOps audience who view these sessions are trying to understand these different tech stacks and what they’re doing. How would they get started?

Stuart Miniman: Yeah. Of course, the biggest thing is where are you in your journey and which products are you using? The one that I’d recommend, which I thought was one of the coolest things we talked about, is InstructLab. The great thing about InstructLab is it’s up on GitHub. It’s also up on Hugging Face. I actually had a team meeting down at our headquarters in Raleigh, and our whole team spent a few hours on it. Paul, I got my hands on it, downloaded it on my laptop, and did some work. It does some synthetic generation, so I created five or six queries in it and added a markdown doc on top of it. It was all done in YAML, and then it generated a hundred other things to help train that model. So it’s something that everybody… A developer can be like, “Hey, I want to understand what this does.” I can actually go play with it.
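For readers curious what those “five or six queries” in YAML look like, here is a hedged sketch of an InstructLab seed file. The field layout follows the publicly documented `qna.yaml` taxonomy format; the contributor name, topic, and question/answer pairs are invented for illustration:

```yaml
# Hypothetical qna.yaml for an InstructLab contribution.
# A handful of seed question/answer pairs like these are what the
# synthetic data generation step expands into many more training
# examples for tuning the model.
version: 2
task_description: Answer basic questions about container orchestration.
created_by: example-contributor        # placeholder contributor handle
seed_examples:
  - question: What is a Kubernetes operator?
    answer: >
      An operator is a controller that encodes operational knowledge,
      managing an application's lifecycle through custom resources.
  - question: How does OpenShift relate to Kubernetes?
    answer: >
      OpenShift is Red Hat's enterprise Kubernetes distribution, adding
      developer tooling, security defaults, and lifecycle management.
```

The “hundred other things” Stu mentions are the synthetic variations the tool generates from seeds like these, which is what makes a laptop-sized contribution useful for model training.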

Usually, if you want to get into the code, InstructLab’s a great place to do it. If you are an Ansible or an OpenShift customer, you can go look at those Lightspeed pieces. And of course, from Red Hat, you’re going to expect that what we’re doing is all open source. So I’d also love the feedback from the community, because there are lots of questions as to what gets open sourced, what about my data, what am I going to contribute. We know that today open source is the default development model for software, and we want to have a robust discussion with the community as to how that fits into the entire AI world. It’s not just what we’re doing with the Granite models. You’ve got Mistral and Llama and others out there that can work with this tooling, and we’re working closely with our customers on that in the open source space, as well as, of course, our cloud providers, who have some more proprietary models that lots of customers are also playing with.

Paul Nashawaty: Stu, it’s always a pleasure to have you on. Always a pleasure to talk to you. It’s very insightful. Thanks for your perspective and insights. The show was great. I enjoyed it. I agree with you. InstructLab is a fun place to get started, but it also has real, solid practicality to it as well, which is awesome. But with all the other announcements, there’s so much to talk about. For the audience, I want to thank you for watching our session today, but also, you can learn more about our coverage of Red Hat Summit at thefuturumgroup.com. We have a full research note highlighting all the details of the announcements, as well as how it ties back to our research study. We can show that what Red Hat is doing and where the market is going are in alignment, so it’s a good summary of the event. So I want to thank you for your time, and have a great day.

Stuart Miniman: Thank you for having me on, Paul.

Paul Nashawaty: Thank you.

Other Insights from The Futurum Group:

Application Development and Modernization

The Evolving Role of Developers in the AI Revolution

Highlights from Red Hat Summit 2024: Expanding on Innovation – The Futurum Group

Author Information

Paul Nashawaty

At The Futurum Group, Paul Nashawaty, Practice Leader and Lead Principal Analyst, specializes in application modernization across build, release and operations. With a wealth of expertise in digital transformation initiatives spanning front-end and back-end systems, he also possesses comprehensive knowledge of the underlying infrastructure ecosystem crucial for supporting modernization endeavors. With over 25 years of experience, Paul has a proven track record in implementing effective go-to-market strategies, including the identification of new market channels, the growth and cultivation of partner ecosystems, and the successful execution of strategic plans resulting in positive business outcomes for his clients.
