Curious about the future of mainframes? Six Five Media hosts Steven Dickens and Mike Vizard chat with BMC's Anthony DiStauro, AI Evangelist & Senior Product Development Architect, and Dave Jeffries, Vice President of R&D for ZSolutions, at BMC Connect. Discover how generative AI unlocks strategic insights, enhances security, and can shape the future of this critical technology.
Their discussion covers:
- The current state and evolution of mainframe DevOps and AIOps
- How generative AI services are revolutionizing mainframe operations
- The role of strategic insights in enhancing DevOps and AIOps practices
- The impact of AI on mainframe security and compliance
- Future trends and predictions for mainframe technology
Learn more at BMC Software.
Watch the video below from Six Five Media at BMC Connect, and be sure to subscribe to our YouTube channel so you never miss an episode.
Or listen to the audio here:
Disclaimer: Six Five Media at BMC Connect is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.
Transcript:
Steven Dickens: Hello and welcome. I’m Steven Dickens, and I’m joined by my host Mike Vizard. We are coming to you from BMC Connect and this is Six Five on the road. We’re joined by Anthony and my dear friend, Dave Jeffries. Hey guys, welcome to the show. So let’s get started. Tell the viewers and listeners a little bit about what you do for BMC.
Dave Jeffries: So I’m responsible for really all the aspects of R&D, whether it’s the products that we’ve had for a number of years or whether it’s the latest, like AMI Platform and AMI Assistant. It’s a fantastic place to be. So we just drive innovation, and that’s my team worldwide that does that.
Steven Dickens: That’s not a bad way to describe your job, Dave: driving innovation. Anthony, you?
Anthony DiStauro: So I’m the architect for AMI Platform and AMI Assistant, and an AI evangelist. I go out and talk to many customers about what we’re doing.
Steven Dickens: Is that a new cool job title for this year?
Anthony DiStauro: Yes, that just appended right onto my title.
Steven Dickens: Fantastic. So you took us there, guys, in your introductions with generative AI. We’re here at Connect this week, just come off the keynotes, lots of focus on generative AI assistants. Where are you seeing that fit within the DevOps and that sort of space? I’ll go to you first, Dave.
Dave Jeffries: Right. We’ve made some fantastic announcements today. And if you haven’t seen them, it’s all about AMI Assistant, AMI Platform and bringing generative AI to where we think it’s needed the most. And obviously a lot of people talk about skills gaps and skills attrition on the mainframe. But you have to think, “Why is that an issue? Why is the skills challenge an issue on the mainframe?” Because surely it’s going the way of the dodo or something like that. Well, in reality, it’s completely the opposite. The mainframe has got an entirely new lease on life.
And the lack of those skills is probably an inhibitor to transformation. People want the mainframe to do some new cool stuff because it’s got fantastic new technology in there. And we think generative AI is providing really that key to unlock what applications do, therefore reducing the risk of changing those applications, what your systems do, how your systems need to operate, et cetera. And it’s allowing you to unleash transformation, which is bringing a whole new realm of possibilities to the platform, and how the platform can support business.
Steven Dickens: Bringing new people into the platform.
Dave Jeffries: Yeah.
Steven Dickens: Enabling them to get started faster?
Dave Jeffries: Yeah, and I think it’s also helping the guys and gals that are already there to innovate. Because sometimes you might be the last one there and you might be struggling in terms of the scale of the challenge in front of you. You need some help to go do it. And I think it’s not just unleashing the next generation of talent, it’s unleashing the talent that already exists, which is important.
Mike Vizard: You touched the customers a lot.
Anthony DiStauro: Yes.
Mike Vizard: What are some of the examples that people are actually using here? Because I think we talk a lot of theory with AI, but what are we actually seeing? Where is the manual effort and the scut work disappearing?
Anthony DiStauro: Yeah, so that’s a really good, interesting question. For different customers, it means different things. For a lot of them, it’s capturing that tribal knowledge. Let’s just start there. Before they even go down the road of AI or generative AI, they have to take that step back and see what it means to their business. And if you just look at some of the tooling that’s out there today when it comes to generative AI, it’s very agnostic. It really doesn’t mean anything specifically to a customer and their wants and their needs.
So the first thing that we had to do is take a step back and say, “Well, how do we infuse our generative AI with the knowledge from a customer’s environment? First of all, to make that AI relevant to them, to address their wants, their needs, their business direction?” I think you used the word “scut work” when it comes to that. Well, how do we make it a reality? How do we make it relevant for the customers? So we built our platform of generative AI services in a way that it’s open. And customers can infuse it with the knowledge that they need, then start applying it to things in the DevOps space, in the AIOps space, specifically around what they’re trying to get out of improvements in their business with generative AI, making it relevant and in context for them.
Mike Vizard: So I can customize it and it meets me where I am, versus me being forced to do something.
Anthony DiStauro: That’s right. So you don’t want to ever leave your experience or your environment or your tooling to jump out into another platform or another tooling, because that’s where you lose what we call context and the relevance of what it means to you. We like to say, “We meet the customers where they are in our product experiences with that context, with the understanding, and you get far more out of generative AI and much better results that are, again, relevant for you and your business as you move forward.”
Steven Dickens: So Dave, key word there from Anthony was “specific,” making this specific for the particular shop, the operators, the developers.
Dave Jeffries: Yeah.
Steven Dickens: Can you just kind of double click on that as a phrase, contextualize it? What am I going to be actually doing with AMI Assistant or some of the AIOps stuff, and how is that going to work?
Dave Jeffries: I think one of the key aspects of all this is, everybody has their own approach to generative AI, and there’s a plethora of LLMs, large language models, out there. We’re starting to see some language models being really good at certain things in certain areas. Some are good at code, some are good at other things, et cetera. So as Anthony was talking about, one of our core traits that we bring to the platform and to our solution is allowing you to take the right LLM for the right use and then obviously infusing it with your own information.
Steven Dickens: So that’s how you’re getting that specificity.
Dave Jeffries: Oh, absolutely. And then you can apply it to the code world. We talk a lot about generative AI in code in terms of understanding code, but there’s more than just code that runs the business. There’s the infrastructure, there’s the configuration, there’s the environment. And so it’s not just applying generative AI to understanding what a COBOL application does or what an assembler application does, but what does the REXX do? What does the JCL do? What does the expert back at base do in terms of how they resolve a particular situation that may appear in operations?
Steven Dickens: Exactly.
Dave Jeffries: So, bringing that subject matter expertise to a wide variety of areas involves multiple pockets of tribal knowledge being pulled together, and multiple LLMs being able to be used for the right reason and the right purpose at the right time.
Mike Vizard: I think explaining code is great because a lot of folks, they didn’t document it in the first place, so they don’t really know how it works, but how do we go to the next level? Because I think what we’re moving now towards is real-time insights that are going to be surfaced as I’m trying to perform a task. So, we’re on this journey, but it seems like there’s multiple phases. What are they?
Anthony DiStauro: There are. So the way we are looking at the spectrum right now, we started off in the area of what we all started to experience with ChatGPT, right? It was the chat experience. So we go out there and dump questions over: “how to,” “what is,” “how can I” type questions to ChatGPT. That’s how we all started. So we also started that way, with infusion within our products where it made sense. But now we’re looking at the spectrum much more widely, far beyond just the chat experience, moving towards something called agentic AI, or AI agents. And what we want to do there is really move towards more autonomy with our generative AI solutions, but also hyper-focus in our particular product areas with AI agents that are super experienced, super knowledgeable, super capable within a given product area like AIOps or SecOps, to do things like automation.
Let’s just take automation, for example. We have a lot of mundane tasks that we deal with day in and day out. Even if you look at us individually, there’s a lot we can start looking at for generative AI in our own lives to simplify things, never mind in the business world. So we have these AI agents now that are focused in our product areas to allow customers to automate mundane tasks, to surface insights automatically. What do we really want to do here? We want to take the cognitive load off folks with generative AI, because we want them to focus on more important business issues. We want them to focus on innovation; they want to innovate just as much as we do. So how much could we help them with utilizing generative AI to push them along that journey?
Steven Dickens: Anthony, I think you said it well. We want them to focus on the good stuff.
Anthony DiStauro: Exactly.
Steven Dickens: Focus less on the operational stuff. What a great way to wrap. You’ve been watching us here on The Six Five, coming to you live from Connect with BMC. I’ve been your host, Steven Dickens, joined as always by my dear friend Mike Vizard. Please click and subscribe and check out the other episodes, and we’ll see you next time. Thank you very much for watching.
Author Information
Regarded as a luminary at the intersection of technology and business transformation, Steven Dickens is the Vice President and Practice Leader for Hybrid Cloud, Infrastructure, and Operations at The Futurum Group. With a distinguished track record as a Forbes contributor and a ranking among the Top 10 Analysts by ARInsights, Steven's unique vantage point enables him to chart the nexus between emergent technologies and disruptive innovation, offering unparalleled insights for global enterprises.
Steven's expertise spans a broad spectrum of technologies that drive modern enterprises. Notable among these are open source, hybrid cloud, mission-critical infrastructure, cryptocurrencies, blockchain, and FinTech innovation. His work is foundational in aligning the strategic imperatives of C-suite executives with the practical needs of end users and technology practitioners, serving as a catalyst for optimizing the return on technology investments.
Over the years, Steven has been an integral part of industry behemoths including Broadcom, Hewlett Packard Enterprise (HPE), and IBM. His exceptional ability to pioneer multi-hundred-million-dollar products and to lead global sales teams with revenues in the same echelon has consistently demonstrated his capability for high-impact leadership.
Steven serves as a thought leader in various technology consortiums. He was a founding board member and former Chairperson of the Open Mainframe Project, under the aegis of the Linux Foundation. His role as a Board Advisor continues to shape the advocacy for open source implementations of mainframe technologies.