On this episode of the Futurum Tech Webcast – Interview Series, host Daniel Newman welcomes HPE’s Bryan Thompson, GreenLake Product Management, and Dave Shore, North American Sales Director, Edge to Cloud Services, for a conversation on how the evolution of AI and hybrid cloud impacts IT investments and why private cloud belongs in every IT leader’s cloud strategy. The public cloud operating model has set a new standard for automation, self-service, scalability, and pay-per-use, and that model is now being replicated on-premises. With so many choices, how do you find the right cloud operating model to fit your economics?
Their discussion covers:
- The current macroeconomic environment and its impact on IT strategy and spending
- What to look for in a hybrid cloud operating model, both on- and off-prem
- How to ensure your private cloud is modern and optimized for a data-first, AI strategy
- Advice for IT Leaders as they look for effective ways to leverage AI
Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.
Disclaimer: The Futurum Tech Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.
Transcript:
Daniel Newman: Hi everyone. Welcome to this Futurum Roundtable. I’m your host today, Daniel Newman, CEO of The Futurum Group and Chief Analyst at Futurum Research. Excited for the Roundtable today. I’m going to be joined by two really big thinkers from HPE, Dave Shore and Bryan Thompson. We’re going to be talking a lot about what’s going on from a very broad technology perspective, with a bit on AI and the macro economy and its impact on IT. And then we’re going to talk a little bit about what CIOs are thinking about, what tech leaders are doing, and how they’re planning their investments in technology in 2023 and beyond.
Before I bring my guests into the show, I do want to set a little context here. We are in the middle of 2023 and if anybody hasn’t heard about what’s going on with ChatGPT, generative AI, transformer models and the impact that they’re likely going to have on the economy, you probably haven’t turned your computer on or haven’t watched any TV or haven’t looked at social media. The truth is this is happening really fast and we’re hearing from the new modern disruptive tech companies and we’re hearing from very well established companies that they all plan to make AI part of their strategy moving forward. We’re going to talk more about that.
And also the economy. We’ve seen a number of interesting inflections this year. Interest rates have been spiking at an unbelievable rate as the Fed has tried to get inflation under control. And don’t mistake for a minute that this doesn’t impact how technology is being invested in by companies. We know that technology has deflationary aspects and technology also has cost centers and companies are looking at the strategic value including AI, but so much more, whether it’s their cloud operating model, where to put workloads, how to invest to make the most of their data. These are things that are on the minds of CIOs and that’s going to be the final thing we’re going to talk about is what are the CIOs, the IT technology leaders all the way up to the boards doing when it comes to strategy around the investments in IT and the way they’re applying it to make sure their businesses stay resilient and in many cases stay in front of these technology trends that are moving so fast?
So without further ado, Bryan, Dave, welcome to the Futurum Roundtable.
Dave Shore: Thanks.
Daniel Newman: So Dave, I’m going to start out with you. First of all, I set the tone there. I didn’t ask you what the trends are, but with you guys talking to customers every day, just tell me a little bit, how did these trends that I brought up resonate with the conversations you’re having with customers?
Dave Shore: Yes, we are seeing AI move almost overnight from a curiosity to an imperative. Virtually every customer I talk to is trying to figure out how to get ahead of, instead of behind, the inevitable disruption that AI will bring to their industry. Last year, by contrast, almost every customer we talked to had small back-burner AI initiatives in some early pilot stage, which shows how fast this has moved. Now those have become front-burner priorities. The change is happening so quickly, faster than perhaps any trend I’ve seen in my career.
Daniel Newman: Dave, I’m so glad you said that. I am hearing and seeing the exact same thing. I’m talking to hundreds of companies every month, CEOs of dozens of Fortune companies. Even just this earnings cycle, the talk went from being focused on parts of digital transformation to getting very specific. The speed with which OpenAI was embedded into products at companies like Microsoft has caused every company to take pause and say, “One, we’re not putting this genie back in the bottle. This is not going to go backwards. And two, we need to figure out both internally and externally how this trend is going to impact our business. So first, how do we run our business more efficiently and upskill our workforce? And second, how do we build these kinds of technologies into our products and services? Especially in the tech industry, you’re going to have to do this for yourselves, and then you’re going to have to build it into products that are going to be verticalized across your industries.”
So Bryan, let’s talk about that. I alluded to it here, but AI is coming in hot. I think that’s a good way to put it. What are the challenges that you’re seeing and identifying for enterprises to be able to capitalize on this quickly?
Bryan Thompson: Much like Dave mentioned, you’re seeing this overnight change, which is really, how do I leverage these types of technologies to gain insights from my data to drive decision-making? And in our context, what I’m really seeing is, how do I actually do that decision-making where I’m confronting that data? It’s the whole concept of driving actions, decisions, and analysis to the edge.
For example, we’re working with a retailer on how to enable real-time data analytics and AI analysis of video feeds across their stores to cut down on shoplifting or theft. I’m not going to ship all of that video back to the public cloud or back to a central data center, try to process it, and then send actions back out. It’s about bringing those resources, that type of learning and automation, to those edge locations to drive real-time insight, analysis, and action, whether it’s in manufacturing or any number of different verticals where we’re seeing it.
Dave Shore: Bryan, I want to comment on something you just said. As quickly as AI came on the scene, moving all data to a central cloud or central data center location, which has been the dominant trend for decades, suddenly seems dated. Along with it goes the concept we have in IT that every piece of data just gets saved somewhere because storage is cheap, so why not? When you’re collecting terabytes of data per day, those concepts just don’t work anymore.
Bryan Thompson: Yeah, I think you’re exactly right. The comment that has been so true for so long is that data has gravity. For certain systems, that makes absolute sense, but more and more we’re pushing out that real-time analysis. Think about the internet of things and the amount of data being thrown off. Where do I ingest? Where do I collect? What do I act on? Then how do I quickly decide what I need to keep or not? We are producing so much of it, and where we can, we provide real-time automation and insights on it. It changes the way a lot of businesses are trying to approach this problem.
Daniel Newman: Yeah, I’ve actually identified this trend for some time. It’s been evident to me that, first of all, hybrid cloud was going to win. So I’ve been pretty bullish for a long time about what HPE and your team are doing, and I want to hear a little bit more about it, Bryan, with you being on the product management side. But I do believe the cost of moving all the data is prohibitive. There are companies like Snowflake, and everybody thinks they’re dominating the data, but realistically they have less than 1% of all the data under management. It’s really the more traditional data management players, the Teradatas of the world. These are the companies where the data still lives. It’s actually not living in the public cloud, and it won’t, because it’s too expensive to do that at scale.
Now, certain workloads, certain training, certain inference will move. This is the same trend I’m seeing with public cloud. Yes, it will grow, because the overall amount of compute will grow. This generative AI wave is going to drive a massive amount of compute into the market, but it’s not all going to be in the public cloud. A lot of it is going to be on-prem, because training is going to be super expensive. There’s so much to figure out about how to deploy these large language models and do it effectively, not to mention the edge, where more data lives than even in the data centers. So these are all going to be really big opportunities.
Bryan, I’d love to get your take, though. You’re driving product for GreenLake. We’re talking about how early this still is. You knew this was going to happen, but the onset of this and the pace in the last six months, how is it shaping your decision-making about product development?
Bryan Thompson: Yeah, what’s interesting is that it obviously creates opportunity as we look to help enterprises solve these problems. How do they embrace these technologies and solve their own data problems: where am I ingesting data, how do I control the movement of that data, how do I provide access to it for the systems that need it? How do I bring things like AI and machine learning to that edge to act on the data as part of a more efficient use of it? But even within our own services, how are we leveraging AI and learning technologies to provide better solutions and services?
If we think about the GreenLake portfolio, providing that spectrum of different services, solutions, and underlying infrastructure, there’s a ton of telemetry data that we pull back from it. How do we leverage that to provide automation and insights that help customers optimize workloads or think about workload placement? To your point before, and I think we’re solidly in that camp as well, hybrid cloud is the way to go. How do I think about smart, logical placement of the right workload in the right location, and leverage that connectivity and visibility into data and insights across the broader estate? Doing that with automation in real time is the only way anybody can be successful. So we really look at how we leverage these technologies to provide better and more effective solutions to our customers.
Daniel Newman: Yeah, I think you’re absolutely right, but you heard me in the setup, Dave. I talked a little bit about the economy, the macro. There is a consensus that IT budgets are going to be more protected because of the potential productivity gains and the deflationary aspect, but at the same time, companies across the board have seen budgets slashed and, of course, greater scrutiny. They’re spreading out their time to investment. These things are all happening concurrently. So how are your customers, Dave, transforming quickly while being conscious of these more stringent budget constraints?
Dave Shore: So one of the things to think about, and Bryan said it before, is that creating and training a new model is expensive. So are the data scientists, and they’re hard to find, expensive to hire, and difficult to retain. One of the things our customers are asking is: could we please make sure the data scientists can spend their time doing data science, as opposed to putting together the Tinkertoys of AI? We’re in the Wild West with AI. There are so many tools, and figuring out how to integrate them and make them all work together so you can create your learning model is very complex.
And so one of the things HPE is asked to do, and we’re delivering for customers, is taking those integration and administrative tasks off the hands of the data scientists so they can spend their time on what they’re good at. Not only does that make them more affordable, it makes them want to stay in their jobs, because that job satisfaction is in many cases more important than the pay in terms of them sticking around, building and retaining those models that our customers depend on to accelerate.
Daniel Newman: Yeah. Dave, we’re going to have another conversation at some point about this. The last five years, it’s been all about the developer and the data scientist. I do wonder how this is going to start to rotate. We used to need to know how to build a model, and now I wonder whether open source and model efficacy are going to be built into these large language models that will generate code and improve themselves. It is going to happen, but I think your points are absolutely right that right now, those are key roles that companies need, and it’s going to be all about taking these open source models and open internet data and combining them with the wonderful wealth of enterprise-unique data that companies have.
Bryan, I talked at the beginning about both the economics and what CIOs and decision makers are weighing, and one of the things I’ve said is that the mix of offerings in the hybrid consumption space has largely been at the very base level, meaning storage and compute. That’s what most have come out and offered, and that’s where most of the revenue dollars have been. But one of the things I’ve said is that the growth of the HPE portfolio has been impressive comparatively.
Now, the big public cloud providers have massive portfolios of services. Most of what I’d call the on-prem clouds offer two or three services, and I think that’s really been a bit of a challenge for those businesses. I think GreenLake is far ahead, with a much broader services base. How is that playing into winning in this current economic situation, where you have data services, security services, edge services? You’re able to offer that broader portfolio, really making the hybrid opportunity more congruent. Is that the winning formula? Does it stay the winning formula, Bryan?
Bryan Thompson: No, I think you’re hitting a really key point, and it builds on the comments Dave made. If you think about the constant pressure on IT organizations, I need to do more with less. That challenge never goes away. Nobody’s getting more budget. And it fits into that same thing: as I look at solving those problems, whether at the edge, in my data center, or in a colo, I’m looking for that same experience. How do I consume that cloud-like experience where I need it to be? It goes back to letting my data scientists be data scientists. We’re seeing the same trend, and coming out of the pandemic, this was interesting. We were working with some companies where, for example, the CTO of an airport had laid off 50% of their staff during the pandemic, when travel had all but stopped. Now the pandemic’s over and travel has resumed to normal. How do I attract or retain that staff and build back up to those same levels?
And it was that eye-opening moment: it’s more valuable for him to have his staff working higher up in the data plane, on the applications and workloads, things that are strategic or serve their customers and businesses, not the care and feeding of infrastructure. So where they can find those cloud services and that broader portfolio, not just the raw building blocks, how do I go beyond those cloud primitives and enable higher-function, higher-value services in that consumption cloud model? Then I can take that scarce workforce and move it up into areas that are more valuable or strategic to the company. I think this is that interesting trend: why would I build it if I can buy it as a service and consume it where I need it to be?
Dave Shore: I’d like to add an example to illustrate what Bryan’s talking about. We have a customer in the hospital imaging business. We were able to help them create an AI solution that literally allows the patient to have less radiation exposure, because the AI system is enhancing the quality of the image with less X-ray while still giving the doctor better imaging so they can diagnose and recommend a treatment plan. We built the entire pipeline for that AI model. We didn’t do the AI itself. We brought them the end-to-end tool set of software and infrastructure and provided all of it in an as-a-service model.
So they had that cloud experience, but with dedicated infrastructure, and it worked for them. They are now considering applying those models in other industries where they can enhance the quality of a radiographic image, in oil and gas and elsewhere even outside of healthcare.
Daniel Newman: Yeah, I love that you mentioned the industry angle. I do think the shift we’ve seen over the last few years has been a bit more of what I call a vertical facade on a lot of cloud or tech marketing. But I think we’re going to move from facade to very applicable, use-case-driven business units, especially in highly regulated industries, but really across the board. As companies try to actually implement this tech and keep up with the pace, it’s going to be on us in the tech industry to be very prescriptive and make this stuff very accessible. The way developers are able to use GitHub to start their code foundations is the way technology is going to need to build cloud-first, industry-driven solutions. And it can be hybrid, it can be private, I’m not saying all public, to be very clear. But cloud-first solutions for the customer who says, “We’re financial services. What’s different about what you’re offering us versus healthcare versus manufacturing versus retail?” We’ve sold the story, but now we actually need to get better at making sure these solutions are truly differentiated.
Dave, you’ve been with HPE for some time, probably since back when it was HP. So talk a little bit about that pendulum: from cloud first, to suddenly the world was going to go cloud only, and now we’re seeing the pendulum swing back, and nothing validates it more than what you’re doing. Even the public cloud providers have all made big leans into hybrid architecture. So they get it, you get it, everybody gets it, we get it. But private cloud often gets the least attention. Everybody talks about hybrid public, so what about HPE’s approach to private cloud? Is that something enterprises are understanding and seeing value in, in the conversations you’re having with customers?
Dave Shore: So yeah, it’s often an eye-opener when we talk to customers about what we can do in that space. Private cloud is nothing new, right? It’s been around for at least a decade, but it had some attributes that we’ve changed. Private cloud used to be very complex to set up and run. It required your own company’s sys admins, your own company’s data center, and physical assets that you own. So it might have been a nice piece of software, but it wasn’t giving the cloud experience that our customers want.
So what we’ve done with private cloud is turn it on its head. We’ve said, “We’re going to give you everything you want in the cloud experience, but bring it to you wherever you want it to be, so you don’t have to own the assets under GreenLake, and you pay for how much you use.” There’s always extra capacity, so you have virtually unlimited capacity that will grow as you need. We have a cloud appliance that Bryan helped invent for the company, so hats off to you, Bryan. That appliance means we’ve already done the work of creating a cloud experience before you touch the keyboard.
And it doesn’t have to be in your data center. It can be in a colo we have relationships with, some of the biggest in the world, or it can even be at your edge if you need to extend the cloud experience to your store, your financial branch, or your hospital location. So cloud can be anywhere, and without the need to own the assets, own a data center, or do sys admin, we’re truly providing the cloud experience.
One last thing: we can manage the public cloud from our cloud. So if you want to launch virtual machines or containers into your favorite hyperscaler, we can do that from our cloud as well, which means we can provide the control plane to run your enterprise in a multi-cloud world. This is very different from the cloud of 10 years ago.
Daniel Newman: Yeah, you brought up some good points there, Dave. We didn’t talk a lot about multi-cloud, but let’s not for a minute mistake the fact that it’s going to be a big trend. And multi-cloud isn’t just multiple public clouds. We really see it as a combination of your edges, your telco clouds, obviously your on-prem and private data centers, as well as the use of the public cloud. Building structures to manage that is going to be very important, because there’s so much you can benefit from being able to encapsulate and utilize every service that’s out there from every provider, but you do want to simplify the control. And so what you just mentioned, the ability to simplify the control, becomes a very compelling item for CIOs and IT leaders who are trying to get these very complex architectures under wraps so they can focus on building the apps and deploying the experience, not on how to make hardware work.
So Bryan, we hit on a lot here today. We started big and we’ve gotten narrower. Hopefully all the IT leaders out there are listening to this. I think everybody can relate to how these broad technology trends are shaping, and have markedly changed, the strategies of IT leaders right now. Any final advice? Any thoughts for our audience here today at this Roundtable?
Bryan Thompson: Yeah, I think we’ve touched on it. Think of the emergence of these technologies and how quickly they are changing the way people do business, reinventing new ways of working and creating new problems to solve for. Where we’ve been out in front in some ways, and in other ways just listening to customers to see what problems we need to solve, I look at the same thing thematically. We see hybrid as the route. And to your point, which I think is spot on, it’s: how am I managing my cloud estate across private, public, and edge locations?
And for us, we’ve focused on leveraging things like open standards and consistency, so that you can operate in the data plane with the common tooling you want to use and enable your users with, but still leverage that power, whether it’s in an edge location processing data in real time, in my core data center, or with public cloud workloads. That type of visibility and interoperability is, I think, at the core of solving these challenges and being able to react to the emerging technologies and themes as they come, and that’s where we’ve found great resonance with our customers in jointly solving those problems together.
Daniel Newman: Well, Bryan and Dave, I want to thank you so much for joining this Futurum Roundtable. It’s a great partnership conversation here with HPE. Love getting the takes from the field and from the product development side, and of course, Dave, from you being out there in front of the customers. It’s always important to put those things together when it comes to sharing our experiences and helping other companies, and all of you out there listening, get a better sense of how your peers, your competitors, and the industry are approaching these big challenges that also create tremendous opportunities. So thanks, guys.
Bryan Thompson: Thanks for having us.
Dave Shore: Thank you.
Daniel Newman: All right, everybody, thank you so much for tuning in. Check out all the show notes related to this Roundtable. If you do, there will be links where you can learn more about what HPE is doing and more about Futurum’s take on all these trends, from AI to the economy to cloud to CIO decision-making. For this Roundtable, though, it’s time to say goodbye. Thank you all for tuning in.
Author Information
Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.
From the leading edge of AI to global technology policy, Daniel makes the connections between business, people, and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances on CNBC and Bloomberg, in the Wall Street Journal, and on hundreds of other sites around the world.
A 7x best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.
An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.