On this episode of The Six Five – On The Road, hosts Daniel Newman and Patrick Moorhead welcome Cloudera’s Chief Strategy Officer Ahbas Ricky and AWS’s Director of Infrastructure Partnerships, Mona Chadha, at Cloudera Evolve NYC for a conversation on building generative AI solutions on Cloudera with Amazon Bedrock.
Their discussion covers:
- The challenges customers face in getting their generative AI projects off the ground
- Cloudera and AWS’s collaboration to offer AI solutions for customers, including Cloudera CDP and Amazon Bedrock
- Some customer success stories on building generative AI solutions on Cloudera on AWS
- What the evolution of Cloudera and AWS looks like, and where the companies see the generative AI movement going next
Be sure to subscribe to The Six Five Webcast, so you never miss an episode.
Disclaimer: The Six Five webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.
Transcript:
Patrick Moorhead: The Six Five is on the road in New York City for Cloudera Evolve 2023. Daniel Newman and I are talking about two of our favorite topics. First of all, generative AI. Dan, it seems like this is all we have been doing for the past nine months, but we’ve also been talking about the reality that, to get the best results, you need to have the best data. And this conversation we’re going to have next is really the combination of those two.
Daniel Newman: I was looking forward to knowing what number two was. I knew you were going to hit AI.
Patrick Moorhead: Which one is he going to hit? It’s AI and what?
Daniel Newman: What is it going to be? But I think we all know that the foundational work that has to be done by an enterprise to get value from their generative AI is substantial. It’s palpable, in fact. And we hear a lot about kind of these outcomes. You’re seeing companies with trillion dollar market caps building things that can do generative AI. But what about the rest? What about every other company? What does it take to get that done? And I feel like that’s a lot of what we’re talking about here at the Cloudera Evolve 2023 event in New York.
Patrick Moorhead: It is. And one other thing that we’ve been talking about a lot is partnership. To truly deliver the value of generative AI to the enterprise, one vendor cannot do it all. You need partners. It’s the reality, and I think it’s the new normal of technology, but particularly now with generative AI. And it’s probably a good time to introduce our partners. First of all, Mona from AWS. How are you doing?
Mona Chadha: Good. How are you guys?
Patrick Moorhead: Excellent. Ahbas, great to see you again. You are a veteran to The Six Five.
Ahbas Ricky: Yeah, great to be back. Thank you. We had a lot of fun last time we did this. We also had a lot of fun in Austin at Formula One.
Patrick Moorhead: It was great to see you. And you might see this guy in Vegas.
Mona Chadha: I want an invite next time.
Ahbas Ricky: Hopefully. You guys have the sponsors.
Patrick Moorhead: We did spend day one with AWS at F1, so I want to thank you. Hopefully I didn’t take your spot.
Ahbas Ricky: Thank you as well because-
Patrick Moorhead: Welcome to the show.
Ahbas Ricky: … I was there as an AWS guest. We had a great time.
Patrick Moorhead: Mona, this is not a good deal here. I was there, Ahbas.
Mona Chadha: I missed, I missed out. But as long as AWS was there and AWS was a sponsor, that’s great.
Patrick Moorhead: Good response.
Daniel Newman: And of course, AWS did some really great sessions talking about the relationship between F1 and data, which, by the way, is massive. When you watch it from afar you don’t realize it, and then you see all those race engineers in front of the screens and you realize it’s a crazy data-driven event. Anyways, we could go down a rabbit hole. That is not what we’re talking about here. We are talking about the challenges that your customers, both individually and jointly, are facing. So Ahbas and Mona, I’d love to start with you there. Talk to me a little bit about the big challenges. What are the big struggles, the hurdles? Can I come up with any more words?
Patrick Moorhead: I don’t know, objections, maybe. I don’t know.
Daniel Newman: Difficulties that the customers are identifying as it pertains to them getting their generative AI projects off the ground.
Ahbas Ricky: I think there are three core things. One, the majority of customers who are large enterprises in regulated industries want to be able to train their models on the data that they have, with the enterprise context that they have, because no one necessarily wants to go pick up something from the internet or even through an open API plugin. And therefore, they want to be able to train models on the systems where the data sits, but it’s super hard for them to do that at a price point of their choice. So in addition to enterprise context being number one, doing it at a sufficiently decent TCO is hard, and therefore customers are looking for ways to reduce their compute cost, and for ways to bring the cost of hardware acceleration down.
And thirdly, the biggest theme for us today, and we talked about it, is trust. Everyone wants to train their models on data they can trust. Governance is a key issue. I gave an example in the keynote this morning: there was a bank that asked me, “Hey Ahbas, what’s your bank balance?” The chatbot should be smart enough to tell what the bank balance is, but it should be smarter still and ask, “Should this person even have access or not?” So it’s an authorization, governance, and lineage question as well. So I’d say a combination of enterprise context, TCO for compute, and trust and governance are the three largest themes we see among enterprise customers. But what about you guys?
Mona Chadha: I think one of the things that we’ve seen at AWS is a lot of customers wanting to accelerate their migration to the cloud even more. And that’s across all different industries, whether it’s financial services, healthcare, life sciences, manufacturing, et cetera, you name it. Customers are now seeing that in order to get the TCO, that economic value, that flexibility, and that scalability, the cloud is really where they want to move to, and they’re trying to migrate fast. So a lot of what we’ve seen is that acceleration. The other area, though, is helping customers figure out that not everything can move to the cloud as quickly, and so figuring out the preparedness and which workloads should be moved to the cloud first. That’s where our AWS Partner Network comes in, to help accelerate that migration to the cloud.
The other thing that we’ve done is create a Cloud Adoption Framework to really assess the preparedness of these workloads for the cloud. And then we have programs like the Workload Migration Program and the Migration Acceleration Program that really help customers migrate over to the cloud quickly. I think the other thing that you mentioned was around infrastructure costs. One of the things we’ve seen, especially running large workloads like these generative AI workloads, is that you need the right compute and you need that infrastructure in place, but in a cost-effective way. So in order to run these generative AI types of applications, you really have to look at it in three major layers: your infrastructure, then what I would almost call LLM as a service, and then your applications. And one of the key areas customers have been moving to is the chips, getting that infrastructure, getting that processing power.
And so that’s why AWS has Trainium and Inferentia, two key chips that allow you to run these generative AI applications in the cloud faster, more efficiently, and in a cost-effective way. That’s what we’ve been seeing a lot of our customers working on. And then the other key thing is that generative AI can seem very complex, so it’s about making it accessible to the right stakeholders and embedding it into applications, embedding your LLMs or any sort of foundational models into applications in an easy way. That’s what Amazon Bedrock has delivered in that regard, and Cloudera is integrated with it. So having more partners integrate really makes LLMs more accessible.
Patrick Moorhead: In fact, both our companies have written not only about your CDP but also about Bedrock and everything you’ve done out there. And it really seems like a great partnership for two strategic reasons. First off, and this is not an opinion, this is fact: AWS runs more AI workloads in the cloud than any other IaaS provider. And Cloudera, even though it does have cross-cloud and operates everywhere, manages, at least by our estimates, more enterprise data than anybody else out there. And what I’m really interested in, I don’t know if you remember the Wonder Twins. Unite.
Mona Chadha: Wonder Twins. Activate.
Patrick Moorhead: There we go. So what are you doing together to get more folks to be able to activate their generative AI work? What are you doing together? And maybe Ahbas, we’ll start with you.
Ahbas Ricky: Yeah, absolutely. So I think you’re right. We have 25 exabytes of data under management-
Patrick Moorhead: By the way, just pause there. We’ve all seen big data numbers when you come to an AI or a data conference, but that is a phenomenal amount of data, and I’m really glad you published that about six months ago, because I don’t think people really understood just how much data that is under management. It is a mind-boggling amount of data. And still, quite frankly, we are 14 years into the public cloud and 75 to 90% of the data is still on-prem. So it’s amazing.
Ahbas Ricky: Yeah, thank you. And as I’ve said before, just to understand the magnitude of that, it’s a hundred times bigger than one of our competitors, who happens to have slightly more revenue than us. For a period of time, that wasn’t necessarily useful, because a lot of enterprise customers were struggling to make sense of unstructured data. But now, with large language models coming into the picture, they’re able to apply that enterprise context and get different outputs for different use cases. And some of the ways in which we’re working with Bedrock: Bedrock is a fully managed serverless offering, but it also has a set of models available, Amazon’s own Titan models, but also Cohere and Anthropic and everyone else. So if you’re a customer and you want to leverage the power of the models available in Bedrock through Cloudera Machine Learning, which happens to sit on CDP, you can take the data sitting in the Cloudera open data lakehouse, take that enterprise context, and do it in a very easy-to-use fashion. And that is one of the primary ways in which we’re starting to go to market. That’s one.
But then on top of that, we also have our SQL assistant that is powered by Amazon Bedrock, and that’s one of the most common use cases we see for application developers. I say this often: because of large language models, a lot of application developers now don’t need to become data scientists, because back in the day, every time you trained a model or retrained it or changed its attributes, you would have to go to a data scientist. Now you can apply the enterprise context, continue to be an application developer, and get the outcome. And that is one of the key use cases we’re starting to see enterprises run with us and Bedrock. One last point: Mona mentioned Inferentia. We have Applied Machine Learning Prototypes, an entire library of use cases, think of them as blueprints that allow you to get started earlier. We are starting to test those AMPs on Inferentia too, in addition to some of our other partners as well. And that’s the hardware acceleration, TCO-of-compute play. So that’s purely on the generative AI side. And obviously, as far as the public cloud business is concerned, we have our partnerships, whether they be S3 or EC2, and those stay as strong as ever.
Mona Chadha: Perfect. Well said. And not much more to expand on. I think you really covered the gamut of our technology partnership and the integrations that Cloudera has with AWS. It spans from S3 to EBS to Bedrock, and hopefully more to come, and then also starting to use Inferentia and the other chips that we have in order to run those large language models or any sort of foundational models. I think another key thing that we’ve been doing with our partnership is on the business side. So we have all these great technical integrations, but how do we now go to market and make them available to our end customers? Recently, Cloudera signed a Strategic Collaboration Agreement, and what that does is create transformational initiatives that our end customers benefit from.
And so you’ll start to see more of that, providing our customers with those integrated solutions that let them run their specific large language models or any sort of foundational models according to their specific use case. And so we’re excited as we saw a bunch of customers here today. Like Illumina for one, it’s exciting to see how these customers are using our joint solutions together. So you’ll start to see more and more of that and just as our partnership evolves, I think we have such a great opportunity together, and also with some of the other partners that are here today to really bring those solutions across the different industries to our end customers.
Patrick Moorhead: Wonder Twin powers unite.
Mona Chadha: We do.
Patrick Moorhead: There we go.
Mona Chadha: It is, and there are two examples that I did want to share. There’s Terre des Hommes, an organization in Switzerland. What they have done is create an epidemic alert system specifically for children. They started with malaria, but it’s on Cloudera, Cloudera on AWS, and they’re really starting to build some of those solutions. Then there’s Be The Match, which is more of a donor matching system. There are 41 million donors, and now within seconds, I think it’s 35 seconds, they’re able to do a match. So it’s like if you think-
Patrick Moorhead: And again this is Cloudera and AWS?
Mona Chadha: Cloudera and AWS.
Patrick Moorhead: Okay. So these tools would be GA?
Mona Chadha: Yes. Well these are already customers.
Ahbas Ricky: These are already production customers.
Patrick Moorhead: I got you.
Mona Chadha: So customers are already-
Patrick Moorhead: That’s impressive.
Mona Chadha: … using Cloudera CDP on AWS. And what’s also cool is that these are real solutions that we’re building, and we’re making them available to customers through AWS Marketplace. Be The Match actually purchased Cloudera on AWS Marketplace. And that’s ultimately what we want: to get technology into the hands of customers when they need it, in almost real time.
Daniel Newman: I think this is a very good example, though, of the pace of innovation as well. Now, I know with AWS there have been some pretty inaccurate depictions of AWS being late or behind because you didn’t join the party nine months ago for some rapid-onset LLM announcement. And by the way, both of us, in varying ways, have debunked that myth. First of all-
Patrick Moorhead: And you characterize it as a marathon as opposed to a sprint.
Daniel Newman: You’ve been at it a little while, and the number of workloads on AWS infrastructure plus the number of people using SageMaker and other AI services is palpable, it’s substantial. And by the way, we’re so early that even if anybody did get an advantage, you’re talking about this much of the market. But the speed at which you can work with a large partner, get products into production, put them in the marketplace, and get them deployed is very, very impressive. And probably a little bit of a sign of things to come. Now, that allows me to take this home. So I’d like to hear from you both, and Mona, I’ll let you go first, because Ahbas has been answering these questions so well that he doesn’t leave you anything to talk… I’m kidding. I’m kidding.
Mona Chadha: I know. I was going to say, I don’t have much to say.
Daniel Newman: You found your words, but what’s next? And this is a moment to talk about the partnership, but also, the world always wants to hear about AWS. So where does this evolve in terms of AWS and Cloudera? Where do you see this whole generative AI movement going?
Mona Chadha: So I’ll talk about it from our perspective, because you brought it up, I think you teed it up, where there’s this perception that we’re late to market. And I’ll just say, on the Amazon front, we have been working with AI for over a decade, and it is visible in some of the solutions that we have today, like Amazon Go with the stores, the cashless, the-
Patrick Moorhead: Small little retailer.
Mona Chadha: –cashierless stores, a small little retailer, plus what we did with Alexa devices. Those are the things that come to mind; we’ve been at this for a long time. And as we’ve approached AWS with the whole generative AI lens, it really is around three mega layers. The first, which we talked about, is infrastructure, and having AWS Trainium and Inferentia has been a game changer when it comes to running your models, because you need that compute power. The second layer is almost like LLM or foundational model as a service, being able to integrate LLMs into your applications via an API; that’s Amazon Bedrock, as an example. And then the third layer is really around applications, being able to bring generative AI into your applications, which is what we have with Amazon CodeWhisperer: being able to develop code really quickly, almost like a code assist, to then develop innovation faster.
So to your point about acceleration, this is where we’re headed: as a partnership, to get customers and ourselves to innovate quicker and faster. So you’ll start to see even more integrations between Cloudera and AWS, and it’s going to be around many different facets across those three layers. But also, the intent is that with all the data that you have, you can make sense of it and then use it to innovate faster. The other angle, I would say, is bringing partners to the table, because customers are going to have these complex use cases. So I think one of the things that’s really important is to also bring SIs and channel partners to the table that can really help deploy it quicker and faster to our end customers. So you’ll start to see a lot of that from us as a partnership as well. And I’ll leave you with something to say, because I’m kind like that.
Patrick Moorhead: Well it is a future question, so it’s pretty much boundless here.
Ahbas Ricky: No, I think you’ve hit the nail on the head on the majority of the points. Just two points to accentuate and double-click on. First, the commercial agreement that Mona mentioned. That is an important milestone, because while we are a hybrid data company, we are public cloud first. We want to make sure the majority of our customers are able to, A, get onto the public cloud as fast as possible, but also be able to leverage the credits that they might have with AWS, and we’re the fastest way to get them to burn down those credits, because of the strategy that we have, because of the data that we have. So this commercial agreement, the Strategic Collaboration Agreement, will actually create a push in the go-to-market teams, which was one of the key things that came out of it. Because we always talk about the cool product integrations and the story around that, but ultimately, as you guys know better than anyone else, you have to sell it.
Patrick Moorhead: Well, I’m so glad you brought that up on GTM, I’m like, whew. And then you’re talking about it too. Because the best technology doesn’t always win. It has to be taken forward in the marketplace, positioned well, sold, and there has to be economic opportunity all the way along the value chain. And again, I view that as the growing up of the cloud. It’s a teenager, it’s 14 years old, and we’re to the point where, with a lot of the companies who are engaging in it, if you had asked me five years ago whether certain companies would be, I’d be like, no way. And to your credit, I see names showing up in the marketplace that I never would’ve expected.
Mona Chadha: You just talked about this, right? You were mentioning.
Ahbas Ricky: Exactly.
Patrick Moorhead: Imagine that we’re bringing this into the conversation here.
Mona Chadha: Yeah, exactly.
Patrick Moorhead: So yeah, sorry to cut you off of that.
Ahbas Ricky: No, no, totally fine. At least she was being kind. That is true.
Patrick Moorhead: Exactly.
Ahbas Ricky: No, I’m joking. No, I think that was the first point, the commercial point. And the only other thing, which I mentioned when Mona brought up further integrations, there might be more, that is true. And we’ve even started to work with some of the vertical solutions teams, because financial services and telco are two industries where we have a lot of joint customers, and we want to get them to a point where they can develop end-to-end applications in an interoperable way at a price point of their choice. So that is one of the other things we’ll continue to develop.
Mona Chadha: And that’s where a lot of customers in those industries have asked, “Hey, we want to purchase through Marketplace,” because of all the benefits that they get out of it. So AWS Marketplace makes it real. All the solutions that we’re building, all the cool, innovative, transformative stuff that we have through the SCA, are brought to life now through the AWS Marketplace to our end customers.
Daniel Newman: And dealing with the industries, dealing with the complexities, and then providing a marketplace really does match what we see as the way rising decision makers want to consume, which is very important in the AI era, too: that we make things more frictionless. So it sounds like you’ve hit it on the head. We look forward to tracking that. And of course, our opinion will mean a lot in terms of whether it actually succeeds.
Patrick Moorhead: I thought my opinion meant more. No, I’m just kidding.
Daniel Newman: We know.
Patrick Moorhead: As AWS, yes.
Daniel Newman: We know, we know. Ahbas, Mona, thank you both very much for joining us here on The Six Five.
Mona Chadha: Thanks for having us.
Ahbas Ricky: Thank you for your time. And hopefully we’ll see you in Vegas, or maybe Abu Dhabi.
Patrick Moorhead: You will see me. Waiting for that invitation.
Ahbas Ricky: We’re all three looking at somebody.
Mona Chadha: I know.
Daniel Newman: We’ll definitely have you back soon.
Mona Chadha: All right, sounds good.
Patrick Moorhead: Thank you.
Daniel Newman: All right everybody, thank you all so much for joining us here at Evolve 2023 in New York City. Pat, it’s been a lot of fun. Hit that subscribe button, join us for all the other episodes that we did. We covered a ton of ground, but for this show, for this particular episode, Pat.
Patrick Moorhead: We’re out of here.
Daniel Newman: Thank you.
Patrick Moorhead: See you later.
Author Information
Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.
From the leading edge of AI to global technology policy, Daniel makes the connections between business, people, and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, the Wall Street Journal, and hundreds of other sites around the world.
A 7x best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.
An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.