
Making Markets EP 44: Elastic CEO Ash Kulkarni on How Generative AI is Changing the Technology and Business Landscape

In this episode of Making Markets, Ash Kulkarni, CEO of Elastic, talks with host Daniel Newman about the company’s recent earnings and growth of Elastic Cloud, the increasing demand for its solutions in the areas of search, observability, and security, and how he expects generative AI to help Elastic and other companies get more out of information systems in a much more human, intuitive way.

You can grab the video here and subscribe to our YouTube channel if you’ve not yet done so.

You can also listen below or stream the audio on your favorite podcast platform — and if you’ve not yet subscribed, let’s fix that!

Disclaimer: The Making Markets podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such. 

Transcript:

Daniel Newman: In a search for relevance, Elastic has long been a leader. In this episode of Making Markets, I talk to Elastic’s CEO, Ash Kulkarni, about the business’s most recent quarter and full-year results. We also talk about the impact that generative AI is having on not only Elastic, but on the technology and business markets as a whole. We’re back, so stay with me here. It’s Making Markets.

Announcer: This is the Making Markets Podcast, brought to you by The Futurum Group. We bring you top executives from the world’s most exciting technology companies, bridging the gap between strategy, markets, innovation, and the companies featured on the show. The Making Markets Podcast is for information and entertainment purposes only. Please do not take anything reflected in this show as investment advice. Now, your host, CEO of The Futurum Group, Daniel Newman.

Daniel Newman: Ash Kulkarni, CEO, Elastic, welcome to Making Markets.

Ash Kulkarni: Thank you for having me.

Daniel Newman: It’s great to have a chance to sit down with you. Your pretty recent appointment is CEO at Elastic. Talk a little bit about that transition. I know you brought a lot of experience. McAfee… You came in to run products here at Elastic. How’s that transition going for you, Ash?

Ash Kulkarni: Yeah, thanks for asking. It’s been about a year and a half now, so it does not feel new anymore. But what made the whole transition very easy was that I came to Elastic having first been both a user and a buyer of Elastic. So I really knew the technology. I understood the market opportunity.

I always felt that when you think in terms of unstructured data and just how much data we are creating, Elastic is one of those rare technologies out there that allows everyone to really get the answers that matter to them in real time from any data that they might have, and that’s a really strong value proposition, and I’d seen that in the past. And so when I came to Elastic to be the chief product officer, it was knowing that this technology has tremendous opportunities ahead of it. And it was the team; it was absolutely one of the finest teams that I’d ever met.

And then, about a year into my job, as we were continuing to build on the foundations that had been set ahead of us, I had the opportunity to take on this role, and what made it much easier even then was the fact that it was a team that I knew. It was a company that I’d really gotten to understand. Shay Banon, our founder and then CEO, who brought me into the company, went back to his earlier role as CTO, and he’s back to driving some of the core forward-looking innovations that we are working on within the company.

And just being able to now lead this company to the next stage of its growth and evolution has been a wonderful experience, so lots of new learnings, as you can imagine in any new job. And as you know, we’ve gone through some pretty challenging macroeconomic times. But all of that notwithstanding, I’d say it’s been a pretty smooth transition, and it’s just been a joy working with this team.

Daniel Newman: Absolutely. And you said a few things probably worth unpacking. First of all, the inflection point that we’re at, as it pertains to being able to find what you want within a massive exponential growth of unstructured data, cannot be overstated, and I’m going to come back and talk to you a little bit more about that, Ash, when we go a little bit deeper. But you did recently report Q4 and the full year. Just talk a little bit about the highlights.

Ash Kulkarni: Yeah. So very proud of how the team delivered. We grew the business by 19% in constant currency in Q4, and Elastic Cloud grew by 30% in constant currency year over year for the full year, because our fiscal year ends in April. So this was not only Q4; we also reported the entire year. We crossed the billion-dollar mark, and for the whole year, we grew the business 28% year over year.

So really proud of the traction that we’ve had across all segments that we operate in, whether it’s search or our solutions in observability and security. We grew the total number of customers by over 300, and now we have over 20,000 customers, and customers that pay us over a million dollars, as well as customers that pay us over a hundred thousand, grew very nicely.

So, all in all, I’d say, given the environment around us, it was a pretty fantastic performance by the team, and it just goes to show the commitments that our customers are making as they consolidate onto our platform. Because that’s another big trend that we are seeing, that given the performance that we deliver for our customers, the value that we deliver, and the total cost of ownership, which is unbeatable, we are seeing more and more customers consolidate onto our platform for an increasing number of use cases, and that’s just the strength of our platform approach. So very pleased with the performance.

Daniel Newman: Yeah. Well, and a couple of things. As industry analysts, we don’t hyper-focus on earnings themselves, but I always say it’s the ground truth. So, while a lot of times industry analysts want to focus just on the product and equity analysts want to focus just on the numbers, really, the two things tend to have a significant amount of interdependence here. And so when you look at these results, for instance, a couple of things that you said.

One, the large customers spending more than a million: always very indicative of how sticky the solution is, and oftentimes of what I call net revenue expansion, because a lot of times to get to those numbers, it means they’re adopting multiple parts of your portfolio, not just a single product, for instance. And then that overall customer growth tends to always tell a story as well. A, are you grabbing large customers? B, are new customers signing up and spending money with you? Those are the things that, to me, are always indicative that you have a product people see as high value for building, and in your case that means building their products, often on top of the Elastic engine.

Now, one thing you said, and I just want to get a little bit… Because when I have CEOs on the show, I like to focus on… The macro for 2023 was a little bit tough. Coming into the year, there were a lot of recession expectations. Now, we still have pretty high inflation. We still have high interest rates, although to some extent the ten-year seems to indicate the market doesn’t fully believe it. But at the same time, we’ve seen a little bit of this crazy run around tech, but it seems to be mostly on this GenAI, meaning a few names are getting a lot of value. But what’s your feeling on the overall macro? Are we coming to the other side of it, now that the supply-chain issues and the chip glut are over and we’ve got this cool GenAI thing that’s driving demand, or do we still have a little ways to go?

Ash Kulkarni: So the way I see it… And bear in mind that I talk to a lot of customers and I’m traveling constantly and meeting with our customers. And the thing that I’m seeing is that across the board, customers haven’t stopped worrying about optimizing their spend. That’s something that they’re doing actively, and in some ways we are leaning in and helping them do it, because fundamentally the future isn’t exactly certain. People are seeing lots of positives, the ones that you talked about, but then there are also the concerns: the concerns about high inflation, the concerns about the war in Ukraine… Russia’s aggression there hasn’t fully stopped.

So the question is, how does this evolve from here? And the approach that I’m seeing customers take is one of very pragmatic, thoughtful, cost-conscious behavior, and that’s good. That’s not a bad thing. And so the way we are approaching it is actually helping them optimize their spend. But every time they optimize their usage for any given workload, it becomes an opportunity for us to also talk to them about this: given that we have this amazing total cost of ownership and we are in it with them, helping them really get the most value out of their investment in Elastic, how can we help them do the same with other use cases that they might not be running on Elastic today?

They might have observability use cases or security use cases that they might be using other incumbent solutions for that aren’t as flexible, that don’t give them that same total cost of ownership. And what’s been happening, at least in the last few quarters, is we’ve been leaning into this motion. We are seeing larger and larger commitments from customers to move those workloads onto our platform, and this comes from the intense customer-centric approach that we take to the market. We are constantly in it with them, not trying to stop them from optimizing, but really leaning in and helping them optimize.

So when I step back, what do I expect for the future? I think it’s really hard to know precisely how the macro environment will trend, but I do believe that technology overall is going to continue to be the winner, because no matter what, technology is helping customers get more value out of their investments, helping them do more with less.

I think the old adage that software is going to eat the world, or dominate the way we think about things, I don’t think that’s changing. I think it’s ever more true. And when you talk about things like generative AI, I think it goes even further in helping us approach information systems in a much more human, intuitive fashion; get more out of the systems and processes that we have; really become much more efficient. I think it’s a great thing, and companies like Elastic that have been investing in this area for a while, and others, I think, will be big beneficiaries of this.

Daniel Newman: Yeah, Ash, I like where you’re going with that. I actually published a piece at the end of 2019/’20 on MarketWatch, and it was funny. I said that in 2020, semiconductors would eat the world, because you can’t run software on air, but you really do have the marriage between the two. And with what we’re seeing right now, for instance with NVIDIA’s recent run-up, there’s just the incredible power of hardware and software working together, and of course, things like Elastic sit right on top of that.

And by the way, I really wanted to mirror some of your comments about the customer-obsessed approach. For companies like yours that are building and growing beyond a billion dollars, I’ve truly seen that a customer-obsessed culture can be the differentiator between getting from that first billion to that next billion.

Another former guest of the show, Charlie Giancarlo, Pure Storage’s CEO… Actually, I don’t know if you’ve ever tracked their incredible focus on CX, but they post their Net Promoter Score every quarter, and they talk about how it’s four times higher than anyone else’s in the storage game.

So being able to quantify the customer-obsessed approach… Obviously, first it’s qualitative; it’s anecdotal. You’re talking about customers coming to you and saying, “Look, we’re going to commit more,” and then eventually, it becomes quantifiable, where it’s like, “Hey, let’s go over and over again,” and then it becomes a momentum-builder, and it seems, with your growth rates in cloud and overall revenue, that you’re there.

Now, we’ve flirted here, Ash, around GenAI. And so no more flirting. We’re going to go right into it here. In the GenAI realm, this has been the biggest thing, and with you guys being the search across unstructured data, and really all data, it’s a massive opportunity for you. Can you talk a little bit about your recent news launching, what is it, the Elasticsearch Relevance Engine? You announced it at Microsoft Build. Talk a little bit about that, what it means for your play in GenAI, and how you see it really aiding the next wave of growth for Elastic.

Ash Kulkarni: Yeah. So Elastic has always been about search, and we’ve always… If you think about search from a historical perspective, we started 50, 60 years ago with SQL. It was one of the most structured and one of the earliest languages in terms of how you interact with information systems. And then when search came along, it was because there was more and more unstructured data that we were trying to mine and analyze, and search gave a much more human, intuitive approach.

Now you can just type a string into a search bar and you get a response, but that response ranks the answers in a way that is relevant to you. So search has always been about relevance, and now generative AI is the next incarnation. It’s the next wave of more human-centric, chat-like, intuitive experiences, again, with these information systems.

And the perfect example of where we fit in is if you step back and say, “What do you need to build a generative AI application?” I think there are two components that are must-haves. You can’t do without these two. The first is the model itself: a machine-learning model. The ones that have seen the most prominence are obviously these transformer models, what we refer to as large language models, like GPT-4, like Bard from Google, et cetera, and there are many more out there. And these models themselves are evolving. They’re getting smaller, they’re getting more domain-specific, et cetera. But the first thing you need is this model.

The second thing that you need comes from the fact that these large language models are all trained on data that is publicly available. So if you go and ask… And try this experiment today. Go to ChatGPT and ask ChatGPT, “How many people have used ChatGPT today?” And quite literally, the response you get is something along the lines of, and I’m paraphrasing, but this is pretty much it: ChatGPT is a large language model and so does not have access to the information you’re asking about.

So the way to interpret that is ChatGPT does not have any access to the private data of its own creators, OpenAI. So that is a very interesting aspect about large language models. They have no understanding of a business’s proprietary data. And so if you want to build generative AI applications that are relevant to your business, you need to somehow bring the context of your proprietary data to these large language models.

An example of this would be if you go to ChatGPT today and say, “I live in Minneapolis and I want to build an irrigation system for my two-acre backyard,” it’s probably going to give you answers that are reasonably good but not quite specific. So it might tell you, “Oh, you need about four sprinkler heads and you need 80 feet of pipes, and the pipes need to be rated for this temperature because you live in Minneapolis, and you’re going to need some water regulators,” et cetera. But it has no idea what Home Depot sells in the store closest to your house.

Now, imagine if Home Depot wanted to provide that same experience on their website, where they wanted you to come in and type that same query. Imagine that they’re using Elasticsearch behind the scenes. We would have all of their inventory information, and now we could bring that together with something like OpenAI’s GPT-4.

And now we could, on that website, when you ask that same question, specifically say, “Oh, you need four sprinkler heads of this particular type. Click here if you want them delivered to your home. You need 80 feet of pipes. They’re all available right now in the store. You can either pick them up or we can deliver them. You need these water regulators. Two of them are on back order, but we can have them delivered in a day. Click here and they’ll all be done.” That experience, every business will want to deliver.

So whether you’re a Home Depot, whether you are a travel company, whether you are an eCommerce store, it doesn’t matter what business you’re in. Every business will want to deliver these kinds of experiences, and the role that Elastic plays is in being that search technology that gives the private context of your information to these large language models and then brings these two things together to enable everyone to build generative AI applications. Effectively, we believe we are democratizing generative AI for every business. What others like OpenAI are doing for the consumer opportunity, we are doing for businesses, and that’s pretty exciting from our perspective.
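
To make the pattern Ash describes concrete, here is a minimal sketch of that retrieval-augmented approach: fetch private, business-specific context from Elasticsearch first, then pass it to a large language model as grounding. The index name, field names, and inventory schema are hypothetical illustrations, not Elastic’s documented Elasticsearch Relevance Engine workflow or Home Depot’s actual data model.

```python
# A minimal sketch of retrieval-augmented generation with Elasticsearch.
# Assumptions: a hypothetical "store-inventory" index with a "description" field,
# the official elasticsearch and openai Python clients, and OPENAI_API_KEY set
# in the environment. Illustrative only.
from elasticsearch import Elasticsearch
from openai import OpenAI

es = Elasticsearch("http://localhost:9200")
llm = OpenAI()

question = (
    "I live in Minneapolis and want to build an irrigation system "
    "for my two-acre backyard. What do I need, and is it in stock?"
)

# Step 1: retrieve the proprietary context the LLM has never seen --
# in this example, matching inventory records.
hits = es.search(
    index="store-inventory",                      # hypothetical index
    query={"match": {"description": question}},   # simple keyword relevance
    size=5,
)["hits"]["hits"]
context = "\n".join(hit["_source"]["description"] for hit in hits)

# Step 2: ask the model to answer grounded in that retrieved context.
response = llm.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system",
         "content": "Answer using only this store inventory:\n" + context},
        {"role": "user", "content": question},
    ],
)
print(response.choices[0].message.content)
```

The key design point in this sketch is that the model only sees the retrieved snippets at query time; the proprietary data itself is never used to train the model.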

Daniel Newman: Yeah, I think that’s a really nice connective tissue to something that I’ve been saying for a while, which is that at this point, the open large language models, Bard, ChatGPT, the models on Hugging Face, are table stakes. Everybody can use them, they’re accessible, and the data that’s being utilized is mostly common, meaning they’re scraping the open internet for data.

Each does it a little differently, and having played with all of them now, I can tell you there are some things that each of them does remarkably well. Google’s Bard is great for SEO stuff. Of course it is, because Google has the massive experience of doing search for the longest period of time.

Having said that, what I’ve been saying for a while is that it’s going to be the ability to layer in your private data, the enterprise data, the high-value data sets, and it’s going to be that waterfall from large language models to medium to small to micro-type language models that you can then build and utilize and tie together. It’s about protecting your enterprise’s private and proprietary data, but being able to combine that private data with the language models so that the queries you ping into your system return relevant answers.

And I’ll tell you this, Ash. Even my firm, in the high-tech research and information services business, launched something called Futurum AI, and the idea was the first AI analyst, because my belief is that in this business there’s a knowledge base, and companies like Elastic want to know what the analyst or the firm thinks about something, or what data they have available.

And the point is that that query response needs to feel very much like you’re interacting with the analyst, and that means it needs to, A, understand the language, and B, understand the relevance, to your point, of the data. And it has to get its arms around the relevance of the data. Someone might ask, “Hey Dan, what do you think about Microsoft Azure versus AWS?” And I’ve talked about that 50,000 times. So how does it know what’s most recent, what’s most relevant, what’s most similar to my current language? A tool like Elastic, coupled with the private data sets, coupled with the open-source large language models, is how that happens.
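
As an illustration of how that “most recent, most relevant” ranking could be expressed at retrieval time, here is a hedged sketch using Elasticsearch’s function_score query with a date decay. The index name, the “published” date field, and the decay parameters are assumptions for illustration only, not a description of how Futurum AI is actually built.

```python
# A hedged sketch: combine keyword relevance with a recency decay so that newer
# analyst commentary on a topic ranks ahead of older notes. The "analyst-notes"
# index and "published" field are hypothetical.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

results = es.search(
    index="analyst-notes",  # hypothetical index of research notes
    query={
        "function_score": {
            "query": {"match": {"body": "Microsoft Azure versus AWS"}},
            "functions": [
                # Decay the score of older notes so recent commentary ranks first.
                {"gauss": {"published": {"origin": "now", "scale": "90d", "decay": 0.5}}}
            ],
            "boost_mode": "multiply",
        }
    },
    size=3,
)
for hit in results["hits"]["hits"]:
    print(hit["_score"], hit["_source"].get("title"))
```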

So it’s very exciting. We’ve just a minute or two left. I’d love to just wrap. How big does this get? There are a lot of people who call this peak hype, and then there are other people who are saying this is absolutely a trend that’s going to stay. Are you buying the hype at this point, Ash? Is this generative AI thing going to last, and how important is it for Elastic’s long-term business outlook?

Ash Kulkarni: The interesting thing about anything this transformational… I think we can all agree that this is big, right? People talk about hype or not hype, but nobody’s disagreeing that what this technology is able to do is pretty amazing. And if you roll back the clock on every single major technology platform shift that has happened, at least since I started my professional career… So in the mid-’90s, people were talking about peak hype for the internet, and then there was mobile computing and then cloud computing and so on.

So if you think about the internet in those days, if you take the most wildly optimistic assessments that people had made of what the market opportunity could be, of what the internet was going to enable us to do, and you look now, you would say, “Oh my gosh. That was the craziest underestimation of what the opportunity really ended up being.”

So it’s hard to tell. How many people today do not have a mobile computing device that they’re carrying around with them? And when Apple first introduced their phone, nobody thought that this made any sense. I mean, there were people who were super excited and there were people who were like, “I don’t know why I need this.” So I think time is going to be the one factor that truly demonstrates the enormity of this opportunity.

I’m a believer. I’m a believer that this is fundamentally going to improve the way we access information systems. It’s going to make information systems far more intuitively available to everyone. It’s going to democratize access in a way it hasn’t been democratized in the past. I think the difference in some ways is that, like with every new technology, the timing becomes something you have to work through. Just like with the internet, there were questions of monetization. There were questions of who was going to invest in the infrastructure. There were questions of what kinds of business models were going to emerge.

You’re looking at similar, and just as interesting, if not more interesting, questions about data privacy and data security and data provenance. If I pass some data to a large language model that was proprietary and copyrighted by somebody else, do I now owe that person payment? What about model bias, and what happens if the models haven’t been trained on sufficiently broad data sets, so that you end up with responses that truly lead you down the wrong path?

There’s a lot that we need to figure out, but all these problems are solvable. I think this is one of those areas where I’m a believer in our ability to overcome these kinds of technology challenges. In the long run, I think what we are looking at right now is going to make us truly believe that this was one of the most amazing times in technology history. But how big? How fast? For me, it’s wait and watch, because as customers start to implement these things, we are truly going to understand the scope of what we are looking at. I’m very, very excited about what this means, not just for Elastic, but for everyone as a whole. Yeah. I’ll leave it at that, but I’m excited.

Daniel Newman: Let me just say that you and I share a sentiment when it comes to generative AI and AI as a whole. I think generative AI has democratized and commercialized and consumerized it to some extent, but as I said, Google Workspace has been finishing my sentences for a few years now. So there have been generative applications in our lives; we just got the killer app. And while generative AI is very real, it remains to be seen whether the metaverse ever makes a return.

And I think the reason we end up asking questions like, “Is it hype or is it real?” is because a year ago we were hearing that with the metaverse, we would be doing all of our meetings wearing headsets. We were never going to go into an office again. And yet here we are, you and I running from city to city, hopping off planes, trains and automobiles, because we still go meet people. We still sit down and break bread. We still aren’t wearing goggles to our meetings. But we are using generative tools paired with our most valuable enterprise data to make decisions that put our businesses at a competitive advantage.

So with that, Ash, I just want to thank you so much for joining me here on Making Markets. We’ll have to have you back soon.

Ash Kulkarni: Thank you very much. It was a pleasure.

Announcer: Thank you for tuning in to Making Markets. Enjoy what you heard? Please subscribe to get every episode on your favorite podcast platform. You can also watch us on the web at futurumresearch.com/makingmarkets. Until next time, this is Making Markets, your essential show for market news, analysis and commentary on today’s most innovative tech companies.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances on CNBC and Bloomberg, in the Wall Street Journal, and by hundreds of other outlets around the world.

A 7x best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.

