
Making Markets EP48: Elastic CEO Ash Kulkarni on Generative AI, Elastic’s Recent Earnings, its Product Evolution and the Company’s Solutions for Search & Other Challenges

In this episode of Making Markets, Ash Kulkarni, CEO of Elastic, joins host Daniel Newman for a conversation about Generative AI, Elastic’s recent earnings, its product evolution and the company’s solutions for search and other challenges.

You can grab the video here and subscribe to our YouTube channel if you’ve not yet done so.

You can also listen below or stream the audio on your favorite podcast platform — and if you’ve not yet subscribed, let’s fix that! Click here to check out more episodes of Making Markets.

Disclaimer: The Making Markets podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.

Transcript:

Daniel Newman: Generative AI is red hot and for Elastic it has been the driver of a great year. CEO Ash Kulkarni joins Making Markets once again to talk about the company's recent earnings, its product evolution and why customers are turning to Elastic to solve not just search, but other challenges the Elastic platform can address. Join us for this and so much more. You're tuned in to Making Markets. Elastic CEO Ash Kulkarni joins Making Markets once again. Ash, how are you doing?

Ash Kulkarni: Doing very well. Thanks for having me again.

Daniel Newman: It's great to have you back. I recently caught up with Jon Fortt, who I actually watched interview you on Fortt Knox not too long ago, and I told him, I said, "Jon, I'm a little jealous, but I'm going to bring Ash back and I'm going to talk to him too here." So let's just try to do a little better. I'm just kidding. Jon's a good friend. He's a great guy.

Ash Kulkarni: He’s a great guy.

Daniel Newman: You came off what I would say is a really strong quarter this past quarter, Ash. Talk a little bit about, just give me a little bit of the readout, the play by play and what led to this last quarter being so strong for you.

Ash Kulkarni: Yeah, I was really happy with the performance. The team delivered really well. We grew the top line by 17%. Cloud grew very well at 31% year over year, and our operating margin was 13%. So strong results across the board, both on the top line and the bottom line. And Dan, just to put it in context, in terms of the trends that we saw in the business, I'd call out three specific things that really drove our performance in the quarter. The first was generative AI, and I know this is something that I talked about the last time you and I spoke. But we are really seeing good momentum in terms of customer adoption of our vector search functionality and our capabilities around what's now well understood in the industry as retrieval augmented generation, this idea of grounding large language models. So we are seeing a lot of interest in that area.

In Q1 of our fiscal year, I talked about the fact that we have hundreds of customers that are using us for generative AI use cases. In Q2, we saw hundreds more using us for generative AI, which is great from our perspective; we saw good momentum there. Now from a revenue contribution standpoint, these use cases are still quite early because we recognize revenue on consumption, so many of these use cases are still to ramp up. So from a revenue contribution standpoint, this wasn't a very significant area, but just from an adoption standpoint, we see this as a really good indicator for future quarters and years to come. The second thing that we saw, which was wonderful to see, was our continued ability to get customers to commit to the Elastic platform to do more and more things. So consolidation onto our platform, especially displacing incumbents. And on the earnings call I talked about some really wonderful wins that we had in this area.

And then the third factor that really helped us was that from a cloud optimization standpoint, things have stabilized. About a year ago we talked about the fact that, given the macroeconomic constraints, we saw customers actively trying to really optimize their existing workloads. And in Q1 and in Q2, we saw that that has stabilized. So customers have generally gotten to where they want to be in terms of optimization, and now they're really focusing on these new workloads that they've committed to bringing onto our platform, and those are starting to ramp up. So all in all, I'd say the strategy that we've put in place is working well, and we saw some of those results.

Daniel Newman: Yeah, I remember talking the first time you were on the show, Ash. First of all, it was sort of a year that was a bit tailor-made for Elastic in the AI era, as we see it, because for LLMs to function, search is so important. And so people that probably had never even heard the term vector search probably heard it this year more times than they would care to count. But also, generative AI wasn't entirely a brand new thing. I mean, people that have been using things like Google apps may have seen it finish their sentences for them for some time. These things have been happening, and the ability to understand semantics and syntax to help people predict and such has been something that's been going on a long time. And Elastic has been behind search for pretty much everybody.

Anybody that's using a meaningful search function has been using your technology. So you came into it at a really good time. I want to come back to that in a minute though. I want to zoom out. One of the great things about this show, and why I enjoy talking to you and so many of your peers, is getting that broader look. You mentioned the early-year macroeconomic view. To some extent, some may say the tech industry was saved by AI this year. You certainly know that if you were heavily rotated to PC, CPU, traditional network or other infrastructure, this was not necessarily a great year. However, if you were well positioned to be part of this AI story, it was. So it seems that we've recently come to maybe the end of a very aggressive rate hike cycle. It seems that we came potentially to the end of a really tough inflation period that had brought prices of everything from rent to GPUs through the roof due to various supply economics. But at the same time, now we're hearing about cuts, and yet you still have low unemployment, we still have GDP growth. There's so much conflicting data. So without doing the full economic readout, what do you think? What's going on out there? Do you think we're about to see maybe a broader, stronger tech economy that won't be so uniquely weighted to just AI, or do you think '24 may be a little bit more of the same?

Ash Kulkarni: Yeah, I can give you my read from all the meetings that I'm having with CIOs and other senior executives in the customer cohort that I speak with. We have over 20,000 customers and I'm constantly on the road meeting with customers. Generally, what I'd say is everybody's trying to read the tea leaves on how the macro, especially in the context of quantitative tightening and just the rate hikes, starts to ease, and when it starts to ease. There is hope, but nobody can quite predict the timing. Everybody's trying to understand Chairman Powell's comments, and all of that is true for just about everyone that I speak to. In that context though, what I'm hearing from most CIOs is they expect budgets to largely be flattish going into the next calendar year. So nobody is anticipating a reversion back to the 2021 days when things were just going up and to the right. And the expectation is that expenses will need to be managed very carefully and very thoughtfully. At the same time, what I'm also hearing is that there is this clear desire and just a mandate across many organizations to make sure that there is investment happening in these generative AI efforts. And it's pretty early by every indication, but think about what some of these use cases are about, when you talk about improving customer support and customer success in an organization.

One of our customers, Cisco, has talked about what they have done using Elastic to improve their customer support for one of their specific use cases. And there is a clear association between that generative AI application and efficiency. Business efficiency and a better customer experience just mean that that is the kind of project that will get funded. So now you imagine the CIO's predicament. You have a flattish budget and you still have to somehow find the dollars to invest in these gen AI use cases, and that means that something else has to give. And that's what I think we're going to see in 2024. There's going to continue to be some degree of pressure, which means that there is going to be a selective focus on technologies that you can do a lot more with than just pure-play, one-off kinds of tools. And that to some extent has been playing to our advantage. That's the angle that we've been leaning into. We tend to have a really strong value proposition in terms of value for price, and it's a platform that lets you do so much, irrespective of whether it's search, observability, security or many more use cases. And that's where we are leaning in from a macro standpoint. Dan, honestly, I'm not hearing anybody saying, "Hey, it's the go-go days again." So there is a lot of thoughtful introspection of budgets. I think that that's going to continue.

Daniel Newman: Yeah, we might have to use a phrase like cautious optimism maybe.

Ash Kulkarni: There you go.

Daniel Newman: Because the weird thing is, we were the first to start to see the market go down, remember? Especially growth companies, and you were part of what I would consider to be an exciting growth company, and it really goes back to November of '21. I remember being in the car around Thanksgiving, just watching the market falling, and I'm like, "This is precipitous. This is crazy." But at the same time, some of the big names took a lot longer to start. The big seven type names didn't have that same fate so soon. It was a couple of years of this really aggressive drop and pull away from growth. And I think it really all happened when we saw the rates were going to have to go up. As soon as that happened, discount rates start getting calculated, the funds and all the analysts start changing the spreadsheets a little bit, and boom, off it goes.

But the one thing you said though is that budgets, and I think this is where a lot of the consternation, Ash, is going to be: if it's still only a dollar, how does that dollar get spent? Because companies are not going to stop investing in AI; they're going to further lean in. So in your opinion, is it all about the ability to invest in platforms and tools and technologies that give as much diversification as possible while enabling AI? When I talked to your CPO, Ken Exner, he was really leaning into the fact that Elastic started with this kind of rich search capability, but that you see this whole stack sort of emerging. Are you seeing a lot of net revenue expansion, customers figuring out how to take that dollar they're spending with you further in order to maximize these relationships and not see your business flat just because your clients' budgets are flat?

Ash Kulkarni: I think you nailed it. I think there are going to be two kinds of implications to this, in the near term and then also in the long term. When you think about it, really what AI is doing is enabling use cases that weren't possible in the past. And effectively, search is such an important component of generative AI because these large language models are just fantastic at being able to predict the next word and the next word, and they're able to create these completely conversational experiences. But you have to ground them to make sure that they're basing their response on the facts and context that are relevant to your business. And that idea of grounding is, from an architectural standpoint, often referred to as retrieval augmented generation, and that's all about search. How do you find the most relevant information from your organization's data to then ground that large language model, or effectively tell it, "respond to this question based on only this information"? And because of all these new use cases it's enabling, it's going to increase the total addressable market for search.
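To make the retrieval augmented generation pattern Ash describes concrete, here is a minimal, illustrative sketch: retrieve grounding passages with an Elasticsearch kNN vector search, then constrain an LLM to answer only from what was retrieved. This is not Elastic's own implementation; the index name, field names, and the embed()/ask_llm() helpers are hypothetical placeholders.

```python
# Minimal RAG sketch (illustrative only). Assumes an Elasticsearch 8.x cluster
# with an index "docs" containing a dense_vector field "embedding" and a text
# field "body". The embed() and ask_llm() helpers are hypothetical stand-ins
# for whatever embedding model and LLM you use.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

def embed(text: str) -> list[float]:
    """Hypothetical: turn text into a query vector with your embedding model."""
    raise NotImplementedError

def ask_llm(prompt: str) -> str:
    """Hypothetical: call your large language model of choice."""
    raise NotImplementedError

def answer(question: str) -> str:
    # 1. Retrieval: approximate kNN search over the dense_vector field.
    resp = es.search(
        index="docs",
        knn={
            "field": "embedding",
            "query_vector": embed(question),
            "k": 5,
            "num_candidates": 50,
        },
        source=["body"],
    )
    passages = [hit["_source"]["body"] for hit in resp["hits"]["hits"]]

    # 2. Grounding: restrict the model to the retrieved passages.
    prompt = (
        "Answer the question using ONLY the context below.\n\n"
        "Context:\n" + "\n\n".join(passages) + "\n\n"
        "Question: " + question
    )
    return ask_llm(prompt)
```

A production setup would typically add hybrid keyword-plus-vector retrieval and relevance filtering, but the grounding idea is the same: the model only sees what search returns.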

And I'm seeing just a tremendous amount of interest in that area. So the Cisco example, but also DocuSign was another wonderful example. We talked about it publicly on our earnings call. They are now building full document search capability, so you can search across images in their documents, and you can search semantically, using the semantics or the context of what you're looking for. Very rich functionality that wasn't possible in the past is now being enabled. So I expect that the total addressable market, the TAM, will increase in the coming years. How big that'll be for search, I don't know yet, but I think it's going to be very significant. This is a big platform shift. I feel that AI has the potential to be a platform shift just like we saw with mobile computing and other such really significant shifts in the past. But then you take that same concept of search and what gen AI is doing to it and ask, how does this now apply to other domains: security, observability, risk analytics, so on and so forth?

You can see that all these domains have the ability to get better at what they do because of generative AI. And that's the way we are seeing the world. So we have taken our own technology, all the vector search functionality that you mentioned earlier, and used it to build AI Assistants for observability and security. And our field leads with them; our customers are finding them to be incredibly exciting. I'll just give you an example of the kinds of things you can do with it. If you see an alert in Elastic Security, you can, in natural language, ask it to explain to you what that alert means, and it'll break it down and tell you, oh, this is related to a particular vulnerability in Microsoft operating systems, based on whatever it might be, and we detected a particular malware that might be on this device. You can then ask it what to do next, and it'll create the appropriate underlying queries for you and say, do you want me to run this query? I'll be able to find where else this issue might exist within your overall IT real estate.

So in effect, it's making the job of the analyst that much easier, and we believe that this is going to allow us to compete a lot better in observability and security, because as data is growing, the challenge is not going to be about the tooling. The challenge is going to be about the data analytics and how you use AI to get to better answers. So it's not about the dashboards and the green and yellow and red lights; it's about quickly getting to the actual root cause, diagnosing it and remediating it using AI. And we feel that we have an asymmetric advantage there given the investments we've made in technology. That's not going to be a TAM increase, but it's going to improve our competitiveness. So it's all about how you deliver greater value to your customers in everything you do using AI, and from that get a greater share of their overall IT spend, given the fact that there are these challenges with flat budgets.

Daniel Newman: Well, I'm actually glad that you said that. It's kind of interesting. The ability to really think of things in 3 and 4D is going to change everything, because most of the observability opportunities that have existed have required everything to be rows, tables, numbers. Everything's got to be very, very organized. The more we've gone to unstructured data, the more complex it gets, and the more I think that opens the door for new types of companies to provide meaningful insights too, because the enterprise is just absolutely littered with data that's historically been unusable. So the quicker you can provide a layer of usability, so that people, images, notes scribbled on whiteboards, so many great things that just get lost on a day-in and day-out basis and have historically never been able to be utilized, can now be ingested, optimized and then used to provide language and graphics back to the enterprise. One of the things that you said though is compete, and I think everybody's always interested in that. We both agreed, I think, Ash, that AWS re:Invent was just a tremendous event. High energy, busy halls, the partners area was just crazy.

Having said that, one of the things I've noticed, whether it's been multi-cloud over the last few years or whether it's databases, is this converging trend line: there is a bit of a feature, partner and core relationship, and over time the hyperscalers keep adding services. I'm hearing some things in what you're saying, but how do you, in your mind, Ash, remain ultra competitive? As you know, the hyperscalers will continue to make their databases more sophisticated and their generative AI tools more valuable. They'll always give you the option to point to Elastic, but they will also at some point probably compete. What's your way of maintaining differentiation and staying really competitive as new competitors emerge and partners become competitive?

Ash Kulkarni: That's a great question. If you think about the breakdown of a cloud hyperscaler's revenue, the biggest component of their business is, at the end of the day, all about the compute. So they want to make sure that they have workloads running on their infrastructure and they are taking a greater and greater share of an organization's IT spend; that's really where they make the greatest money. And their competition isn't the Elastics of the world or the many software players like us; it's each other. So they want to make sure that they are getting the largest share of the overall compute spend that they can get. Now, like you said, they always add more and more features, but at the end of the day, their focus is on making sure that they have a rich partner ecosystem so the workloads stay on their cloud.

What AWS or GCP or Microsoft would hate is for the customer to choose Elastic running on one of their competitors because they aren't providing that facility for Elastic to work closely with them and run very well on their platform. Because our customers know the technology that we've built, the capabilities that we've innovated on and created over the years; they see the value in it, they understand it. So that's where the partnership ethos really comes from. I've seen all three hyperscalers do a wonderful job at that. We have to invest in it as well from our end, and that's something that we've been hyper-focused on. Just to give you an example of our relationship with AWS and how much it has strengthened over the years: a few years ago we were a Silver sponsor at best, and I'm just referring to AWS re:Invent.

You were there; that's the user conference you were referring to. Last year we were a Platinum sponsor, and this year, at the re:Invent that just ended, we were a Diamond sponsor. That's the second-highest level of sponsorship. The booth presence was fantastic. We had this big, big presence, and we were given the opportunity because they see the value that we bring to the table and they see the advantage in partnering closely with us. And I think that's going to continue. And with AI, these hyperscalers are seeing an even greater opportunity and a need to partner with Elastic because they see the key role that we play. They see the strength in the vector search capability and everything else that we have built. We are really differentiating ourselves, and each of them sees how we can help them, and that's resulting in an even stronger partnership. I'm quite excited about that.

Daniel Newman: Yeah, I think that's a really good way to answer it, Ash. I'll draw a parallel to a lot of questions I've taken as an analyst from the silicon space over the last few months: the market really wants a winner-take-all. It's very provocative to say, "Hey, NVIDIA will be the only provider of GPUs forever." The bottom line is NVIDIA can keep growing at high double digits while AMD, Intel, AWS, Google and Microsoft can all develop silicon and software, because we're really in the earliest innings of the market for AI. I think there's something similar to be said about databases, vector search, generative AI and observability: it's not a winner-takes-all. It's not a zero-sum game. It's not that you're only going to use one tool, and I think you have a genuine market benefit in that you were early and often as the partner for many companies, but there will be new tools that will be used in parallel and in conjunction with yours. And that's not always going to be a displacement, but rather an augmentation, which is very common as companies' software and IT ecosystems grow. So if I'm hearing you right, it sounds to me like you genuinely believe you can grow side by side, and of course competition will rise, but the overall market size, market opportunity and the need for technology like what you offer here at Elastic will provide you the runway you need to continue to grow the company and meet the market's expectations.

Ash Kulkarni: Exactly. I see a massive, massive opportunity ahead of us. I see the opportunity for us to build a multi-billion dollar business, and it's all about delivering the kinds of capabilities that your customers need, looking ahead at what's the next horizon and making sure that you're ready for it. We are talking here about vector search. The whole momentum around large language models really broke out into the public imagination with ChatGPT. It hasn't been that long, it's been a little over a year, but we were investing in vector databases and vector search functionality years before that, because we saw what was happening in the foundation model space, all the research that was going on.

So we were tracking to that, seeing the value and adding to these innovations. And because of that, we were well prepared, and that's been our motto: we need to make sure that we are looking around the curve, looking around the bend, so that in our space of expertise we can have that asymmetric advantage. And then we are very open. We've always had that open mentality. So we'll partner with all the hyperscalers, we'll partner with others in this space, so we can make sure that we are able to grow ourselves but can also become and stay a critical part of the overall IT infrastructure as the space evolves, because it will evolve. Like you said, very early innings.

Daniel Newman: I love that, and I think that's a great place to end, Ash. It's early innings for generative AI. I think the market opportunity around broader AI, which includes, like you said, MLOps, AIOps, of course all the applications and then the generative stuff, sure does make for a great conversation. But we will see a lot of growth. We will have a trend line to watch. I think you and I will both agree: we hope budgets do grow, but if they don't, we'll always do our best to continue to deliver outsized results for our clients so that they'll spend an outsized proportion of their budget with us. But Ash, let's have you back soon. It's always great to talk to you about what's going on over at Elastic.

Ash Kulkarni: Likewise, Dan, thank you very much for having me.

Daniel Newman: Thanks so much for joining Making Markets.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances on CNBC and Bloomberg, in the Wall Street Journal, and across hundreds of other outlets around the world.

A 7x best-selling author, most recently of "Human/Machine," Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
