Elastic: Transforming Real-time Search Analytics From ESQL to GenAI for performant analytics – The Six Five On the Road at AWS re:Invent 2023

On this episode of The Six Five – On The Road, hosts Daniel Newman and Patrick Moorhead welcome Ken Exner, Chief Product Officer at Elastic, for a conversation on the company's recently unveiled Elasticsearch Query Language and how it fits in with enterprises' continued deployment of GenAI.

Their discussion covers:

  • The main challenges enterprises are encountering when interacting with data
  • Elastic’s newly unveiled Elasticsearch Query Language (ESQL), dedicated to simplifying data investigation
  • How Elasticsearch Query Language can be used as enterprises continue deploying GenAI
  • Elastic’s outlook on GenAI’s impact on search and data management in the coming two to five years

Learn more about Elastic’s Elasticsearch on the company’s website.

Be sure to subscribe to The Six Five Webcast, so you never miss an episode.

Watch the video here:

Or listen to the full audio here:

Disclaimer: The Six Five webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is on the road at AWS re:Invent 2023 in Las Vegas. We’ve had some great conversations with just so many players out there. And Dan, I mean, just when you thought we’ve had every conversation. We’ve had a few conversations on data, but not as many as we should, especially when it is the driver and the feeder into this generative AI conversation.

Daniel Newman: Yeah, look, the trend line has been visible this week. Clearly, the headlines are generative AI, AI, all things data that enable AI, infrastructure that makes AI possible, some networking to make sure data can move for AI purposes. The edge, of course, feeding AI, and we go so on and so forth. But yeah, it’s great to be here in The Six Five Media Lounge here at AWS re:Invent 2023. Look, I think the future is about getting the data to the compute to be able to give outcomes. And so talking to the people that are making that happen is the best thing for everyone out there that’s trying to figure out what’s going on.

Patrick Moorhead: It is, and it’s my pleasure to introduce Ken with Elastic, an absolute leader in data. Ken, welcome to The Six Five.

Ken Exner: Thank you very much. Glad to be here.

Daniel Newman: We appreciate you joining the show. You kind of heard my editorial to start the show. What I’m seeing.

Ken Exner: Yeah.

Patrick Moorhead: His feet wants to-

Daniel Newman: Are you seeing, I wanted to be the guest. Are you seeing AI as kind of a-

Ken Exner: Yeah. It’s funny, we’re at a cloud computing conference, but it kind of reminds me of what was happening in cloud computing about 15 years ago where pretty much every executive you talk to is trying to figure out their cloud computing strategy.

Patrick Moorhead: A hundred percent.

Ken Exner: Same thing is happening right now, with generative AI. So every executive I talk to, they’re trying to figure out what their generative AI strategy is going to be. They’re trying to prototype, trying to figure out how to experiment and figure out what are they going to do, how are they going to take advantage of this boon to productivity that is generative AI? So I’m seeing sort of a mirror of what was happening 15 years ago in cloud computing now happening for generative AI.

Patrick Moorhead: Isn’t it? I think it’s ironic too, that we keep talking about data. I mean, when I’m in college, I am learning garbage in, garbage out, on a DEC VAX, and it’s just like-

Ken Exner: Aging yourself.

Patrick Moorhead: It’s just like the themes continue.

Daniel Newman: He looks good for 77 though.

Patrick Moorhead: Yeah, exactly, and three grandkids. No, but here we are in 2023. Again, not having the exact same conversations, but we’re having this big data conversation again, now, still.

Ken Exner: Yeah. Well, there’s more data. There’s massive amounts of data. It used to be that you couldn’t get enough data out of your systems. Now I think people are kind of drowning in data. There’s logs for networking, logs for applications, there’s metrics, there’s traces, there’s just tons of data.

Patrick Moorhead: Well, we fractilized everything. We fractilized infrastructure from on-prem, one app on one server in your own data center to Colo, hybrid multi-cloud, Sovereign Clouds, to the Edge. We fractilized the Edge.

Ken Exner: You forgot microservices, managed services.

Patrick Moorhead: Applications have fractilized too, as well.

Ken Exner: That’s right. You kind of long for the days of nice mainframes where everything was just in one system?

Patrick Moorhead: Scale up 80, right?

Daniel Newman: Well, I think we’ll get back to that though, because don’t we always kind of do that? We have –

Patrick Moorhead: To accordion.

Daniel Newman: … flow, we got this, we’re going to do everything out. We’re going to do terminal services, and then we’re going to go back and put everything… Then we’re going to move everything.

Patrick Moorhead: Yeah. There was an announcement today from AWS on basically a terminal service. So there we go.

Ken Exner: Back to the future.

Daniel Newman: We’ll get back to that point. It’ll just be one giant computer, they’ll turn one of these hotels into just one computer that will feed the world. Just be full of GPUs and CPUs. And listen, I’d like to talk enterprise. The ground truth always comes from the CIOs and CTOs that are solving the problems. They’re trying to solve a productivity challenge. They’re trying to drive more revenue to their companies. They’re trying to make customers happier, but in order to get to this AI euphoria, they have to get their data right. They’ve got to get all their systems organized and such. So what are the challenges that you’re seeing at Elastic as you’re sort of working with these companies? What are the challenges with DevOps, with SREs, and of course with the executives to kind of get their ecosystem right?

Ken Exner: I’ll answer for generative AI first. The big thing that we’re seeing with generative AI is companies are trying to figure out how to take advantage of these technologies with their private data. So they have all this private data, they have these public LLMs, how are they going to work with these public LLMs? And that’s what they’re looking for. How do I use these LLMs with my private data without having to train or fine-tune them on it, which can be extremely expensive and require lots of specialized skills? How do they do that? So what we’re finding is companies are looking for a way to create a bridge to these LLMs, and that’s where Elastic has been coming in in the context of generative AI, helping create the bridge to these large language models, where anything that has been indexed and managed through Elasticsearch can suddenly be accessible to these large language models via a process called retrieval-augmented generation, or RAG. So we’re seeing a lot of companies who have already invested in Elasticsearch and already indexed their data for search suddenly able to take advantage of the power of generative AI without having to do much at all, because they’ve already indexed and managed that data.
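[Editor’s note: to make the retrieval-augmented generation pattern Ken describes a bit more concrete, below is a minimal Python sketch under stated assumptions. The index name (support-docs), the field names, and the call_llm placeholder are illustrative, not Elastic’s prescribed implementation; any LLM provider could sit behind that placeholder.]

```python
# Minimal RAG sketch: retrieve already-indexed private documents from
# Elasticsearch, then ground an LLM answer in them. Index and field names
# are hypothetical; call_llm() stands in for whichever LLM API you use.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # assumed cluster


def call_llm(prompt: str) -> str:
    """Placeholder: swap in your LLM provider's chat/completion call here."""
    raise NotImplementedError("wire up your LLM provider of choice")


def retrieve(question: str, k: int = 3) -> list[str]:
    """Pull the k most relevant passages already indexed for search."""
    resp = es.search(
        index="support-docs",                  # hypothetical index
        query={"match": {"body": question}},   # lexical retrieval; could be semantic
        size=k,
    )
    return [hit["_source"]["body"] for hit in resp["hits"]["hits"]]


def answer(question: str) -> str:
    """Ground the model in retrieved context instead of training on it."""
    context = "\n\n".join(retrieve(question))
    prompt = (
        "Answer using only the context below. If it is insufficient, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)
```

The point of the pattern is the one Ken makes: nothing is trained or fine-tuned; the model only sees the relevant slices of private data at question time.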

Patrick Moorhead: Are they just magically going to upload all the data they haven’t uploaded for 50 years? And how do we have our cake and eat it too here? And everybody’s trying to work through that. And you recently came to the table with a new query language, Elasticsearch Query Language, I know, to try to knock down some of these challenges, and I know it’s trite to say, hey, everybody needs a new language to work with as opposed to something different. But why did you create it? And what are some of the specific pain points that you’re solving for here?

Ken Exner: Yeah, so ESQL is a new query language and a new query engine. It’s built directly into Elasticsearch. And what we were trying to solve for with this was being able to use one query language for all the different types of data you have. So typically, if you are working across, say, metrics and logs, you often had to have different tools for doing that. So with ESQL, you can have one language, one query engine that works across all the different types of data you have in the different data silos. It is a piped query language, so it’s meant to work on unstructured data, but it also works on structured data. So you can use it for structured or unstructured data. And we’ve also borrowed a lot of the things that people love about structured data tools, like the ability to do joins and lookups and unions of data, the ability to do aggregations and math, the ability to create fields on the fly. So there’s a lot of new powerful capabilities in this that I think are going to allow operators and practitioners, DevOps practitioners, as well as security practitioners to be able to use one tool to pull different data sources and create new experiences on top of that.
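[Editor’s note: a small sketch of what the piped style Ken describes looks like in practice. The commands shown (FROM, WHERE, EVAL, STATS ... BY, SORT, LIMIT) follow Elastic’s published ES|QL command set, but the index pattern, field names, and the exact Python client call are illustrative assumptions, not a definitive implementation.]

```python
# One piped query that filters raw logs, computes a field on the fly (EVAL),
# and aggregates by service (STATS ... BY): the "one language across data
# types" idea. Index pattern and field names are hypothetical.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # assumed cluster

esql = """
FROM logs-*
| WHERE @timestamp > NOW() - 1 hour AND http.response.status_code >= 500
| EVAL latency_ms = event.duration / 1000000
| STATS errors = COUNT(*), p95_latency = PERCENTILE(latency_ms, 95) BY service.name
| SORT errors DESC
| LIMIT 10
"""

# ES|QL queries are served by the _query endpoint; the generic request method
# is used here because client helpers vary by client version.
resp = es.perform_request(
    "POST", "/_query",
    headers={"content-type": "application/json", "accept": "application/json"},
    body={"query": esql},
)
print(resp["columns"])  # column metadata
print(resp["values"])   # one row per service.name
```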

Daniel Newman: So, the company’s expanding its capabilities. Obviously it has a very rich heritage with Elasticsearch, it’s well known. And coming into the… You were sort of the platform that was really kind of built for this. I remember talking to Ash about this. You were sort of doing this, what was going to be needed, in the early era of gen AI. I was saying it’s really all about enterprise search. It’s all about the ability to really get through all that data, make it very well organized, searchable, getting at different structured and unstructured data. And you’ve been thinking about solving this problem at Elastic for a long time. And now as you’re moving into these new areas, you’re trying to democratize the AI capabilities, you’re building assistants. You’re talking a little bit more even about observability now, which is different. You’re the product head, so how are you guys kind of thinking about the evolution of the product? And then how does the evolution sort of fit in with this generative AI boom in terms of building the company’s total addressable market, its serviceable market?

Ken Exner: So the company began with Elasticsearch, which is the most popular search engine of all time. It’s been downloaded 4.2 billion times. It’s used by-

Daniel Newman: It’s powering search everywhere.

Ken Exner: Yeah, a hundred percent of the Fortune 500 use it in some way. You never get a hundred percent of anything. But so it’s becoming a ubiquitous part of the modern tech stack for application search and product search. But one of the things that happened a few years ago is people started using it for log analytics. They realized you could search through logs. And as people started building experiences on top of logs, we realized that we could expand into observability and we could expand into security. Security threat hunters are essentially looking through logs. They’re trying to find potential malicious activity in logs. And that took us into observability as we started to look at other areas of observability that we could go into. So we started acquiring companies to help us with profiling and tracing and other aspects of observability. And the same thing with security. Through acquisition or through development of our own tools, we expanded into endpoint protection and cloud security and identity analytics. So today we have three businesses, essentially. We have the core search business, which is focused on developers that want to add search through applications, the observability business, which is focused on sort of the SRE DevOps persona, and the security business, which is focused on tools for the modern day SOC.

Daniel Newman: So just a quick follow on there. I mean, it was an interesting pivot, because you were so successful in that one area and you kind of also had all the share, right? At some point you’d say, well, how do you grow? Obviously as data sets grow, data sizes grow, you’re growing Elasticsearch because people need more. But getting into that though, there are a lot of companies that have tried to get into the monitoring and logging. You must have felt you had something in the stack that you’ve built, in the technology that you built, that really made you unique to say, we’re going to enter this market, because it’s fairly crowded over there. There’s a lot of companies entering that space.

Ken Exner: We knew we had something special because customers were building their own solutions on top of it, and a lot of the other vendors were using us too to build on top of.

Daniel Newman: So the ones that are calling themselves observability?

Patrick Moorhead: Which, by the way, isn’t a terrible thing.

Ken Exner: No, which is fantastic. So we knew that search had this powerful application in these other spaces that the other vendors had recognized, the customers had recognized, and we realized that we could do something new because we controlled the data store and we could get more efficiency out of it. We could build things like ESQL, which would be a query language that could be sort of a powerful way of looking across the different signal types in observability. So there were some things that we could do that would add value both as sort of a foundational layer, but also as an experience for practitioners.

Patrick Moorhead: And does the fact that ESQL crosses those three business areas, is it an efficiency play? What kind of play is that? Because sometimes people might say, Hey, a technology that is being used for three different things, how can it be best for a certain area? And that’s typically the response of point product vendors. But what’s the value of having ESQL across the stack?

Ken Exner: There’s lots of commonality between observability and security, for example, where if you think about detection rules, you are going to create a detection rule that is going to run proactively in real time or reactively, it’s going to be based on a query. And that query, it’s going to pull from various different data sources and look for some pattern. The same thing happens in observability where you’re creating an alert or an alarm, which is what? It’s looking at a few different data sets, running a query and running that in real time to create an alert. So there’s lots of commonality in sort of the patterns that these systems are looking for. And if you can create a tool that allows you to do that efficiently across different data stores, it gives power to both sets of practitioners.
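[Editor’s note: a sketch of the commonality Ken is pointing at, under assumed index patterns, fields, and thresholds. A security detection rule and an observability alert are both “run a query on a schedule, fire if anything comes back.” The scheduler itself is omitted, and nothing here is Elastic’s actual rule engine.]

```python
# Two rules, same shape: a scheduled query whose returned rows mean "fire."
# Index patterns, field names, and thresholds are illustrative only.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # assumed cluster

SECURITY_DETECTION = """
FROM logs-auth-*
| WHERE event.action == "login_failure"
| STATS failures = COUNT(*) BY source.ip
| WHERE failures > 20
"""

OBSERVABILITY_ALERT = """
FROM logs-nginx-*
| WHERE http.response.status_code >= 500
| STATS errors = COUNT(*) BY host.name
| WHERE errors > 100
"""


def should_fire(esql: str) -> bool:
    """Run one query; any returned row means the rule should raise an alert."""
    resp = es.perform_request(
        "POST", "/_query",
        headers={"content-type": "application/json", "accept": "application/json"},
        body={"query": esql},
    )
    return len(resp["values"]) > 0

# A scheduler would call should_fire(SECURITY_DETECTION) and
# should_fire(OBSERVABILITY_ALERT) on whatever cadence each rule needs.
```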

Patrick Moorhead: Gotcha.

Daniel Newman: So what’s the future forward outlook like? Generative AI, everybody? It’s happening so fast, and I think Pat and I both really enjoy talking to people that are product builders, because there’s a combination of having to be a visionary, having to be hands-on, and then there’s obviously these two schools: we want to build everything our customers want, and at the same time, real innovators don’t always ask permission. As you’re thinking, you built a product that ultimately became the product. If you use search on pretty much any site on the planet, you’re sitting behind it. How do you see this sort of gen AI thing playing out, because it’s come on so fast?

Ken Exner: Well, we’re kind of operating at a couple of levels. One is sort of a foundational, primitive level, but we’re also using some of these capabilities ourselves in our solution. So in the observability and security solution, we use some of these foundational capabilities to deliver a generative AI-based experience to our customers. But at the foundational level, we’re trying to make it possible for companies to take advantage of generative AI in the applications that they’re building. So if they want to add semantic search to their website, we’ve made it easy for them to add semantic search, which is sort of natural language question answering. We’ve made it easy for them to connect to generative AI models. We’re making it easy for them to run some of their own models with Elasticsearch. But also on top of that, we’re allowing customers to have sort of an observability assistant experience and a security assistant experience using the Elastic AI Assistant, which helps people with detection, diagnosis, and remediation of issues. And this is, I think, an area that’s going to get transformed a lot over the next few years, both security and observability. There’s a couple of things that are good conditions for disruption here. One is there’s a lot of pattern matching that happens in both observability and security. These practitioners are looking at data and trying to do some pattern matching to find a vulnerability or to find the root cause of an issue.

There’s also a lot of specialized skills that are required. So these practitioners typically have a lot of experience working with types of errors, and they know things because of the experience that they have. So I think those two conditions make both observability and security ripe for a little bit of disruption from generative AI. And I think it’s going to happen in a few different ways. One is, you’re starting to see it already with detection, automated detection. The whole AI ops idea was to automate detection through anomaly detection, automated detection rules, but it’s going next to diagnosis or getting to the root cause of something. So using pattern matching to figure out what is the root cause of an operational issue or what is the root cause of a potential security breach? And then finally getting to remediation. How do you remediate something more easily? And these are the areas that we’re playing in right now, trying to help customers automatically remediate issues by showing them what a potential playbook or runbook would be to solve something.
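[Editor’s note: the “semantic search” step Ken mentions above is, at the retrieval layer, typically a vector (kNN) query over embeddings rather than a keyword match. A minimal sketch follows; the index, the dense_vector field name, and the embed placeholder are assumptions, and Elastic also offers managed inference options not shown here.]

```python
# Minimal semantic-search sketch: embed the question, then run a kNN query
# against a dense_vector field. Field and index names are hypothetical.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # assumed cluster


def embed(text: str) -> list[float]:
    """Placeholder: call whatever embedding model you host or subscribe to."""
    raise NotImplementedError("use your embedding model of choice")


def semantic_search(question: str, k: int = 5) -> list[dict]:
    resp = es.search(
        index="support-docs",            # hypothetical index
        knn={
            "field": "body_embedding",   # assumed dense_vector mapping
            "query_vector": embed(question),
            "k": k,
            "num_candidates": 50,
        },
    )
    return resp["hits"]["hits"]
```

The same hits can then feed the RAG prompt from the earlier sketch, which is how natural language question answering ends up grounded in your own documents.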

Patrick Moorhead: Ken, final question here. You don’t lack awareness, because you have so many customers and there are so many downloads out there, but what do Elastic customers need to be doing more of now to get them ready for this future in two to five years that maybe they’re not thinking of, or maybe they’re not moving quickly enough on, where you’re seeing the rabbits moving more quickly?

Ken Exner: My best recommendation is if you’re not experimenting with generative AI now, you’re already late. So first and foremost, start experimenting. Start figuring out how to take some of the things that you’ve been trying to automate, like customer service or marketing automation, and start looking at potential uses of generative AI. If you’re working with Elastic already, we are already working with your data, we’ve already indexed your data, and you can automatically connect to a large language model to start building generative AI applications.
The other thing is, while you should experiment, move fast, break things, remember that you want to keep your private data private and you want to make sure you’re respecting the permissions of what you’ve done. You don’t want to just copy and paste things into a ChatGPT and be done with it. You need to think about what permission and security structure you want to maintain. And if you’ve been using Elastic and Elasticsearch, you’ve already been indexing and using our permissions and security models, you’ve already been using RBAC or document-based permissions, and all of that gets preserved as you move towards generative AI as well.
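[Editor’s note: a sketch of the point Ken makes about permissions carrying over into generative AI. In a real deployment, Elastic’s role-based and document-level security enforce this at the cluster; the explicit filter below is a simplified stand-in with hypothetical index and field names, meant only to show that the LLM never sees documents the requesting user could not see.]

```python
# Permission-aware retrieval for RAG: only documents the requesting user is
# allowed to read are retrieved, so only those can reach the LLM prompt.
# Index name and the allowed_groups field are illustrative assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", api_key="...")  # assumed cluster


def retrieve_for_user(question: str, user_groups: list[str], k: int = 3) -> list[str]:
    resp = es.search(
        index="internal-docs",  # hypothetical index
        query={
            "bool": {
                "must": [{"match": {"body": question}}],
                "filter": [{"terms": {"allowed_groups": user_groups}}],
            }
        },
        size=k,
    )
    return [hit["_source"]["body"] for hit in resp["hits"]["hits"]]

# Everything downstream (prompt construction, the LLM call) only ever sees
# this permission-filtered context.
```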

Patrick Moorhead: Appreciate that.

Daniel Newman: It sounds like you’ve made a big commitment to taking the kind of core architecture and then expanding it to these new trusted solutions and tools that people A) really need and B) really benefit from. And of course, it’s a great way to diversify the portfolio, expand the revenue base, and be there to support customers. Because all this generative AI also means this huge, massive influx of data. It also means just structural changes to how companies’ IT environments look, feel, and operate.

Ken Exner: Yeah. And also just making it sort of push-button simple. If you want to go from lexical search to semantic search, that should be easy. If you want to suddenly go to generative AI, that should be easy. If you’ve already done the work of structuring the permissions and document-based permissions on the data you’ve already indexed with Elasticsearch, you should be able to do those things, and it should be easy. And we want to make it easy for you.

Daniel Newman: Well, it’s going to be great to watch the continued evolution. We’re already seeing search as we sort of know it changing radically because of generative AI. And I’m sure that underneath it, it’s the type of technology that you built to enable search that’s going to enable generative AI to work well, which is still the step that most of us need: not just to have it work, but it needs to be right. It needs to be safe. The data needs to stay private. These are things-

Ken Exner: It needs to be private. You want the most relevant answers. You want your LLMs to be grounded in truth, in the context of what you want to pass to them.

Patrick Moorhead: And architecturally thinking about the future. So again, we all can’t predict the future with a hundred percent precision, but I think people should sleep better at night knowing that you’ve thought about the next two to five years. So this isn’t some point type of solution that they’re signing up to. So, appreciate your time.

Ken Exner: Thank you.

Daniel Newman: Thanks a lot, Ken. All right, everybody hit that subscribe button. Join us for all of these Six Five On the Road episodes here at AWS re:Invent 2023. We’re here in The Six Five Media Lounge. Thanks for joining us today. We’ll see you all really soon.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and Former Graduate Adjunct Faculty, Daniel is an Austin Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
