Top AI Trends for 2024 | The AI Moment, Episode 7

On this episode of The AI Moment, Mark Beccue discusses the latest developments in enterprise AI and the top seven AI trends for 2024.

The discussion covers:

  • Top AI Trends for 2024: After the year AI had in 2023, what could possibly be next? Here is a hint: AI won’t slow down much in 2024. There are seven key trends I think will impact the adoption of AI in 2024. I walk through what those trends are and why they are so important.

Watch the video below, and be sure to subscribe to our YouTube channel so you never miss an episode.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this webcast.

Analysis and opinions expressed herein are specific to the analyst individually, based on data and other information that may have been provided for validation, and are not those of The Futurum Group as a whole.


Mark Beccue: Hello everyone, and welcome to episode 7 of The AI Moment. Today’s title, I hope you’ll be interested, is Top AI Trends for 2024. I’m Mark Beccue, Research Director for The Futurum Group. Welcome to The AI Moment, our weekly podcast that explores the latest developments in enterprise AI. It’s been a year, hasn’t it? I keep saying this: it’s almost impossible to comprehend the pace at which AI is morphing as a technology and as an application. I’ve been covering AI since 2016, and I’ve never seen anything like this. But here we are. Our goal with this podcast is really to distill information, try to separate the real from the hype, and provide you with some analysis around where the AI market will go. Typically, we deep dive into the latest trends and technologies, and that covers everything from advancements in the technology to parsing the vendor landscape and talking about big announcements. We also look at regulations from time to time, ethics, risk management, and a lot more. So we cover a lot.

Today, I have one segment, and it’s a look through my list of what I think are the top trends for AI in 2024. It was hard to get this down to a few, and I had to think about distillation: What is a bigger theme? What’s a trend? What’s just of the moment, let’s call it a fad? So I thought about it from that perspective.

The first trend we’re going to talk about is the LLM market. I believe the LLM market, meaning the valuation of companies that are offering LLMs, is going to shift down. What I think is happening is that we’re seeing a commoditization process, and the emergence of open-source models is forcing pricing competition. Note that just about a week ago, Anthropic announced that it was dropping the price for its Claude 2 LLM product. That is directly due to competition, mostly, I think, from these open-source models, which are cheaper or free, though there’s some back and forth about that, whether you’re paying for tooling or in-app pricing, let’s call it. But that is happening, and I believe we’re going to get more models, and they’re going to get more specialized, narrower, smaller, and smarter. So this idea where we see unicorns based on just a model, I won’t name them, but there are a couple of companies out there that carry these big valuations. I think that’s going to start to shift down, because competition is going to mean there are a lot of different options, and nobody’s model is going to be worth much more than somebody else’s. I don’t think there’s a way to win that. That’s trend number one: LLM market valuations will shift down.

Number two, the AI chip market is going to stay hot. We’ve talked about this a little before: when we walked into the generative AI moment, there weren’t really chips that were purpose-built for AI. In other words, there weren’t compute chips built purposely for AI; there were GPUs. GPUs were used quite a bit for AI compute because of their ability to process in parallel, but those chips were built for other things, like gaming graphics and autonomous vehicles. It just so happened that they work pretty well for AI. So you’ve had all these different companies, including NVIDIA, thinking about how to build chips that are purposefully built for AI, chips that will make this processing more efficient, cheaper, faster, better, all of those things. And that market is fairly open. Chip design and product development take a while, so if you haven’t been thinking about it for a while, it might be a bit before we see new entrants, but I believe the market’s been working really hard at this. Players to watch include NVIDIA and AMD, smaller ones like Groq, and other companies that build chips, like Qualcomm. And I’ll mention Qualcomm for a moment here and say that, in terms of this AI chip development, two companies really had some interesting leverage. NVIDIA was building GPUs for different purposes and was able to leverage that into the AI business.

Qualcomm was very focused on mobile chips, and those chips had multiple units on them: CPUs and GPUs for different purposes, helping with on-device processing. So again, Qualcomm had a head start because it had been building chips with GPUs in them for mobile for some time. It has been able to leverage that knowledge and head start, and it is making significant progress in enabling on-device AI, which we’ll talk about in a minute. So the AI chip market stays hot. We’re going to see a lot of new progress there, and we’re going to see performance capabilities going up. The chips serve everything from data centers to on-device, and all of that’s going to make for a big year. There’s money to be made here, and I think that across the board, we’re going to see these chip manufacturers get a lot of attention. That’s number two.

The third trend is that I believe there will be no real dominant AI cloud player. The competitive battle for AI workloads and the AI stack business is just that: competitive. When I say that, I’m talking about the major hyperscalers: Microsoft, AWS, Google Cloud, and IBM. All of them have very compelling product stacks up and down the piece where you’re doing AI compute; you have development platforms, you have applications, and all of the different tools that you need are there. All of them have competitive options, and I think they are working to push each other. They try to one-up each other, but everybody’s catching up quickly, so there’s no real big competitive advantage there, I don’t think.

And at the end of the day, when companies are looking at AI compute, AI workloads, and AI stacks, there are a lot of different decisions they’re making about that. There are bigger decisions about their overall general compute that have to come into play. So the moats around these players, and what they’ve built as far as their business, come into play; it’s not just an AI decision. The reason I say no one is going to be dominant is that I think there’s a lot of room in here for everyone to compete. And I think it’ll be less about taking away business and more about growing the pie with their current customers, and maybe looking at enterprises that are new to the AI cloud play. In that competitive space, where we’re really going to see movement is better penetration. Per customer, these cloud players are going to see higher revenues, customers are going to pay them more to do these things, and they may start to pick up enterprise players that are new to the cloud compute space. So that’s number three.

Number four is on-device AI. I mentioned this a little earlier when we talked about Qualcomm, and we’ve done several webcasts about this, but I think, again, the global installed base of PCs and smartphones is just an opportunity that is too hard to ignore. There are some challenges to implementing AI on these devices because of the limited compute power. Those workloads are difficult, and the question of which use cases require AI online versus offline is still getting worked through. And there are certain players that are very focused on this agenda: Qualcomm, Dell Technologies, Apple. There’s just so much that could happen here. What will actually promote it has to do with how the models are morphing, which we talked about a little upfront. The models we’ve seen are getting smaller, like I mentioned, and narrower. Just this week, more came out. Google talked about Gemini Nano, which is less than 3 billion parameters, and they have a version that’s less than 1 billion parameters. There’s Microsoft’s Phi, there’s Mistral. All of these are around 7 billion parameters or smaller and seem to be ideal for enabling on-device AI. So let’s watch where that goes. It’s going to be a hot year, I think, for on-device AI.

Next one: on-prem AI emerges. As the generative AI era began, all the discussion was around the hyperscalers and what they were offering, the big LLMs, and all these things. But at the end of the day, many enterprises are building their own AI tech stack on-premises or in a private cloud. There are economics that drive that, along with certain other drivers. Protection of data that they don’t want to put into the cloud is one piece. There are other economics driving it, and the economics are starting to shift to potentially favor on-prem. Some of the arguments against on-prem would be that you have to have a lot of expertise across the stack, and that there might be short-term cost savings but, in the long term, running these things through a public cloud offers scalability and economies of scale. It goes back and forth, but there clearly is momentum around this. And there is some interesting vision coming from certain players who are thinking about the agenda in a large way: Dell Technologies, HPE, Oracle, and others like Teradata and Pure Storage that are helping enterprises think about on-prem. So just think of it this way: we’re going to hear more about how enterprises have built their own AI stack and are running these things outside of the bigger, more popular approaches, whether that’s something that AWS or Google or IBM or Microsoft will handle for them. I think we’re going to see that.

The last two now. This one, I think, is really key and is a bit misunderstood: data management is going to become a paramount focus. There’s a very popular saying that data is the new oil, and we tend to let that get lost in the shuffle. Big data, how companies manage their data, and how they access their data is still a mess. There are proprietary issues, there are silos, and there are questions of how to pull data out, how to use it, and what kind of data is readable. There’s structured data and unstructured data. It’s nuts; there’s a lot going on here. At the end of the day, what really makes a difference in AI, particularly generative AI, is not going to be the fancy models, because models will commoditize. We’re going to commoditize on all of these kinds of tools; a model is not going to be a differentiator. It’s going to be how you use your proprietary data to build better insights and get a competitive advantage, because the differentiator enterprises have is their data, the data that they own. So that’s really the issue. And what we have going on is this idea that enterprises need help, let’s call it overall data management assistance. There are a lot of players focused on it, everywhere from Databricks and MongoDB to SingleStore, AWS, Google Cloud, Microsoft, IBM, Snowflake, HPE, Dell Technologies, Elastic, Oracle, and Salesforce, which is talking about data management. So maybe the big battle, the real competitive marketplace for generative AI over the next year, is going to be around how you leverage, get at, access, and use your proprietary data.

All right, the last trend I’m going to talk about has to do with the end results of some of these AI initiatives. I think content generation will be the most contentious play, with impacts for marketing and advertising. Let me talk about this for a second. Right out of the gate, when ChatGPT came out, everyone, broadly speaking, talked about content generation, whether that was images or text, and the idea was, well, we can be quicker to market and produce more marketing and advertising content. And the way that LLMs work and generate text is contentious, I think. There’s going to be continued resistance in the marketplace to content generation for a couple of different reasons. One is copyright and IP issues, which are going to continue to haunt image, video, and all the other types of content generation on that end. The use cases will have to evolve, and providers are going to have to offer indemnification or some way to establish who’s responsible. But I’ll go a little further and talk about text generation for content. I just think it could have a chilling effect, because the outputs of AI, particularly around text content generation, need to be thought about in terms of quality and imagination. It’s difficult for me to see why we need more marketing and advertising text content. We need better and more targeted marketing and advertising content, not more of it. And there’s a lot of resistance in general conversations around whether we are unleashing just more dumb and bad marketing content. That needs to be addressed and thought about, because there are implications for the companies that do it. Spray and pray is not a good idea; I just don’t think it is.
So I don’t think that’s a good use of generative AI, and I think this is going to be a place where people need to think, and we’re going to see some pushback in the marketplace around those kinds of issues.

All right, so there you go. In summation, here are your trends for 2024. One, LLM market valuations shift down. Two, the AI chip market stays hot. Three, no real dominant AI cloud player winner, which means a competitive battle for AI workloads and the AI stack business. Four, on-device AI emerges. Five, on-prem AI emerges. Six, I can’t keep count, data management becomes a paramount focus. And finally, content generation will be the most contentious play, with impacts for marketing and advertising. So those are my trends for 2024. Let’s see how we do; we can check back at the end of next year. That’s it for The AI Moment this week. Thanks for joining me. Be sure to subscribe, rate, and review the podcast on your preferred platform, and we’ll see you next time.

Other insights from The Futurum Group:

The AI Moment, episode 6 – On Device AI part 2

The AI Moment, episode 5 – AI Chip Trends, RAG vs. Fine-Tuning, AI2

The AI Moment, episode 4 – On Device AI part 1

Author Information

Mark comes to The Futurum Group from Omdia’s Artificial Intelligence practice, where his focus was on natural language and AI use cases.

Previously, Mark worked as a consultant and analyst providing custom and syndicated qualitative market analysis with an emphasis on mobile technology and identifying trends and opportunities for companies like Syniverse and ABI Research. He has been cited by international media outlets including CNBC, The Wall Street Journal, Bloomberg Businessweek, and CNET. Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting technology business and holds a Bachelor of Science from the University of Florida.

