In this episode of Enterprising Insights, The Futurum Group Enterprise Applications Research Director Keith Kirkpatrick is joined by Mark Beccue, Research Director, AI, with The Futurum Group, to discuss trends in Software-as-a-Service pricing, focusing specifically on enterprise applications that have incorporated, or are in the process of incorporating, some type of analytics-based, predictive, or generative AI. They will cover common pricing models and approaches in use today, discuss the impact AI has on software delivery economics, and assess how advances in the training of AI models may change the calculus around AI pricing.
Finally, the pair will close out the show with the “Rant or Rave” segment, where Kirkpatrick picks one item in the market, and asks Beccue to either champion or criticize it.
Disclaimer: The Enterprising Insights webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.
Transcript:
Keith Kirkpatrick: Hello, everybody. I’m Keith Kirkpatrick, research director with The Futurum Group, and I’d like to welcome you to Enterprising Insights. It’s our weekly podcast that explores the latest developments in the enterprise software market, and the technologies that underpin those platforms, applications, and tools. So this week, we’re going to take a deep dive into SaaS pricing, and one of the key drivers of pricing over the next several years, which is the integration of artificial intelligence into those platforms. And then we’re going to, of course, close out our show with our Rant or Rave segment. We’re going to pick one item in the market, and either champion it or criticize it. So I’d like to welcome back Mark Beccue. He is a Research Director for artificial intelligence at The Futurum Group. He has more than a decade of experience covering artificial intelligence and some of the underlying technology approaches. And he also has the distinction of being my first cohost for Enterprising Insights. So welcome back, Mark.
Mark Beccue: Hi, Keith. How are you doing?
Keith Kirkpatrick: I’m doing well. And you?
Mark Beccue: Very good.
Keith Kirkpatrick: Good. Well, why don’t we just get right back into it here, taking a look at the various pricing models that are being used for SaaS platforms. So as we’re both familiar, there are a few different approaches being used by vendors in the market. You have flat rate pricing, where it’s a set amount for a product or service over a given time period. Essentially, your subscription model. You know, it seems like the vendors really like this because it’s a predictable flow of revenue. And if you’d like, you can actually segment your customers by different tiers. So if you have three specific tiers, you can break those out based upon specific target groups, or just by increasing the number of functions in each one. Some easy examples are Dropbox, they use that sort of a model. Microsoft Office 365 does that, MailChimp, that sort of thing. And of course, going along with that, we also have feature-based pricing. So again, you’re dividing up the offering into different tiers based on feature sets. And here, if you want to target specific segments that value some feature set over another, you can segment those users, and then of course, create a targeted revenue plan. So if you had some users that are very, very focused on, let’s say, access to a large amount of data storage, you might create a plan where the pricing is advantageous to the vendor, while giving benefits to that user. And then of course, you could have another tier where it’s about specific features. So all of these different approaches are really designed to create benefits for both the vendor, as well as for the end user.
A couple of others that are common are user-based pricing, which is commonly referred to as seat licenses. And then of course, the one that I’d really like to dive into is this pay-as-you-go or usage model, where customers are charged based upon their usage of the software. And again, that could be based on the number of transactions, the amount of resources used, or really any other kind of metric that’s relevant to that software. So, I really want to dive into this, Mark, in terms of how is AI starting to really impact the way that vendors are thinking about pricing? And certainly, we’ve seen some vendors that have come right out and said, “We’re going to incorporate generative AI technology, and we’re going to charge a premium on top of whatever pricing model we already have.”
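To put rough numbers on the tiered and usage-based models described above, here is a minimal Python sketch of how such a bill might be computed. The tier names, included allowances, and per-call rate are hypothetical placeholders, not pricing from any vendor mentioned in this conversation.

```python
# Hypothetical sketch of two models discussed above: a flat tiered
# subscription plus a usage-based ("pay-as-you-go") meter.
# Tier names, allowances, and rates are invented for illustration only.

TIERS = {
    "basic":      {"monthly_fee": 10.00, "included_calls": 1_000},
    "pro":        {"monthly_fee": 30.00, "included_calls": 10_000},
    "enterprise": {"monthly_fee": 90.00, "included_calls": 100_000},
}

OVERAGE_RATE = 0.002  # hypothetical price per call beyond the included allowance


def monthly_bill(tier: str, calls_used: int) -> float:
    """Flat subscription fee plus metered overage charges."""
    plan = TIERS[tier]
    overage_calls = max(0, calls_used - plan["included_calls"])
    return plan["monthly_fee"] + overage_calls * OVERAGE_RATE


if __name__ == "__main__":
    # A "pro" customer who stays inside the allowance pays only the flat fee;
    # heavy usage turns the same plan into a consumption-driven bill.
    print(monthly_bill("pro", 8_000))    # 30.00
    print(monthly_bill("pro", 60_000))   # 30.00 + 50_000 * 0.002 = 130.00
```

A pure flat-rate subscription, tiered feature pricing, and pure pay-as-you-go all fall out of the same structure depending on which of these inputs actually vary.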
Mark Beccue: Yeah. It’s easy to say that it’s early days, right?
Keith Kirkpatrick: Yeah.
Mark Beccue: We’re really in an experimentation phase, I think. You’ve mentioned before that you have cost drivers that you have to account for, obviously. And I just think there’s a lot of experimentation. They don’t know. I really don’t think they know.
Keith Kirkpatrick: Right.
Mark Beccue: But they’re trying to balance what that cost might be. You know, you’re looking for a profitable piece. Some of them might be doing more of a baked-in approach, I would guess. We could talk about this as you go through the questions. If it’s something that’s, I want to say, a feature of a product and it’s embedded, why would you charge more for that, right?
Keith Kirkpatrick: Right.
Mark Beccue: It’s really more of a competitive advantage to the SaaS player, and I think there’s a lot of that already. You and I have talked about that before. There are a lot of players that aren’t charging for this. They’re thinking it’s a competitive advantage.
Keith Kirkpatrick: Right, right. But, Mark, why don’t we take a step back here, because you mentioned something pretty interesting, which is that there is obviously a cost component, particularly when we’re thinking about things like generative AI. What goes into that, in terms of why there is even a cost associated with generative AI?
Mark Beccue: Yeah. That’s been the white elephant in the room, hasn’t it? The big elephant in the room has been … Or, however you say that. Gorilla, elephant.
Keith Kirkpatrick: Yeah.
Mark Beccue: Big thing. What we don’t really know is what the cost of AI compute associated with these products is, and how you parse it out. I don’t think the market was served very well by the free models that we saw early on from ChatGPT, and those kinds of things, where you’re not necessarily reflecting the cost because it’s free to do something. The compute loads are massive. I’ve talked, again, about whether it’s training, which is the one that’s really eye-popping for how much compute it takes. But really, inference, which is the everyday AI run workload, is not small. It depends on what goes into it, but all of these SaaS players are having to look at their cost basis, and I think the biggest piece is the compute load at this point.
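As a rough illustration of the training-versus-inference distinction Beccue draws here, the sketch below compares the two cost buckets. Every figure in it is an assumed placeholder for illustration, not a measured or vendor-reported number.

```python
# Hypothetical back-of-envelope comparison of the two compute cost buckets
# discussed above: occasional training runs versus always-on inference.
# Every number here is an assumption for illustration, not a measured figure.

TRAINING_RUN_COST = 5_000_000.00    # assumed cost of one large training/retraining run
TRAINING_RUNS_PER_YEAR = 2          # models are retrained occasionally, not constantly

COST_PER_1K_TOKENS = 0.002          # assumed blended inference cost per 1,000 tokens
TOKENS_PER_REQUEST = 1_500          # prompt plus completion for a typical assistant call
REQUESTS_PER_USER_PER_DAY = 20
USERS = 100_000


def annual_training_cost() -> float:
    """Occasional, eye-popping line item."""
    return TRAINING_RUN_COST * TRAINING_RUNS_PER_YEAR


def annual_inference_cost() -> float:
    """Day-to-day cost that scales with every request and every user."""
    daily_tokens = USERS * REQUESTS_PER_USER_PER_DAY * TOKENS_PER_REQUEST
    return daily_tokens / 1_000 * COST_PER_1K_TOKENS * 365


if __name__ == "__main__":
    print(f"training:  ${annual_training_cost():,.0f} per year")   # $10,000,000
    print(f"inference: ${annual_inference_cost():,.0f} per year")  # ~$2,190,000
    # With these made-up inputs, training is the larger single line item, but
    # inference is the recurring cost that grows with every request and every
    # new user, which is the piece a usage- or seat-based price has to cover.
```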
Keith Kirkpatrick: Really? Okay. I have to imagine, though, when we’re talking about AI, and in particular generative AI, isn’t there a cost associated with the training portion? Particularly when you’re going and you’re looking at it and saying, “I’m not just going to use this OpenAI model,” and you’re actually tuning your model, or grounding it in a specific corpus of data, I assume that there are some costs there. Particularly when we get into very, very specific types of functions, as opposed to just using OpenAI to type in a prompt about art, or something like that.
Mark Beccue: Absolutely. That’s what I meant: the training part doesn’t happen all the time. You typically are not going to train a model constantly, right? That’s every once in a while.
Keith Kirkpatrick: Right.
Mark Beccue: You think about the cycles that ChatGPT goes through, because we know that a cycle is, “Well, here’s GPT-5,” or 4, or 3.
Keith Kirkpatrick: Right.
Mark Beccue: That’s when you’re regenerating things. So it’s not all the time, right? The one that’s really the day-to-day load is really the inference piece.
Keith Kirkpatrick: Okay.
Mark Beccue: Yeah. There’s certainly … The scary part about these models is where do we go. But I will say this: this will impact the SaaS players, absolutely. There has been … You’ve seen how it’s mutated.
Keith Kirkpatrick: Yeah.
Mark Beccue: How there are efficiencies. There’s a lot of what I call hacks and things, where you’re really getting these models to run more efficiently. Whether it’s a hack to do that, fine-tuning is a way, RAG is a way, smaller models are a way, so you’re not running such big loads. So, there’s that and more.
Keith Kirkpatrick: Yeah.
Mark Beccue: And then, within those data centers, I think we’re getting more purpose-driven AI chips coming.
Keith Kirkpatrick: Yeah.
Mark Beccue: Instead of just GPUs, they’re thinking about how to run those things more efficiently. So everybody’s pushing towards driving costs down for compute.
Keith Kirkpatrick: Right. Well, digging into that a little bit, it sounds like the idea is to figure out a way to optimize it. There’s no reason to use a large language model for some very specific use cases. And I would assume that both the actual AI providers as well as the SaaS providers are going to be looking at ways to really make sure that they deploy generative AI in the most efficient way possible, because, to your second point, it’s really the inference, the number of hits every time you go out to try to do something, that’s going to run up costs.
Mark Beccue: Yeah, for sure. I think the one that gets a little tricky is, you talk about use cases. There are lots of really very pragmatic use cases for the SaaS players. The ones that are the heaviest load are really those assistants.
Keith Kirkpatrick: Yeah.
Mark Beccue: So when you’re doing that, they’ve got to retrain, and the inference is heavy. So I’m wondering about how far we go, what happens with those for SaaS players. And those are, actually, to your point, I think the ones that they’ll end up charging for, and we’ve already seen that. And that makes sense because it’s more of a high-cost run.
Keith Kirkpatrick: Right, because I think if you look at, whether we’re talking about what Microsoft is doing with Copilot, or ServiceNow, any of these vendors. Now, of course, Amazon has gotten into the game with Q. It seems like there’s a real potential for a lot of usage, because every time you want to summarize a conversation, every time you’re looking for a prediction or a suggestion, you’re essentially looking at real dollars and cents there.
Mark Beccue: Yeah. To me … You and I have talked a lot about these leaders in the SaaS space, and I look to Salesforce as an interesting example of … The Einstein assistants are more purpose-built.
Keith Kirkpatrick: Yeah.
Mark Beccue: So they’re not general purpose, they’re really more specific.
Keith Kirkpatrick: Yeah.
Mark Beccue: And that makes sense to me. I think that that’s easier for the SaaS player to say, “This is what this does, here’s what you can do with it, and we’re going to charge you this for that.”
Keith Kirkpatrick: Right.
Mark Beccue: When it’s a Q or those kinds of things, and we’ve seen what … I think Microsoft has admitted that the $30 charge is a test. And by the way, that’s on enterprise only. There’s no … They haven’t figured out what they’re going to do with consumer yet. But like we said up front, is that going to stick? We’re not sure.
Keith Kirkpatrick: Right. Well, it’s interesting. Both you and I cover Adobe fairly extensively. The interesting thing there is if you look at what they’re doing with generative AI, particularly within their Creative Cloud. If you were to, say, use generative AI to basically type in a text prompt to create an image, that’s an interesting use case there because you could wind up going down a rabbit hole. And I know that, in talking with them, they’ve taken the approach of being like a candy vendor with kids. “Let’s give out a little taste, let’s get you hooked on it.” And then, if you’re going back to that candy store every day, 18 times a day, they’ll have to figure out a way to charge for it. But it seems like what they’re trying to do, and some of these other vendors as well, as you mentioned on the consumer side, is saying, “Hey, let’s get people using this, demonstrate the benefits, and then we’ll worry about actually matching that pricing to actual costs.”
Mark Beccue: Right. In-app purchase, right?
Keith Kirkpatrick: Yeah, exactly. Exactly, exactly.
Mark Beccue: Some of this gets tricky to me, because there’s so much more to this than pricing. I think of Adobe, and you and I both agree that they’re just one of the better AI thinkers, AI innovators, period.
Keith Kirkpatrick: Yeah.
Mark Beccue: If you take it back all the way, they found a very compelling use case, let’s say for Firefly. That makes so much sense to streamline workloads for creatives, marketers, whoever it is that’s using those kinds of tools, and they really thought it through well. When it comes to the pricing part, it’s like, “Well, they want to try it,” but I think what’s driving that business … They’ve told us, both of us, that this is really showing a lot of pickup and some lift, and that it’s because it’s such a compelling use case and the value is there.
Keith Kirkpatrick: Right.
Mark Beccue: I think they’re going to get … It really depends on how valuable it is. At the end of the day it’s like, “Well, there’s price, but is it worth it?”
Keith Kirkpatrick: Right.
Mark Beccue: “It looks like, in your case, it would be.”
Keith Kirkpatrick: Right. But if you contrast that with, let’s go back to those assistants again, which are very, very intensive in terms of every time you get a recommendation, boom. I guess we’ve talked a little bit about, potentially, some vendors going to that consumption model. Every time that the model is hit, it costs something. I think the jury’s still out on how effective that really is, based on the expertise level of the worker that you’re assisting. If you’re brand new to a job, of course these things are going to help you, because you don’t have those months, or years, or whatever of experience. If you’ve been in a particular role or job for a long time, my sense is that you’re going to know a lot of that stuff. Will it save a little bit of time to do a summarization? Maybe, maybe not.
Mark Beccue: Right.
Keith Kirkpatrick: I think the jury’s still out there.
Mark Beccue: For sure. It’s just because we don’t know how valuable these services will be.
Keith Kirkpatrick: Right.
Mark Beccue: But, if you go back and look historically at what happens with pricing, you have entry-level thinking, then you have this premium price for a little while.
Keith Kirkpatrick: Right.
Mark Beccue: And then, it settles down because there’s scale of all types. There’s competition, which drives down the price. With this stuff, I think we’re going to see the same idea. It doesn’t change it any, it’s not any different. When these vendors are able to get scale, price comes down. So does that mean they change the model? Probably.
Keith Kirkpatrick: Right. I think we can see that happening not only with technology, or software, or anything like that; you look at any kind of technology, you look at the cars that you buy today. There are all of these features that, when they were initially introduced, you had to pay extra for. Now, it’s like ABS brakes, every car has them, you don’t even think about it until you’re in a puddle and wind up careening into a barrier. So, I guess to wrap up this section, the big question is how far away are we from achieving that scale where generative AI is not something that we’re chatting about endlessly here, it’s just part of a SaaS platform and it’s baked in?
Mark Beccue: Yeah, I think it’s a couple of years. But I’m going to qualify that because, I think you and I have again discussed, the pace of innovation in the space, and the mutation has been … I’ve never seen anything even close. For example, how quickly those models have morphed to this idea, it’s mind-numbing. It’s head spinning. You could put any kind of thing in there. But it takes time. I think that the pace of innovation is so good across the ecosystem that there’s a good chance these costs are going to come down pretty quickly. Probably, the toughest part has to do with the chips, because those development cycles are so long, right? There’s been an awful lot of investment by the bigger players, from AMD, to Intel, to NVIDIA themselves. They just put out a new chip that’s even more efficient than the last ones they were doing, even though they’re not purpose-built for AI, those GPUs. Anton at Qualcomm, that’s a really interesting thing. Another little monkey wrench in all this is if the SaaS players start to think about on-device AI, so let’s say it runs a different kind of model, much smaller, offline. Well, that changes the game, too.
Keith Kirkpatrick: Right.
Mark Beccue: We’ll see what happens, right?
Keith Kirkpatrick: Yeah. Well, that begs another question. I know you’ve covered this pretty extensively. Look at, let’s say, your phone, where you have on-device AI. Then the question is who adopts that, given that you’re going to have power issues as well, and power demands?
Mark Beccue: Right.
Keith Kirkpatrick: There’s a lot of stuff that’s still-
Mark Beccue: A lot of stuff.
Keith Kirkpatrick: Up in the air. Up in the air. Well, I’m sure we could go on forever about this. I want to put a marker here of let’s check back in, let’s say, six months, and see where we are because I suspect that the world will look quite different than even right now.
Mark Beccue: No doubt.
Keith Kirkpatrick: Based on what you were saying about the pace of innovation.
Mark Beccue: Yeah.
Keith Kirkpatrick: So in the few minutes we have left, I want to again move to our Rant or Rave section here. And of course, for anyone who hasn’t been following along, this is where I throw out a topic, and you get a couple of minutes to either rant or rave about it. Now, because you are our internal generative AI guru, I wanted to talk to you about how companies, generally SaaS companies, are positioning or trying to differentiate themselves based on their generative AI, whether it’s the assistants they use or whatever functionality. And one of the things that’s really interesting is that they seem to keep talking about the amount of data that they have, that they train their models on. And they’re saying, “We have more data than any other vendor, that’s why our AI is better than XYZ Vendor.” The thing that I’m going to rant a little bit about here is that I feel like it’s hard to prove that, unless you’re doing just a one-to-one vendor comparison. And honestly, I’m wondering, is that really the thing that enterprise buyers should be doing, comparing just based on training data and the amount of training data, or are there other factors that they should really be looking at?
Mark Beccue: Right. Yeah. I think the rant would be that we’re very much in an education stage, and you have to depend on these companies that understand it to navigate that space. That’s why we have, in our show, a segment called the Adults in the AI Rumpus Room, where we like to talk about people who are leaders and thought leaders in the space. It’s such an education process that’s needed; they need to think about this. Now, there’s a friction within all companies between marketing and the pragmatics of what actually happens, so sometimes, I think, we’re getting, on the rant side, just a lot of FOMO about generative AI in general, and getting left behind, and we’ve got generative AI, and all that kind of stuff. So setting that aside, what I think is going to happen, and there are some interesting thinkers around this, is that all these models are new to us. But they’re going to be commoditized. The models don’t make a difference, and they’re not a differentiating factor for enterprises, at the end of the day. What is going to be the differentiator is how they use, let’s call it, proprietary data.
Keith Kirkpatrick: Yeah.
Mark Beccue: Not large language model training, but what they do and how they leverage all this proprietary data that they might have. So again, Salesforce talks about this, Adobe talks about this, and there are others in the SaaS space that do. So how they leverage that, I think, is the way to put it: what are we doing with your data to make this more personalized, more relevant to you? The outcomes are something that that company now delivers and leverages, and that puts them in a position of differentiation with what they’re doing.
Keith Kirkpatrick: Right. I guess my rant centers on the fact that we are hearing from a lot of vendors saying they are trying to do that.
Mark Beccue: Right.
Keith Kirkpatrick: I’m looking for, I guess, a more specific answer. And also, one of the things that some vendors have done a pretty good job about is saying, “Look, we’re not trying to use generative AI over every single thing. We are focusing on these specific industries, and these specific use cases, and making sure that this is where we’re putting our resources. We may not be able to do every single potential use case, or every particular industry out there, but the ones that we’re going to focus on, we’re going to do well because that’s where our expertise is.” And really, as you said, making sure that they’re able to provide real outcomes from this data.
Mark Beccue: Yeah. Maybe if I backtrack a little bit. For SaaS, this is a little different than maybe … The trend right now, and it makes sense, is that, let’s say you’re an enterprise, that enterprise, and not a SaaS vendor, wants to leverage its data. So how they use generative AI, that kind of thing. So these SaaS players, their hands are a little tied as far as how much data they can ingest that these customers will give them. That also, to your point, raises the question of how much they are exposing. It’s specific to the SaaS, usually. It’s not, “Well, this is all our general data, so PII, and our ERP stuff,” and on it goes. I think there are some players in the space to maybe look to for signs, maybe like SAP, because they’re more general purpose in their SaaS approach, maybe. They’re good at AI, so it’ll be interesting to see how that works out.
Keith Kirkpatrick: Yeah, absolutely. Well, I think, like all of this, there’s a lot that’s still to be decided over time. All right. Well, again, thank you very much, Mark, for joining me here on Enterprising Insights. Everyone out there, make sure that you tune into Mark’s podcast as well, called The AI Moment. Next time here on Enterprising Insights, we’re going to be doing a year-end wrap-up of the enterprise software market. I’ll have a couple of other guests join me, to keep this webcast as lively and interactive as possible. Thanks, everyone, for tuning in. Be sure to subscribe, rate, and review this podcast on your preferred platform. Thanks, and see you next time.
Author Information
Keith has over 25 years of experience in research, marketing, and consulting-based fields.
He has authored in-depth reports and market forecast studies covering artificial intelligence, biometrics, data analytics, robotics, high performance computing, and quantum computing, with a specific focus on the use of these technologies within large enterprise organizations and SMBs. He has also established strong working relationships with the international technology vendor community and is a frequent speaker at industry conferences and events.
In his career as a financial and technology journalist, he has written for national and trade publications, including BusinessWeek, CNBC.com, Investment Dealers’ Digest, The Red Herring, The Communications of the ACM, and Mobile Computing & Communications, among others.
He is a member of the Association of Independent Information Professionals (AIIP).
Keith holds dual Bachelor of Arts degrees in Magazine Journalism and Sociology from Syracuse University.
Mark comes to The Futurum Group from Omdia’s Artificial Intelligence practice, where his focus was on natural language and AI use cases.
Previously, Mark worked as a consultant and analyst providing custom and syndicated qualitative market analysis with an emphasis on mobile technology and identifying trends and opportunities for companies like Syniverse and ABI Research. He has been cited by international media outlets including CNBC, The Wall Street Journal, Bloomberg Businessweek, and CNET. Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting technology business and holds a Bachelor of Science from the University of Florida.