A Key Trend, Enterprise-grade Generative AI SaaS Applications, and Adobe’s Blueprint for AI Success – The AI Moment, Episode 3

On this episode of The AI Moment, we examine a key trend, the emergence of enterprise-grade generative AI SaaS applications, and look at why Adobe provides a blueprint to enterprises for AI success.

The discussion covers:

  • The key generative AI trend – the emergence of AI-powered applications. Using OpenText’s Aviators and Adobe Firefly as examples, a look at why SaaS applications with embedded AI will be a critical element of enterprise AI market adoption.
  • A company we like doing AI. Adobe’s Firefly is the most successful generative AI product ever launched. Enterprises looking to operationalize AI can learn from Adobe’s approach. We look at three key lessons to learn from Firefly’s success.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Listen to the audio here:

Or grab the audio on your favorite podcast platform below:


Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this webcast.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.


Mark Beccue: Hello, I’m Mark Beccue, Research Director for AI at The Futurum Group. Welcome to The AI Moment, our weekly podcast that explores the latest developments in enterprise AI. The pace of change and innovation in AI is dizzying and unprecedented. I’ve been covering AI since 2016, and I’ve never seen anything like what we’re experiencing since ChatGPT was launched this time last year and kickstarted the generative AI movement. It’s why we call this The AI Moment.

With The AI Moment podcast, we try to distill the mountain of information, separate the real from the hype, and provide you with shorthand analysis about where the AI market will go. In each episode, we’ll dive into the latest trends and the technologies that are shaping the AI landscape. Discussions can range from the latest advancements in AI technology and the parsing of what I call the mutating vendor landscape, including the big announcements, to things like AI regulations, ethics, risk management, and a lot more. So we’re going to cover a lot.

Each podcast is about 30 minutes, typically made up of three or four segments. One is a guest spotlight, where we have a special guest, usually from a vendor company, and we chat about what’s going on with them and how they’re operationalizing AI. Another is key trends in generative AI, where we look at trends, and I’d say trends instead of fads. So we try and narrow that down a little bit for you.

One of my favorites is the Adults in the Generative AI Rumpus Room, and that’s really because the generative AI moment has produced a lot of chaos and disruption and some impatience on the part of lots of players. Some people have been calm, thoughtful leaders in the midst of that, and that’s what we call them: the Adults in the Generative AI Rumpus Room. And the last segment we like to do is called A Company We Like Doing AI. Fairly straightforward. So those are our typical segments.

Today we have two segments. We’re going to do key trends in generative AI, and in that section we’re going to talk about the progression of the market into the rollout of what I’ll call generative AI enterprise applications, with a focus on two: Adobe Firefly and OpenText Aviator applications. The second segment we’re going to do today is a company we like doing AI. We’re going to stick with Adobe and look under the hood at how Adobe is leaning into generative AI and how their journey is a blueprint for how enterprises can roll out AI applications. All right, so that’s the lineup. Let’s get started.

So our first segment is about the move into applications. Now, if you think about where we’ve gotten so far in the AI moment, the generative AI moment, a lot of what we’ve seen has been around what I’d call development tools, things that allow people to make things with AI, but they’re not very specific. They may have some vague references to some use cases, but generally you have an LLM or development tools that allow you to do what you want.

And not a lot of people have put stakes in the ground and said, “Well, this is what we think is an application that uses generative AI and can be used in that sense.” So let’s talk about that. What I meant by that is those are tools, right? We call them the picks and shovels that enterprises need, that kind of thing.

Well, why is it that we’re at that point right now? Part of the reason is that no one has really productized any surefire, silver-bullet killer use cases, but there are several with great promise. I think those would include code development, which is really an amazing use for generative AI, and commercial image generation, where companies like Adobe, Shutterstock, and Getty Images have put out those kinds of things.

I would be remiss in not mentioning that both of those are legitimate applications with very well-defined ROI, but both code generation and image generation continue to face some market headwinds and barriers, particularly around copyright and IP issues. So those will continue. There are some other use cases that make sense, where we might see a silver-bullet kind of magic application of generative AI: collaboration tools, where players like Zoom, Cisco, and Microsoft are working on things like meeting summaries, transcriptions of meetings, and translation. All of those have a lot of promise.

But I’m going to step over and talk a little bit more about one other thing that I’m very skeptical of, and we’re going to devote a whole podcast to this at some point, but I’m really still skeptical of the broader text generation use case. You can read my argument against text generation in a piece called A Manifesto Against Generative AI Writing, on a website called AI Business. If you send me a message, I’ll be happy to put you in touch with how to get to that. But I would add that I don’t think there’s a legitimate market for these automated robo sales emails and marketing emails. With so many applications out there chasing this kind of thing, it’s really solutions looking for a market.

So that brings us back to an idea. I want to talk about OpenText and their Aviators. What I think we need more of in the marketplace are SaaS companies that have done the AI use case thinking for enterprises, that have done the thinking about how this would be used. We need more companies like OpenText to show the market that there’s a very specific use for this AI: we’ve designed AI that makes the applications you buy from us better, that improves our application.

So I sat with OpenText’s Chief Product Officer, Muhi Majzoub, and he was really clear that they are embedding AI into their systems and solutions. They call them Aviators. And what they want to do with those is automate things. They gave a couple of really cool examples. One was an Aviator that automates IT trouble tickets. I think most people can understand what that does. It’s like, okay, you’re going to automate something so I don’t have to do it. Great value. And it helps introduce AI to people. They can see what it can do, right? Makes sense.

They had another one, an Aviator that smooths out project management, and it offers the same kind of understanding and assurances. It’s very specific. So what I think is going to happen is you’ve got all these companies that are going to experiment with AI and build IP around AI capabilities, but they aren’t going to do it for everything. I’m going to stop there for a second before I go down that road and give you another example, this one about Adobe. Adobe built Firefly, and it is an image generation tool for the creative industry.

It was soft launched in March and became generally available in June. And it’s done really, really well. The people that use it are cutting lots of time off of what it takes to do graphic art, and for those working in the creative community and the digital media communities, it’s very helpful. So I just want to mention that that’s another one I think is really good.

But like I said, most companies are starting to experiment with AI and they think they’re going to build this IP around AI capabilities, but they’re not going to do it for everything. And I think that enterprises really will increasingly rely on the SaaS partners and SaaS vendors they use to provide them with stuff that’s purpose-built and ready-made, that tackles capabilities and functions that are not core to their business. So you’re always going to have some things where you just say, I’m going to use these SaaS companies anyway. And now those SaaS companies’ applications are embedded with AI.

So I think that’s one area where we’re going to continue to see growth: applications that embed AI, and that’s how AI is going to get introduced into the marketplace. And I think that when we have those kinds of applications from these SaaS leaders, it calms the market down. It reassures people: okay, this is something I can trust. I can see how it works, I understand what they’re trying to do with it, and I think it’ll give a lot of definition to the marketplace as a whole.

Our second segment today is about a company we like doing AI, and today we’re going to talk about Adobe. Recently, at one of their shows called Adobe MAX, it was last week on the 10th, Adobe revealed a bunch of different capabilities that Firefly can do now, very interesting stuff. It’s a whole new slew of different models they’re using within Firefly. So for those of you not familiar with Firefly, it’s the image generation tool used within the creative industry that we talked about in the last segment.

So a few things that they did that are updates. There were three different models they introduced within Firefly; I’ll give you a little bit on each. One’s called Firefly Image 2, and this is the next generation of the image model. It generates what they say are higher quality images, including improvements to human rendering quality. It’s got better colors and greater ability to control the outputs. And this improvement is because Adobe has increased the data that Firefly trains on.

They also said that there are new text-to-image capabilities being added to the Firefly web app. And they noted, interestingly, that 90% of Firefly web app users are new to Adobe products. They added a piece called Generative Match, which enables users to apply the style of a user-specified image to generate new images at scale. A whole bunch of different things. So all that’s within this Image 2 model.

They did another piece, and I’ll just mention these real quickly. One’s called Firefly Vector Model, which is something important to creative designers. Vector graphics allow patterns to be scaled infinitely, and so they added generative AI capabilities to that. And they had another piece called Design Model, which works from templates, and they made some upgrades to how that works with Firefly.

So here’s what’s interesting about that. Like I said earlier, Adobe introduced Firefly as a beta product in March and it went into general release in June. And here we are in October. One of the things that Adobe shared with everyone at their MAX show last week was that over 3 billion images have been created with Firefly since March.

That’s pretty phenomenal. So why is it such a hit? It’s interesting; we mentioned this in the earlier segment. It’s because it’s a really practical and logical use case for the creative industry. Adobe’s been working with the creative industry, their primary target and primary customer base, for years. So it makes sense. They’re thinking about what their customers need. So the first thing I think Adobe does right in AI is they ask themselves this question, which every company should ask when thinking about AI: what problem am I trying to solve with AI? That’s where you start.

So you start with the problem and then you apply a potential remedy or answer, which in this case is image generation that enables creatives to work more quickly. Like I said earlier, it’s automating these routine graphics. And what Adobe says is that the demand for creative work and content generation has to expand. So their vision is that the use of content generation expands through the organization to marketers, beyond just creative professionals, and Firefly, because it’s a democratized way of making images and creative art, is a direct way for Adobe to help those companies accomplish that goal of spreading content generation outside of the creative groups.

Another thing that Adobe does well is, I’ll put it this way: do you as a company have the people and processes to execute AI? I wrote about this back in June when I did a research note about Firefly. I’ll read you what I wrote. It says, “It’s difficult for companies to duplicate Adobe’s AI experience. Many Adobe competitors are likely not as far along in their AI experience. Some might be thinking about AI for the first time because of the generative AI phenomenon. To leverage AI, companies must step through the AI lifecycle process. What AI are we using? How does this AI help us? Should we use this AI? And what are the risks of using this AI?” Those are the sorts of questions you ask. So I wrote that in June, and I’ll elaborate on it a little further and say Adobe has lived with AI, worked with AI, and experimented with it for nearly 10 years.

So they’ve had the opportunity to figure out by trial and error what was needed in terms of the people and processes that make AI work, beyond the technology. By lifecycle, what we mean is that through experience, Adobe understands the infrastructure needed to execute on AI. And that has more to do with organizing people into the right teams, things like data management and data governance, and lastly, building a framework to manage AI risk.

None of those things have to do with the technology; rather, they have to do with developing the people and the processes that point the technology in the right direction. So that’s point number two: do you have the people and processes? And then finally, this last piece: you have to have time with AI and experience with AI. Time with AI and experience with AI count. When you have experience with AI, you move faster.

So those folks that are starting right now don’t have experience, and they’re not going to move very fast. Experience means that you have a better understanding of the AI issues, particularly how they impact your communities of interest. There are lots of examples of speed. Let’s talk about that for a minute. I was meeting with leaders of the AI tech within Adobe recently, and they talked about how much they learned after the March beta launch of Firefly, particularly because of the heavy use. One thing they said they learned is that Adobe needed to figure out how to run Firefly AI inference at scale. So they wanted to do it faster and scale it up.

And that was part of the impetus for the Firefly Image 2 model, which brought in more data to train on and allowed them to do inference at scale. So think about this, again, I’m going to keep mentioning it: they launch in June, there’s an update here in the middle of October, and that’s speed. In terms of understanding the issues impacting communities of interest, as Firefly has rolled out, there’s been a lot of feedback from creators, different kinds of feedback. Adobe knew that the creators would feel the need to be compensated and protected when image generation rolled out.

So there were multiple fronts where Adobe went to work. One is called the Content Authenticity Initiative. This is a standalone initiative; Adobe is a big member of it, but it’s a separate group, and Adobe’s been working within it to make sure that creators who don’t want their work to be part of generative AI training are protected. Now, I will note that in terms of what Adobe does within Adobe, Adobe Stock creators have already signed agreements where their work can be used for training. So it doesn’t apply to that piece.

In terms of compensation, a prickly issue that’s been bandied about among the different commercial image generation players, Adobe has a path they built called the Generative Credits program. The last piece to that is, again, in terms of protection: Adobe also made indemnification pacts and pledged that they will cover any legal costs for IP or copyright disputes. It’s called the AI Art Indemnification Policy, and it says Adobe will “protect, defend, and hold harmless” customers who license Adobe generative AI art, should that art infringe on third-party copyrights.

The last piece, which we talked about last week: the company has created language for a US federal law framework called the FAIR Act, which would protect generative AI creators from deliberate attempts to steal their work. The language in that framework is being socialized by Adobe with the White House and a US Senate committee. It’s not anything near a law yet, but they’re working on it. So that gives you an idea. Those three things, right? I’ll reiterate what those are.

So in summary, there are three things that Adobe does that people should aspire to in rolling out their AI initiatives. Number one: does deploying AI make my company better? Number two: do I have the people and processes to execute AI? And number three: understanding that time with AI and experience with AI count in understanding what markets you’re trying to work with.

So I think that wraps it up for this week. Appreciate everyone taking a listen. Thanks for joining me here on The AI Moment. Be sure to subscribe and rate and review the podcast on your preferred platform. And thanks again and we’ll see you next week.

Other insights from The Futurum Group:

The AI Moment, Episode 2

The AI Moment, Episode 1

Author Information

Mark comes to The Futurum Group from Omdia’s Artificial Intelligence practice, where his focus was on natural language and AI use cases.

Previously, Mark worked as a consultant and analyst providing custom and syndicated qualitative market analysis with an emphasis on mobile technology and identifying trends and opportunities for companies like Syniverse and ABI Research. He has been cited by international media outlets including CNBC, The Wall Street Journal, Bloomberg Businessweek, and CNET. Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting technology business and holds a Bachelor of Science from the University of Florida.

