
2023 AI Product of the Year, AI Company of the Year | The AI Moment, Episode 9


On this episode of The AI Moment, we discuss my picks for 2023 AI Product of the Year and AI Company of the Year.

2023 AI Product of the Year, AI Company of the Year: In this time of AI disruption, who has shined? In my view, it was those who are building and delivering pragmatic, enterprise-grade solutions. Those who have invested in and understood AI for some time. Those who have clear visions and goals for end results. They are AI innovators. With that in mind, I developed a list of companies and products that I felt were the AI innovation leaders of 2023.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Listen to the audio here:

Or grab the audio on your favorite podcast platform below:

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this webcast.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Transcript:

Mark Beccue: Hello, everyone. I'm Mark Beccue, Research Director for AI at The Futurum Group. Welcome to The AI Moment, our weekly podcast that explores the latest developments in enterprise AI, and we literally are in a moment. The pace of change and innovation in AI is dizzying, unprecedented. I've been covering AI since 2016, and I've never seen anything like what we've experienced since ChatGPT launched in November '22 and kickstarted the generative AI era.

At The AI Moment podcast, we try to distill the mountain of information, separate the real from the hype, and provide you with shorthand analysis about where the AI market will go. In each episode, we dive deep into the latest trends and technologies that are shaping the AI landscape. We cover everything from analyzing the latest advancements in the technology and parsing through the mutating vendor landscape to things like AI regulations, ethics, and risk management. Each show is typically made up of two to four segments, anything from a guest spotlight to our regular features about Key Trends in Generative AI and our popular Adults in the Generative AI Rumpus Room.

Today we have a special presentation. Since it's the end of the year and we've been thinking a lot about 2023, today I'm covering two pieces. We're going to call it the 2023 AI Product of the Year and 2023 AI Company of the Year. Now, these are my own opinions as Research Director for The Futurum Group, not The Futurum Group as a whole. I want to make that clear. So first, we're going to start with Product of the Year, and I want to give you some background on how we went about this and what the thinking was. I thought about it this way. I said, "We've had this disruption in AI, and during what we just mentioned was such a crazy year, who shined?" In my view, it was those who are building and delivering pragmatic, enterprise-grade solutions. They're basically companies that have invested in and understood AI for some time. That generally tends to be the case. They also tend to have clear visions and goals for end results. In my opinion, they're the AI innovators. They're people who have thought this through carefully and well.

With that in mind, I developed two lists: a list of companies and a list of products that I felt were the AI innovation leaders of 2023. Now, two factors weighed heavily in my thinking. One was that they had to be enterprise focused, and the second was that something became generally available during 2023. So the two categories are Product of the Year and Company of the Year. We're going to go through Product of the Year first. I had five nominees that floated to the top, and I'll name them for you. Number one was Qualcomm's Snapdragon X Elite and Snapdragon 8 Gen 3 systems-on-chips. Number two was Meta's family of Llama AI models. Number three was Adobe Firefly. Number four was Microsoft Copilot. The final one was IBM's watsonx.governance. Of those five, I thought about all of them, and I felt my winner for the year is Adobe Firefly, and I'm going to explain why.

Firefly is, in my opinion, the most commercially successful generative AI product that's ever been launched. Since it was introduced in beta in March and made generally available in June, it has generated, at last count in October, more than 3 billion images. Adobe says that Firefly has attracted a significant number of new Adobe users, and that makes it hard to imagine it's not aiding Adobe's bottom line. They haven't really shared all of those details yet, but clearly it's making a difference in the company. To me, the runaway success of Firefly is not a fluke. It's the result of a long-term investment by Adobe in what they saw as the promise of AI in their business. They have a methodology, a culture, and a process that they bring to bear, and it's something everyone should aspire to. It's really well thought through. We've talked about this before, but part of it is that they ask, "How do I solve an issue I have, and what is the best technology or process to address it?"

So if you think about their business, what they do with images, how they serve creatives and marketers and the folks in those types of businesses, and what image generation can do in that business to save time and expedite redundant processes, it was really a home run. So they are my Product of the Year. We have lots of details about this on our website. If you go to futurumgroup.com and go to our Insights page, you can search by my name, and you'll see all the research notes I've written over the past year. One of them was titled Adobe Firefly: Blazing a Generative AI Application Trail. That'll give you a lot more detail about the product and why we think it's so cool. The others were runners up, and I'll give you a little explanation of why I thought they were great products. The Qualcomm Snapdragon SoCs were interesting because I think their introduction is something that's going to trigger legitimate on-device AI.

This is a concept that might seem to defy the current convention of compute-munching AI workloads. By on-device, I'm thinking of smartphones and laptops and that kind of thing. Remember, not many players make GPUs, and even fewer make GPUs for mobile devices, meaning those PCs and smartphones. There are really four that I can think of: Qualcomm, Intel, AMD, and Apple. Of those, the ones that have been thinking about on-device the most and the longest are probably Qualcomm and Apple. If you look under the covers of this product, Qualcomm has been working with AI for more than 10 years. All that time, they have been thinking about on-device AI, and the only difference today is that it's a bigger opportunity because of generative AI. It seems to me that Qualcomm is further along than others in terms of the partnerships and the ecosystems to support on-device AI, and this includes the entire AI stack. They may have a bigger development ecosystem, particularly around mobile, than a lot of other players.

I think they've thought pretty deeply about the pragmatic, logical use cases for on-device AI, such as camera-based use cases and something like video conferencing. Those chips have really been creating a sensation in the marketplace for how powerful they are and what they can process with low power. Really cool stuff, so they were one of our runners up. The second runner up is Microsoft Copilot. The way I look at it is this: if Copilot rolls out smoothly, AI will become a mass market technology within the next 18 months. If Copilot works, if Microsoft can orchestrate the apps it's embedded into as envisioned, work and personal productivity will rise simply because there are so many Microsoft users, and it will fundamentally change how we interact with that software. Its success, or even the promise of success, will spur greater investment by enterprises to leverage the power of AI.

I think Copilot's success will do another thing, and that's solidify Microsoft's stranglehold on market share for the Windows OS and Microsoft 365 applications. It could create greater opportunities for Microsoft to gain market share in enterprise applications. The company doesn't currently dominate things like the sales, marketing, and CRM solutions that Salesforce and Adobe offer, or the ERP (enterprise resource planning) software that comes from places like SAP, Oracle, or ServiceNow. Even Microsoft Teams becomes more powerful and increases Microsoft's potential to grab more market share in collaboration tools. One last thing. It's possible that Copilot's success might give Microsoft a chance to break Google's dominance in search. If you want to read more about that idea and what we thought there in terms of Copilot, go to our website and search under my name; there is a research note I wrote called Microsoft Copilot Will Be the AI Inflection Point. That'll give you more details. The last two runners up for Product of the Year were Meta's Llama models and IBM's watsonx.governance.

So let's talk about Meta's Llama models. They've been a champion of two important trends this year: open source AI models and smaller language models. The Llama models, particularly once you get to Llama 2 in July, have enabled countless developers and enterprises to launch AI initiatives and to experiment with how to leverage their proprietary data. Perhaps more importantly, the Llama models have quickly become proof points for the impact and effectiveness of smaller language models, delivering better results while requiring significantly less compute power to do so. These products have really been instrumental in a key part of the market. Meta's work has sparked more open source AI models, and it's really opened the door for that piece we talked about before, on-device AI.

The last one in the group is watsonx.governance. I think this is a really key product for the year because AI risk management is foundational and critical to operationalizing AI. Enterprises are either going to learn this the hard way, by ignoring it, or the easier way, by embracing it. IBM is in a great position to help enterprises navigate AI risk management. They're one of the AI pioneers, and they've been an innovator in the AI space. Along with a handful of other companies, IBM has thought about and worked with AI for many years. That experience comes into play when thinking about how to operationalize AI and what it takes to be successful in AI. They've gone through this process, they've had the time to think about what it is, and they've had time to experiment with how to use it. The benefit of that experience is that they understand AI risk and the AI lifecycle. What's cool about the product is that it embodies all of that, but it also gives you some automation, speed, and guardrails around how to do this. Companies don't have to work from scratch; it allows them to move forward and put guardrails in place themselves, and it's a great tool. Again, there's a research note on that. It's called IBM watsonx.governance Tackles AI Risk Management, so if you'd like to see that, we have it. All right, those are our products.

The second thing we're going to talk about is Company of the Year. This one uses the same criteria, but I'll give you reasons why I thought they were Company of the Year. It's really not about one particular product, but about how they went about the market. So let's go through that. I had four nominees for this: Databricks, LangChain, Microsoft, and Hugging Face. My winner for Company of the Year is Microsoft. Here's why. I think no company has taken a bigger risk on a promising but untested AI technology partner than Microsoft has with OpenAI, and many times such a gamble can go sideways. But Microsoft has, I would say, masterfully managed their OpenAI relationship since 2022, and I include the Altman debacle in November in that. The crowning achievement of that partnership is the successful channeling of OpenAI's IP to produce the Copilot suite of initiatives across the wide range of Microsoft products we talked about a little bit earlier.

So if Copilot continues to roll out smoothly, it's going to change the market like we said, and I think that's important. These things we talked about with productivity are going to come to bear. So again, the importance of Copilot is huge. What's interesting about Microsoft is that they were able to move very quickly and sure-handedly forward with generative AI because they've been so heavily invested. You're seeing this trend, and we've talked about this on the product side: they've been invested in AI for more than 10 years as well. They've built the expertise not only to understand what the technology can do, but also how to build the proper guardrails, as we mentioned with IBM, and how to leverage it responsibly and at scale. So I wrote a few pieces about this. I mentioned the Copilot launch before, but I took a deeper look at how they have thought about taming LLM issues, so I went into detail on that.

There's a research note called Under the Hood: How Microsoft Copilot Tames LLM Issues. If you're interested in that, it's really good; it talks about how they tackle these things using their experience. Another piece to this, I'd say, is that they have this long history with enterprise software security, and that factored into the guardrails they've built. Another element of Microsoft's AI investment that has helped not only them but the entire market move more quickly is the company's AI research, particularly around the evolving smaller language models we mentioned earlier. There are a few they've worked on: one is called Orca 2, another is called Phi-2. If you'd like to see some details about that, I wrote a research note about Microsoft research and particularly about Microsoft Orca 2. It's called Microsoft Orca 2: The Biggest Generative AI Breakthrough Since ChatGPT. I still stand by that.

Then finally, if you're putting all these pieces together for Company of the Year, I think their work extends beyond the cloud to on-device or edge AI. We've got millions of personal computers out there, tablets, laptops, and desktops, that run on Microsoft's OS and/or the enterprise software they like to run. What's interesting is Microsoft is working with their OEM partners to enable on-device AI. One of the more intriguing initiatives I think they have in that regard is the launch of Windows AI Studio, which just flew under the radar; it wasn't really mentioned a lot this fall. What it is, is a new AI experience to help enterprises and developers jumpstart local AI development and deployment on Windows.

So it's geared toward how you build AI applications specifically for Windows. I wrote a research note about that. It's called Windows AI Studio: Jumpstart for On-Device AI? You can read more details there. So you can see, I think a really comprehensive, thoughtful approach to AI for the year puts Microsoft on top. We had some runners up, and I'll talk about them real quickly in no particular order, same as with the products. I don't like to rank them; they're all runners up equally, right? The first one we'll talk about is Databricks. What I think is interesting about Databricks is their mantra: "to bring AI to your data," which is being said a lot now, but they said this early on, in June, "and to unify your data and AI through governance, warehousing, ETL, data sharing and orchestration." I'm using their words there.

I really think the company was ahead of its time in this thinking. They are, in essence, a data management company purpose-built for AI. It's interesting because they've recently been valued at more than $43 billion doing this kind of work. Let me give you a little color on why I think they're thoughtful. The company held what they call the Data + AI Summit in June, and this was early. If you think about where we've come this year with generative AI, these things were happening by the moment. Here it is, June, and at the summit they talked plainly and very knowledgeably about how enterprises should think about AI. They offered some very pragmatic guidance at a time when there were a lot of unknowns about generative AI. I'll give you two examples. In one of the keynotes, the CTO, Matei Zaharia, put up a graphic that said, "Problem: Naively adding an LLM assistant doesn't work." Great stuff. This is June, June, when somebody's saying this. So that's point one.

Second example: during another keynote presentation, a speaker put up a graphic with the following text, and I'm quoting, "Modeling techniques will quickly commoditize. Your data is your competitive edge." Now, this is contrary to a lot of the conventional messaging around LLMs, which is that more data is better. We're really learning more along these lines now, but this was June, so interesting. I think these examples give you an idea of the approach taken during the event, and that Databricks has the knowledge, the culture, and the vision to become a leader in helping enterprises navigate generative AI, so they were a runner up. Two more runners up. The first is Hugging Face, and I think they're important and a nominee in our space because they open sourced the model behind their chatbot, which was originally a teen-focused product.

Since then, Hugging Face has been a well-regarded but little-known champion of open source development. All that changed in 2023, as the company became a powerhouse player in the generative AI market ecosystem. Throughout 2023, Hugging Face tirelessly pursued a range of strategic partnerships to grow the AI open source community, including several bilateral joint initiatives: one with AMD, and others with Intel, NVIDIA, AWS, and Dell. There's no question that a significant amount of the rapid innovation in AI model development we saw in 2023 came from the Hugging Face community. So I think they deserve a lot of credit for what they brought to the table.

Finally, my last company here is a little bit of a dark horse, and there are tons of companies that are deserving, but I wanted to point out a small startup called LangChain. Here's why I think they're important. They've developed what's become an indispensable framework that enables developers to easily link LLMs to external data sources, which gives the models knowledge of recent data. One of the challenges of using many LLMs is that their knowledge is limited, right? They're trained and retrained periodically, but not in real time. LangChain's framework addresses this, and it enables models to produce more accurate results. This is amazing because it wasn't even a company at first; LangChain launched as an open source project in 2022 and was incorporated as a company in April of 2023. So amazing work by them, really interesting stuff.
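To make the pattern Mark describes concrete, here is a minimal sketch of linking an LLM to an external data source: retrieve recent documents relevant to a question, then supply them in the prompt so the model's answer reflects data it was never trained on. This is an illustration of the general idea only; the `search_external_source` and `call_llm` helpers are hypothetical placeholders, not LangChain's actual API.

```python
# Sketch of "link an LLM to external data": retrieve recent documents,
# then ground the model's answer in them via the prompt.
# The helpers below are hypothetical placeholders for illustration.

from typing import List


def search_external_source(query: str) -> List[str]:
    """Hypothetical retrieval step: return recent documents relevant to the query.

    In practice this might query a vector database, a search index, or an
    internal knowledge base that is updated far more often than the model.
    """
    return [
        "2023-12-01 release notes: feature X shipped to all customers.",
        "2023-12-10 support FAQ: feature X requires plan tier B or above.",
    ]


def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; swap in whatever model client you actually use."""
    return f"[model answer based on a prompt of {len(prompt)} characters]"


def answer_with_external_data(question: str) -> str:
    # 1. Pull in data the model was not trained on (too recent or proprietary).
    documents = search_external_source(question)

    # 2. Put that data in the prompt so the model grounds its answer in it
    #    rather than relying only on its older training data.
    context = "\n".join(documents)
    prompt = (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
    )
    return call_llm(prompt)


if __name__ == "__main__":
    print(answer_with_external_data("Which plan tiers include feature X?"))
```

The design point is that the retrieval step, not the model, carries the freshest knowledge, which is why frameworks that make this wiring easy became so widely used in 2023.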

So those are our two awards. Product of the Year is Adobe Firefly, and AI Company of the Year is Microsoft. That's where I'm thinking we're going. I appreciate you being with us. Thanks for joining me on The AI Moment this week. Be sure to subscribe, rate, and review the podcast on your preferred platform, including YouTube, which is where we seem to get most of our action. So thank you again, and we'll see you next week.

Other Insights from The Futurum Group:

Adults in the AI Rumpus Room: The Best of 2023 | The AI Moment, Episode 8

Top AI Trends for 2024 | The AI Moment, Episode 7

On Device AI Part 2 | The AI Moment, Episode 6

Author Information

Mark comes to The Futurum Group from Omdia’s Artificial Intelligence practice, where his focus was on natural language and AI use cases.

Previously, Mark worked as a consultant and analyst providing custom and syndicated qualitative market analysis with an emphasis on mobile technology and identifying trends and opportunities for companies like Syniverse and ABI Research. He has been cited by international media outlets including CNBC, The Wall Street Journal, Bloomberg Businessweek, and CNET. Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting the technology business and holds a Bachelor of Science from the University of Florida.
