In this episode of The 5G Factor, our series that focuses on all things 5G, the IoT, and the ecosystem as a whole, I’m joined by my colleague and fellow analyst, Olivier Blanchard, for a look at the top 5G developments and what’s going on that caught our eye.
Our conversation underscored:
Qualcomm Spotlights Benefits of Running Gen AI on Devices. We see that the cost of running generative AI models on-device versus in the cloud translates directly to the amount of power required to run these models. Edge devices with efficient AI processing offer leading performance per watt, especially when compared with the cloud. Edge devices can run generative AI models at a fraction of the energy, especially when considering not only processing but also data transport. This difference matters for energy costs as well as for helping cloud providers offload data center energy consumption to meet their environmental and sustainability goals. We examine why on-device processing performance of mobile devices has increased by double digits with each technology generation and is projected to continue this trend, and how Qualcomm’s portfolio is integral to optimizing generative AI models across the global digital ecosystem through edge device optimization.
Ethernovia: Accelerating Vehicle Data Architecture Transformation. New entrant Ethernovia is charting a portfolio development course focused on aggregating and routing enormous quantities of data between AI and SoC chipsets, which requires high-bandwidth, low-latency intelligent fabrics. As vehicle architectures evolve from domain-centric controllers toward zonal architectures, networking solutions must concurrently evolve to support the higher data rates of advanced vehicle applications while meeting demand for improved reliability and security. We assess why zonal architectures are critical to advancing and scaling fast-evolving software-defined vehicle (SDV) applications such as Advanced Driver-Assistance Systems (ADAS), autonomous driving (AD), and a rich ecosystem of customer software delivered Over the Air (OTA), including 5G connectivity.
Qualcomm’s Snapdragon Digital Chassis Will Power the 2025 Escalade IQ. Qualcomm’s announcement that the 2025 Cadillac Escalade IQ will leverage several core automotive platforms from its Snapdragon Digital Chassis solutions stack in one vehicle points to both the growing appeal for automakers to streamline the design of their software-defined vehicles and to the accelerating maturity of Qualcomm’s portfolio of automotive solutions. The four pillars of Qualcomm’s Snapdragon Digital Chassis stack focus on connectivity, cockpit tech (digital instrument clusters and infotainment), Car-to-Cloud features (think OTA updates), and ADAS (the Snapdragon Ride Platform delivers a broad swath of driver-assist and autonomous driving features). We review GM’s decision to power the 2025 Escalade IQ with key Snapdragon automotive platforms and why it sends a clear signal to the industry that it trusts Qualcomm to deliver the goods when it comes to uncompromisingly redefining the Escalade for the EV era.
Watch The 5G Factor show here:
Or listen to the full audio here:
If you’ve not yet subscribed to The 5G Factor, hit the ‘subscribe’ button while you’re there and you won’t miss an episode.
Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.
Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.
Transcript:
Ron Westfall: Hello and welcome, everyone, to The 5G Factor. I’m Ron Westfall, research director here at The Futurum Group, and I’m joined today by my distinguished colleague, Olivier Blanchard, and he’s our research director for all things devices, policy and automotive. And naturally, it’s just great to see him on the show today, and we’ll get to him introducing himself. But as a quick overview on The 5G Factor, we dive into all things that are related to the 5G ecosystem and the IoT and the major developments that have caught our eye. And so with that cue, Olivier, welcome back, and so many thanks for joining today’s webcast. Please, reintroduce yourself and give folks an update on your coverage areas and so forth.
Olivier Blanchard: All right. Well, it’s good to be back. Thanks for the warm and accurate introduction. My name’s Olivier Blanchard, I am research director, and my focus is on devices, automotive and policy and regulations as well.
Ron Westfall: Right on. Thumbs up. And yes, this will be like a back to the future discussion here because one thing that definitely has caught our eye is AI. Imagine that. As we know, AI is a white-hot topic and it’s been dubbed, the summer of AI, more like the year of AI, or potentially, the decade of AI.
Olivier Blanchard: The decade, yeah.
Ron Westfall: And so as a result, it just makes a lot of sense to… Let’s look at some of the most important AI developments as we revisit what’s going on with AI across the 5G ecosystem. And to start off, the cost of running generative AI models, on-devices versus the cloud, I believe, translates directly to the amount of power that’s required to run these models. That’s a simple logic. And as we know, inference processing of large generative AI models can require the use of several AI accelerators such as GPUs or tensor processing units or TPUs. And for that matter, even several servers. Now, with edge devices, efficient AI processing can offer leading performance per watt, especially when compared to the cloud. And this is something that is, basically, I think, critical because as we all know, organizations out there, of all stripes, of all sizes are looking to improve their energy efficiency capabilities.
And that is naturally to fulfill sustainability goals, ESG objectives and so forth. And it’s just great business all around. And so what I’m seeing is that edge devices can run generative AI models at a fraction of the energy, especially when you factor in, not only processing, but also the data transport that’s involved with the cloud. Now, this difference is significant because of the energy costs, as well as helping cloud providers themselves offload data center energy consumption, which, again, can make a huge difference in meeting their organization-wide environmental goals. In addition, I’m anticipating that AI performance will be critical, or already is, but it’ll become all the more so. And as such, the transfer, storage, and use of data on multiple platforms and cloud services increases the potential for data tracking, data assessments, and well, for that matter, data manipulation and data theft, which, obviously, nobody wants, at least the good guys. As such, I believe on-device AI intrinsically can help protect users’ privacy, since the queries and personal information remain solely on the device.
And that’s part of the beauty of having SIM-protected or SIM-enabled devices such as smartphones and laptops, and for that matter, the iPad and so forth. Now, this is important because it protects consumer data, as well as providing an additional level of protection for all-important information such as medical data, enterprise data, top secret government data, and basically, any sensitive application out there that demands this extra layer of security. And I’m expecting that a programming assistant app for generating code can run on a device without exposing the confidential information to the cloud. So what’s not to like there? And that’s just the tip of the iceberg in terms of why generative AI on devices is such a great idea, a breakthrough for the industry. And we see Qualcomm driving a lot of this conversation, and basically, gaining some thought leadership in this regard. And with that, Olivier, what are you seeing in terms of on-device generative AI developments? What do you see that’s, basically, standing out and jumping out at you?
Olivier Blanchard: Well, it’s going to be a huge push in the next year, for sure. And I think it’s going to be an opportunity for some of the silicon manufacturers to differentiate themselves from the rest of the market, especially at the high end. But I think I want to preface all this with just a reminder that AI, not necessarily generative AI, but AI, an extremely powerful and versatile AI, has lived on devices for quite some time. We’ve seen it with on-device translation, like live translation with Qualcomm SoCs on mobile phones for the last few years. You’ve seen it, even if you haven’t necessarily understood that it was AI, in pretty much every camera and image editing software, whether it does something automatically or helps you do it with manually added filters, all of that is controlled by AI. Even power consumption and optimization on your phone, on your devices, has been handled by AI.
So AI engines and GPUs have been getting a lot more performant on phones and other devices for quite some time. So the trend has been moving in that direction for the last three or four years, at least, where it’s been noticeable and where specific designs and functionality have been focused towards building out this on-device AI capability for a lot of things. So what we’re seeing now with generative AI is that we’ve essentially taken the next step in the evolution of AI. And generative AI means a lot of things to different people. On the one hand, you have the large language models, the GPTs, but there’s a lot of other stuff that falls into the category of generative AI, whether it’s creating images based on text prompts or enhancing images or essentially, creating emails out of just a few key parameters or tasks.
It can be a lot of different things. It can be personalization, it can be an onboard digital assistant that teaches itself to understand your needs and anticipate them. So you still have a need for cloud-based generative AI infrastructure, but a lot of the applications that involve generative AI, whether it’s just an element of the application or the core of the application, can live on devices, whether it’s your watch, your phone, your laptop, whatever else.
Ron Westfall: Right on. And that, I think, was very valuable, the fact that this has been going on for several years. So AI had this tremendous breakthrough earlier this year, naturally, when Microsoft, basically, came out and demonstrated, hey, generative AI can definitely transform the search experience, for example, and ChatGPT as being a prime example. But it’s more than that. Already, your device is supporting millions of parameters in terms of AI capabilities and it’s, potentially, going to hit a billion, if not already. So that’s just one example. But it’s also about enabling, for example, coding and other applications.
So this is something that is definitely going to be dynamic for quite some time, and it’s also very exciting. And so I think this is something that people need to, I guess, gain more context on. It’s not just about AI training, AI clusters, using massive GPU clusters, it’s also about the fact that it’s highly distributed. It’s relying on a fabric, basically, to be able to execute both the training, which is intensive and can be cloud centric, but also, it can definitely be offloaded or shared or distributed on devices. And that, I think, is going to be a major difference maker for the reasons we touched on, energy efficiency gains, flexibility, security and so forth. So-
Olivier Blanchard: One thing that you mentioned about security, I think it’s one of the… It’s not, necessarily, the sexiest aspect of this. People look for generative AI on-device in terms of productivity and creativity. They don’t necessarily think about security, but security is going to be a huge thing because depending on your query, depending on even just the type of generative AI that you’re talking about, for instance, your device has a digital assistant that is getting better and better and better at predicting your needs, at understanding your needs and understanding your purchasing habits, at making purchases for you, creating lists, for example, a shopping list, whatever it is. A lot of that stuff doesn’t necessarily… first of all, doesn’t need to live in the cloud, period, because there’s no need to offload it to the cloud when the device can handle it itself. But also, by keeping a lot of this stuff on device, it protects it better.
And the devices themselves are getting much better at creating security layers for the data that’s being contained and processed on the device. So that’s, I think, a huge understated aspect of on-device generative AI and AI, in general. I think another one too is going to be personalization, which goes hand in hand with that. As you use generative AI on your device and through your device, having the ability to keep all this stuff contained, all of your queries, all of your preferences, all the things that it knows about you, that it understands about your process, about your work, about the type of work that you’re doing through generative AI, all of that stuff staying on the device is a huge plus. And then the last thing is lag, plain and simple. There are times when you’re not going to have access to the internet, for whatever reason, or it’s going to be choppy, you may have to go into airplane mode.
There’s so many different reasons why, if you’re in a busy airport that doesn’t have 5G or millimeter wave and it’s just, there’s a bottleneck, you need to be able to not lose productivity, not lose the ability to use generative AI with your devices. And this includes laptops, it’s not just phones. And so having the ability to have a lot of that processing on the device as opposed to exporting it to the cloud or talking to the cloud is a huge plus productivity-wise, because you’re not going to be sitting there waiting for something to happen. And even with some simple processes like chatbots, the ability to have a conversation at the pace of conversation, as opposed to waiting for the queries to be interpreted and just come back through a connection from the cloud, the user experience and the productivity gains are going to be massive.
So there’s a huge push to put this stuff on device or to put as much of it on device as possible, just because it’s more practical to do it. And thankfully, because chip makers have been working on putting a lot of AI on chip for years now, and they’ve become very good at it, the transition from traditional AI on-device to generative AI on-device isn’t particularly difficult, not in the way that it would’ve been three years ago. So it’s actually a natural transition for them.
Ron Westfall: Yes. And I mentioned Qualcomm, I believe it’s an opportunity for them to capture mindshare in this all-important space. And I think we could wax eloquent the entire episode, just on gen AI on-device. And that, I think, is, clearly, a takeaway, that this is something that is already having tremendous impact. We’ll jump on this again, certainly, over the next month plus. And so that, I think, is something that will definitely be moving the needle in terms of what we’re going to be paying attention to. And with that in mind, Olivier, we promise we’ll cover other topics. So with that, I’m seeing AI, also, driving innovation across the automotive chipset market. And one example that comes to mind is that there’s a new entrant, Ethernovia, that is, basically, charting a development course with their portfolio that is aiming to aggregate massive quantities of data and route them between the AI and SoC chipsets.
And naturally, this will require high-bandwidth, low-latency, intelligent fabrics. And I prefaced that. And as background, Ethernovia completed its A round financing of $64 million. And I believe this is an impressive funding round because it’s, basically, a who’s who of companies backing this type of technology. It includes Porsche, and that’s Porsche Automobil Holding, so it’s really the key part of Porsche that is investing in this, along with Qualcomm Ventures, VentureTech Alliance, AMD Ventures, Western Digital Capital, Fall Line Capital, et cetera. It’s really quite an impressive set of investors for an A round.
And what I’m anticipating is that vehicle architectures will evolve from domain-centric controllers, which have worked fine so far, but we definitely need to take it to the next level. And that means that networking solutions must concurrently evolve to support those higher data rates so that advanced vehicle applications can meet the demand for what we’ve been talking about in terms of reliability and security; certainly, autonomous vehicle security comes to the top of mind. As such, these applications include what is known as advanced driver assistance systems, or ADAS, as well as autonomous driving and other customer software capabilities that are delivered over the air.
And as a result, I believe Ethernovia is developing a comprehensive, yet streamlined hardware and software system that can meet these demands. And they’re doing that by integrating, basically, the network features that are purpose-built, purpose-developed for software-defined vehicles, SDVs. And I know we’ll be talking a lot more about SDVs, not only on this webcast, but certainly, as the topic evolves during the course of this year and after. And so I’m kind of bullish on Ethernovia. In fact, I am bullish on Ethernovia because I can see that it has impact, certainly, on 5G networks, in that it’s transforming the automobile’s communication network to provide the reliable, standards-based, high-speed connectivity that is really essential for assuring that software-defined vehicles can meet all of these different demands. And with that, Olivier, is there anything about Ethernovia that stood out, from your perspective?
Olivier Blanchard: Yeah, two things, two things. One is the Qualcomm investment. Qualcomm’s been making a huge push in the automotive sector for the last few years, and they’ve been really successful with it. And so this has been a problem that’s been looming in the digital chassis, not just for Qualcomm, for everybody, for the entire industry, as a practical problem. And it’s the fact that you need faster switches and faster connectors to connect all these systems together. You don’t want a sensor to detect a pedestrian and be slow to send its signals to the processing unit, and the processing unit to be slow sending it to the brakes and everything else. You want everything to be instantaneous. So lag is an issue. And one of the challenges that I see in this transformation, or digital transformation, of the automotive space and vehicle design, especially, is, essentially, that we started with a domain based architecture, where, basically, every system has all of its connectors, its processing, and they get connected somehow.
And that causes two problems: one, a lot more hardware than you really need. We could do it more elegantly. And the other thing is a lot more cabling and a lot more connectors than you want in a car because suddenly, you have a car that has a lot of wiring that it didn’t have before, on top of huge batteries, if you’re talking about EVs and electrification. And so the weight of the vehicle becomes massive, exponentially heavier, and you start to run out of room as well if you have to constantly connect cables and you have all this domain based architecture. So what I’m seeing in the automotive industry is a trend toward a switch to zonal architecture, which is much more efficient.
So it uses fewer cables, fewer connectors, fewer switches, it reduces the weight and the amount of wiring in the vehicle. And so if you’re going to do it that way, you need faster switches and faster connectors than you had before, less lag. And what I see here is, not just a company with a solution that addresses this problem, but I also see Qualcomm being one of the main purveyors of this new digital chassis investing in this particular solution and in this particular company. And I find that extremely interesting and very encouraging. And I’ll just stop there for now with that.
Ron Westfall: Yep, those are excellent insights, Olivier. And yes, zonal architectures, that is really the future for automotive design. And I agree wholeheartedly. Ethernovia is coming up with a proposition that will inject well-needed competition into advancing zonal architecture capabilities. Naturally, Marvell’s Brightlane Automotive Ethernet portfolio, I think, is also driving this; it’s really raising industry awareness to global prominence. And so now, we’re seeing, okay, this is something that definitely has wheels, pun intended, and as a result, Qualcomm and certainly, I think, the entire ecosystem, certainly, auto manufacturers and so forth, understand that they have to move away from the domain-constricted, siloed type of architecture that did introduce all that complexity. The weight, just the weight of the wiring alone, can actually impede the energy efficiency of an automobile. And so this is fully aligned with the migration toward EVs and toward, simply, better performing, more fuel-efficient cars. And so speaking of-
Olivier Blanchard: And more cost-effective on the manufacturing side too, I imagine.
Ron Westfall: Oh, exactly. I mean, what’s not to like? I mean, this is what we like to see. This is the innovation that is capturing the best of many worlds. And so, it’s about energy efficiency, it’s about cost saving, it’s about better performance, better safety capabilities and all that. And with that in mind, speaking of auto manufacturers, Cadillac has leapt out, and is working with Qualcomm. Olivier, I bet you have some thoughts on that partnership.
Olivier Blanchard: Yeah. So first of all, I want one. So what we’re talking about is the new Cadillac Escalade. The EV version for 2025 was just, not released, but introduced, announced, revealed, unveiled this past week. And first of all, it looks really good. It is 130K though, so it’s a little bit out of my price range. But it claims 450 miles on one charge. I think there are probably a lot of asterisks to that. It, probably, has to be empty on flat ground with a tailwind or something. But still, it’s an incredible feat. And it comes with a lot of interesting bells and whistles. But one of the things that caught my eye about that was the fact that it uses almost all of Qualcomm’s digital chassis platforms. So it has drive, it has connectivity, it has pretty much everything. And so what’s important for everyone to understand is that these platform creators or providers like Qualcomm, with the digital chassis, create these technologies, but just like their platforms on devices, the OEMs, the car makers in this case, decide which features to implement and how.
So there’s a lot of flexibility though. It’s basically just like a baseline. These are all the capabilities that we’re offering, let us help you develop ways to make them yours and figure out which ones you’re going to use and not use. So it’s always interesting to see what an automaker or a device manufacturer is going to select as the features that they want to pay for and actually put in a device, especially at a premium or at the very high end level, where people are willing to pay for the best of the best, and how they implement them.
And what we saw here is a decision by Cadillac to, not do as much custom mixing and matching of platforms, like what we generally see in vehicles, but almost go all in with a package deal, a better together, like, why would we complicate our lives by taking this platform over here and this platform over here and trying to connect them in the middle and make them work together, when we can just work with this one provider of technology, this one technology partner, Qualcomm, in this case, and just build out the guts of our vehicle with almost all of the solutions that they have to offer and see how it all works together.
And I think you start to see some efficiencies of scale with that. So you definitely see that Cadillac is going for the best quality, the best performance that it can get technology-wise, but it’s also trying to get from ideation to market much faster by streamlining its process, by taking a lot of this friction out of the equation. And I think if it works, and it should, it might usher in a new era of consolidation when it comes to automakers choosing their technology partners and going with a platform-wide approach as opposed to a mismatched, patchwork technology approach. So I’m not predicting this, but I think it’s encouraging to see things moving in that way, perhaps.
Ron Westfall: Yeah, no, I think that’s aligning with what we’ve touched on before. It’s like, yes, the automobiles of the last several years have had increasingly impressive, say, infotainment systems and other capabilities. The however is that they’re running a lot of different silos, domains, to make that happen. And it introduces complexity and introduces extra costs and so forth. And so I agree, I think it’s a very solid, we can call it, prediction, that the trend is going to be more toward a platform approach, where the automaker is using a zonal architecture. It’s more or less managed uniformly, all the capabilities are inter-working using Ethernet technology, a very proven technology, certainly, in the rest of the networking world. And so it’s good to see the auto manufacturers onboarding with an Ethernet-centric approach to make this happen. And I think that is something we’ll see more of, quite simply.
And so that will be good news for whoever wins the platform contest. But there will be room for others; again, if you can work with that platform in a very elegant, intelligent way, then you’ll be invited as well. So that will avoid lock-in, if you will, from a you-must-select-one-platform type of scenario. And so that, I think, is something that we’re seeing in other parts of the industry, like data centers and so forth, what’s called selecting a full-stack solution when embarking on a new architecture, a new approach, et cetera. We can see some of that in the vRAN world, for example, within the 5G ecosystem realm. And on that positive note, thank you so much, Olivier, for joining today’s webcast. Great to have you back on and looking forward to more conversations.
Olivier Blanchard: Thanks for having me.
Ron Westfall: You bet. It’s a no-brainer. And thank you everyone for tuning in, our viewing audience and our listening audience, as always, we appreciate you spending time with us, and be sure to subscribe to The Futurum Tech Webcast and The 5G Factor channels. And with that, see you all again next time, and great 5G day, everyone.
Other insights from The Futurum Group:
Qualcomm’s Snapdragon Digital Chassis Will Power the 2025 Escalade IQ
Author Information
Ron is an experienced, customer-focused research expert and analyst, with over 20 years of experience in the digital and IT transformation markets, working with businesses to drive consistent revenue and sales growth.
He is a recognized authority at tracking the evolution of and identifying the key disruptive trends within the service enablement ecosystem, including a wide range of topics across software and services, infrastructure, 5G communications, Internet of Things (IoT), Artificial Intelligence (AI), analytics, security, cloud computing, revenue management, and regulatory issues.
Prior to his work with The Futurum Group, Ron worked with GlobalData Technology creating syndicated and custom research across a wide variety of technical fields. His work with Current Analysis focused on the broadband and service provider infrastructure markets.
Ron holds a Master of Arts in Public Policy from the University of Nevada, Las Vegas and a Bachelor of Arts in political science/government from William and Mary.
Research Director Olivier Blanchard covers edge semiconductors and intelligent AI-capable devices for Futurum. In addition to having co-authored several books about digital transformation and AI with Futurum Group CEO Daniel Newman, Blanchard brings considerable experience demystifying new and emerging technologies, advising clients on how best to future-proof their organizations, and helping maximize the positive impacts of technology disruption while mitigating their potentially negative effects. Follow his extended analysis on X and LinkedIn.