On this episode of the Six Five On The Road at Computex Taipei, hosts Daniel Newman and Ryan Shrout are joined by Qualcomm’s Alex Katouzian, Group GM for Mobile, Compute, XR, Voice & Music Wearables Businesses, Qualcomm Technologies Inc., for a conversation on Qualcomm’s role in enhancing the Windows ecosystem through next-generation technologies and AI integration.
Their discussion covers:
- The transformative impact of AI across Qualcomm’s product portfolio
- Comparing the transition to AI in PCs with that of smartphones
- Overcoming architectural challenges for the X Elite platform
- The synergy between connectivity, cameras, sensors, and AI technologies
- Future paths for on-device AI Compute advancements
- Exploring the “one technology roadmap” approach
- The implications of AI and other tech on XR and upcoming Qualcomm announcements
Learn more at Qualcomm.
Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.
Or listen to the audio here:
Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.
Transcript:
Daniel Newman: Hey, everyone. The Six Five is on the road. We are here at COMPUTEX in Taipei, Taiwan. A very exciting week ahead and a lot to cover here across the technology space. We’ve got data center. We’ve got PC. We’ve got economics, and we’ve got geopolitics. We’re going to cover it all here because COMPUTEX is the show where technology happens and where we hear about the next great things.
We’re doing some exclusive coverage with Qualcomm, and we’re very excited to get kicked off today. We’ve got Alex Katouzian. He’s the group general manager for MCX, and I’m going to spell that out for you all. It’s Mobile. It’s Compute. It’s XR. And frankly, this gentleman has a lot more. But Alex has been a regular here on The Six Five. We’ve enjoyed him many times before and excited to have him back. So without further ado, Alex Katouzian, welcome back to The Six Five.
Alex Katouzian: Good morning. Thanks for having me here.
Daniel Newman: It’s great to have you. Thank you so much for joining. And wow, you look great for making this trek. It’s only, what, about two, three hours, eight different times to get here?
Alex Katouzian: Yeah, we’re used to it. We go wherever our customers are.
Ryan Shrout: That’s important. So the big topic here this week, or one of the big topics I guess, is AI PC everything. It’s on the sign behind us. I am curious, from an architectural standpoint, how do you compare what we’re doing on the AI PC with what has been happening in the smartphone world?
Alex Katouzian: Yeah, smartphones actually started having AI in them many years ago. It was just in the background; people didn’t realize it. For example, when you take a shot with the camera on your phone, whether it’s daylight, low light, or more exposure than usual, everything is adjusted by these AI algorithms. The assistant you had on the phone was AI-based. The malware detection on the phone is AI-based. Even detecting false networks is AI-based.
So there are so many different AI-based capabilities on the phone, translation in voice and in text, and everything that you saw was running on device. And we actually started putting AI capability in those phones back in 2016. Then the PC started to catch up, and the user interface is now actually becoming different with Copilot+ PCs. And these PCs are really there to understand you, versus you understanding them.
And so they will start to help with all of your productivity and all the capabilities that you want this PC to run. You’re basically talking to the PC or writing to the PC, and it starts doing a lot of things for you. And now all the AI capabilities, both on the smartphone and on the PC, are in the foreground. Everyone’s paying attention to it. And if it doesn’t have AI, then what does it have? It’s actually a push to sell more devices, because the functionality is so useful that people want that device, that capability, more and more.
Daniel Newman: We’re definitely in a world where we’re pushing to create greater productivity for each individual. And at the same time, we’re trying to drive efficiency, which has brought a lot of interest in these new architectures. And you pointed out a couple of things, Alex, that I think are worth mentioning. The first is that you said, in so many words, that AI is not new. And I know the last 18 months have brought AI into the consciousness of everyone.
By the way, we’re still at pretty early days, even with ChatGPT. Even though we hear it’s everywhere, only something like 10% of people have actually used it, but it’s proliferating really, really quickly. And Qualcomm has done this for a long time. It’s been doing AI in devices in a number of ways, whether it’s image sensing, which you mentioned, or how apps like Uber know where you might want to go and are able to talk across your different applications and use algorithms. It’s super interesting.
But you said something about architecture, about basically building this architecture. It’s really that we’ve entered an era where people want their mobile devices to work like a PC, but at the same time they want their PCs to have the efficiency of a mobile device. So you guys had some really interesting architectural hurdles to cross to get there. Talk a little bit about that and how you’ve gotten here with X Elite.
Alex Katouzian: So the expertise of Qualcomm is to have high performance at low power, and we try to fit super high performance capability in a very constrained environment like a smartphone. Every year we’re increasing the capabilities of these devices. Now, if you translate just that capability by itself into other devices that people carry, whether it’s a watch or a PC or a tablet or earbuds, and then later AR or XR type glasses and headsets, that whole concept applies.
The whole concept of having high performance at low power applies, because what do you want out of your PC? Longevity. You want it to last as long as you are doing your work, and even longer. If you’re putting your device through a demanding day of use cases, you don’t want the battery to run out in two or three hours. You want it to last for a long time. And then you want the device to be light and portable, and you want it to have high-performance capability.
So we applied that type of architecture, what we built in mobile first, to all these adjacent markets that are actually mobile and are a complement to what you do on a daily basis as you carry around your smartphone. Whether you’re at home, in your car, or going to work, you carry four or five devices, and those four or five devices all have to have the same type of capability and architecture, where you don’t want the battery to run out. You want the performance to be high. You want your productivity to be high. And that’s exactly what we applied to the X Elite and X Plus devices.
Ryan Shrout: One of the other hallmarks of Qualcomm though, in addition to low power, high performance, is this idea of connectivity, cameras, sensors. How do you pull that into a pretty diverse, complicated PC ecosystem and still augment that experience?
Alex Katouzian: Sure. So you talked about architecture a little bit ago, and as you know, the whole architecture of our Snapdragon devices is heterogeneous. That means it has multiple cores in it, and those cores rely on each other to do the work for you. Sometimes the workload is better on a CPU. Sometimes it’s better on a GPU. Sometimes it’s better on an NPU. Sometimes it’s better on a DSP.
And so all of it is applicable in the devices that we put out, solutions like the smart devices that people carry. But if you look at the trends, what is the largest amount of data that runs on the networks today? What is the largest amount of data that runs on your devices today? It’s actually video. Video takes up a huge amount of data on the air interfaces. And so you want efficiency in these types of capabilities, whether it’s the camera, video playback, or audio. You’re video conferencing every day; since the pandemic, everybody went to video conferences. It’s the norm now.
And so you want your camera to be great. You want your audio to be great. You want your background blur, and when you get up, you don’t want everyone looking at your midsection. Some of us have to be careful about that. The Zoom functions, when you’re sharing something, all of that is based on the multimedia solutions and capabilities built into these devices. So it wasn’t really that difficult for us to take that from the smartphone, because people use it all the time. The camera, video playback, streaming a video up, downloading a video, all of those are used on a smartphone, and they’re very applicable now to the PC, especially if you want to work from anywhere.
Daniel Newman: We’re definitely trying to create that ubiquity, and people want the experiences from device to device to have more similarity than ever before. And I certainly tend to believe that Qualcomm and the architecture you’re building with Snapdragon X Elite provide, for the first time, an opportunity to take the true sense of what mobile is and the true sense of what traditional PC compute is and make that more ubiquitous. A lot of the past attempts have been pretty big failures.
So I think we’re all leaning on you guys in this moment to really get this right and start driving that, where we go from device to device seamlessly, Alex. Let me ask you, with that in mind, about the roadmap. There is a progression here. As much as we analysts like to call super cycles and say, “This is going to be the biggest moment ever in selling PCs,” and by the way, there’s a reason all the CEOs, including Cristiano, are here. This is a big moment.
Alex Katouzian: That’s right.
Daniel Newman: But what do you see as the roadmap? Because we’re still early. The apps, the announcements, the things that we’re talking about using on these devices, it’s still somewhat small. It seems like it’s going to grow. So how does this roadmap progress?
Alex Katouzian: Yeah, so the goal that we had in mind when we started thinking about whether we should get into the PC market or not, there were three things we had to have in our portfolio. One is, can we come up with a disruptive enough solution to change this market? That’s number one. Number two is, can we invest in it long term? Because you don’t just show up and become super successful in an incumbent market. You have to have a ramp. And three, did we have enough partners to help us run in this market?
And all three turned out to be true. And so if you look at the architecture, what we had in mind was, can I provide the highest-performance PC capability with the longest battery life and the best AI solution embedded into it? All three of those are true, and that’s what’s disruptive about this solution. And so when you look at the user interface to the PC from now on, it’s changing. It’s very much changing. It’s not the traditional, I’m going to type something, look for a file, point and click, download.
Daniel Newman: Click the start button.
Alex Katouzian: Click the start button. Basically you’re telling the PC what to do for you. If you and I were talking, like if we met last week, I’d say, “Hey, remember that watch we were talking about?” I picked a good subject.
Ryan Shrout: Hit the topic.
Alex Katouzian: “Remember that watch we talked about? The green face,” and you would recall exactly what that was and we could start talking about it. Well, if you were looking at something on your PC and you had to go search for it and couldn’t figure out where it was, it would take half an hour to find it. Maybe you saw it on a YouTube channel. Maybe you saw it on some website someplace. But now the PC is recalling what you do. You just tell it, “Hey, do you remember that thing I saw a week ago?”
It’ll bring it up for you. Summarize my notes. Start a presentation. Start writing a paper for me based on these subjects. There’s so much there that if I save a couple of hours a week, that’s a massive user experience change. And so that’s the disruption that we were going after. And our partner Microsoft is helping us run in this market. And we are here for the long term. So disruption, long-term investment, partners that help us run, all three were true.
Ryan Shrout: I don’t think anybody will deny or doubt that the X Elite looks like a fantastic product. You guys had a great launch last month, and it goes on sale this month. As we look to the future, Qualcomm has talked about one technology roadmap and how it applies to multiple segments as you continue to evolve into a compute company. How do you think that looks in 2025 and 2026 as more of these technologies merge?
Alex Katouzian: Yeah, that’s a great question. So like I said, everything started with mobile. We developed the highest-performing technologies across CPU, GPU, NPU, DSPs, multimedia, low-power islands, the whole thing. And then we started asking, okay, are portions of these, or all of these, applicable to adjacent markets as we transition from a communications company to a computing company? And they were absolutely applicable. So if you look at auto, it started off with mobile. XR started off with mobile.
PC started off with mobile. IoT started off with mobile. And as we grow in these markets and get a foothold, you start to design particular solutions for that market, for the performance of that market, for the interfaces, for the memory of that market. And that’s exactly where we are in PC and auto and XR and industrial IoT. So over time, we make it very particular for those markets, but we start with the one technology roadmap that we developed in mobile first.
Daniel Newman: And I think that’s about creating continuity from experience to experience. You’ve had such tremendous success. We’ve covered the automotive side a number of times, and I mean, what a run to $45 billion in pipeline. But in those vehicles there’s a lot of infotainment. And when you’re going from device to device to device, having that consistency of experience is something that I think Qualcomm can really build a reputation and name around.
Alex Katouzian: We can. You said “seamless” before. We actually have a product called Seamless, and it’s a low-power Bluetooth interface between devices that makes a device contextually aware. Look at the everyday use cases. You have a phone. You probably have a Bluetooth headset. Some people have a watch. For sure you have either a tablet or a PC. If these devices start to actually understand and work together, pass first-party or third-party apps between each other, or split compute, which is even more applicable, think about a pair of glasses that you want.
Look at these Meta glasses. The app is running on the phone. So if the phone and the glasses are connected, I can split my compute onto a much more powerful Snapdragon device in a PC or in a phone and render whatever I want to see up here, with both working on AI. Imagine that.
Daniel Newman: Were you reading my mind? I think you were reading my mind. That’s not the X in XR, but we are trying to extend reality, and I was just about to get there and ask you something about that. How do we take that particular trend line and attach it to what we’re seeing? You kind of started to allude to it there. Because to me it seems like this might be the moment: more accessible compute, more contextual awareness from device to device to device. XR has had some stops and starts, and you’ve had some great wins.
I’ve heard really positive things about these Meta glasses. We’ve seen other companies launch things that had a wave of excitement, and then it doesn’t seem like… Can Qualcomm solve for mass adoption and interest in extended reality? And does this PC trend align at all?
Alex Katouzian: Yeah, absolutely. I think both are connected, and I’ll explain in a minute. But the XR trend is actually picking up quite a bit. The mixed reality headsets, where you actually have visibility of the world around you and then you’re enclosed in an environment where you could do gaming, entertainment, sports, health and fitness, social media, all of those are applicable to a headset that you actually wear with mixed reality.
But then, how does it connect to a PC? Imagine that the PC companies have a great channel into enterprises, multiple different enterprises. Many, many people at enterprises are using 3D apps and 3D software, AutoCAD and the like. If you look at what Microsoft just announced at Build, they announced what’s called volumetric APIs. What that means is that a spatial computing API is now available to every ISV and every software vendor to allow their 3D software and creative software to run on MR devices. So the minute you put the glasses on and the service is there, you’re in a virtual desktop. From that desktop, you can launch a 3D application, 3D software, and whatever you’re working on becomes a spatial object that you can now lift, open, call Teams on, and share with other people.
They can have input into it. Think about how many people could do that at work. So that’s headset related to PC. Now imagine a frontline worker, someone out in the field trying to fix something. They have a light PC with long battery life. Hopefully it’s running on X Elite. They’re carrying it with them because they’re out in the field and can’t plug in anywhere. Then they have to fix something. They put their AR glasses on, and all of a sudden, whatever they need to fix in a panel, all of the instructions show up next to them. They can call back to the office because they’re connected to the PC. They can walk through what’s wrong. They can fix everything without even having multiple people come out to look at it. And they can get that feed anytime they want.
And where is the split compute happening? Glasses to PC. What about the AutoCAD application? Glasses to PC, and you split. So having a device that is aware it can offload what you need is very unique, and we have that. It’s all tied in. And I think the amount of time people actually use these during a week is on the rise. You can see the popularity of these glasses through Meta. Google’s coming up. All these companies, the big hyperscaler internet companies, are after this. Why? Because the technology is so great and the use cases are so great. If you saw the Google I/O presentation, they showed a demo with a phone where someone asked, “What am I looking at? What does this code look like? What is this code doing? What do I do to make this circuit more efficient?”
It would answer. Their Gemini AI capabilities would respond immediately. But then she walked past a pair of glasses, went and did something else, and then asked, “Do you remember where my glasses were?” It said, “Yeah, right next to that red apple on the table.” I don’t know what the apple was about. But anyway, she went back, picked up the glasses, and switched immediately from the phone to the glasses. Now the AI could see what she sees. Where am I located? What am I doing? What do you think about this? Can you give me an opinion? All of those things were available to her immediately. And so you could see this transition from device to device. Now, over the next few months, that functionality is not just going to be cloud-based. It’s going to be device and cloud-based, in a hybrid situation where the device actually has the ability to compute and not go back to the cloud all the time. And if you can offload to the phone, even better. So we’re hot after that with these glasses, and I think all the companies we’re working with are doing the same.
Daniel Newman: Sounds like the connected intelligent edge.
Alex Katouzian: That’s right. The Connected Computing Company.
Daniel Newman: Alex, Group GM of Mobile, Compute, and XR, thank you so much for joining us here.
Alex Katouzian: Thank you. Anytime. It’s a pleasure.
Daniel Newman: And thank you everyone for tuning in to this edition of The Six Five On The Road. We’re at COMPUTEX 2024 in Taipei, Taiwan, for an exclusive series of conversations with Qualcomm. Hit that subscribe button and join us for all of the other episodes. We appreciate you being part of the community. But for now, for Ryan Shrout and myself, we’ve got to say goodbye. See you all soon.
Author Information
Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.
From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances on CNBC and Bloomberg, in the Wall Street Journal, and across hundreds of other outlets around the world.
A 7x best-selling author, his most recent book is “Human/Machine.” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.
An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
As President of Signal65, Ryan ensures the company provides valuable insight on competitive analysis, performance marketing, product positioning, and real-world experience comparisons.
With a focus on in-depth testing and nearly two decades of hands-on experience, Ryan has built a breadth of knowledge in nearly all fields of hardware, including CPUs, GPUs, AI/NPUs, SoC design, memory systems, storage, graphics, displays, and their integration into client and data center solutions and platforms.
He spent five years at Intel, serving in roles from competitive analysis to owning client technical marketing and driving product delivery in the client graphics and AI division. Prior to Intel, Ryan spent 18 years analyzing hardware and technology as the owner of PC Perspective and three years as the Principal Analyst at Shrout Research.
Ryan has worked with major technology companies and their product management teams at Intel, Qualcomm, AMD, NVIDIA, Arm, MediaTek, Dell, Lenovo, Samsung, ASUS, Meta, Microsoft, and Adobe. His work has been cited and quoted by numerous technology news outlets and is a regular contributor to MarketWatch.
Ryan holds a Bachelor of Science in Computer Science from the University of Kentucky.