Bringing AI to Life and Microsoft’s Vision for the AI Future – Six Five on the Road

On this episode of The Six Five – On The Road, hosts Daniel Newman and Patrick Moorhead welcome Microsoft’s Steven Bathiche, Technical Fellow, Windows & Devices, for a conversation on bringing AI to life at Microsoft’s AI event in NYC.

Their discussion covers:

  • A look into how Steven’s team brought AI to life for this special event
  • Neural Processing Units (NPUs) and why developers should invest in and be excited about cloud-to-client capabilities
  • How this complex world of AI technology is making work and play easier for end users
  • How far AI technology has advanced to date and what progress might be achieved three years from now

Be sure to subscribe to The Six Five Webcast, so you never miss an episode.

Watch the video here:

Or listen to the audio here:


Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: We are live here in New York City at Microsoft’s AI event. It has been an incredible event. We have seen amazing new AI technologies, we’ve seen new Surface PCs. The action is everywhere, and hopefully you can feel that around us. I saw some incredible demos, some incredible stuff, GA on stuff. Daniel, how are you doing, my friend?

Daniel Newman: Yeah, the energy’s been palpable. You could feel it here. I think everybody was kind of, “Is it going to be AI?”

Patrick Moorhead: Right.

Daniel Newman: I think everybody kind of knew that was to be expected, but it was interesting because no one really knew what was coming.

Patrick Moorhead: Exactly.

Daniel Newman: And so it was that perfect blend of suspense and meeting expectations. And Pat, I have to say, we’ve been to events that have kind of fallen flat. We’ve been to events that kind of met the bar. And I would say in many ways, just by the reaction as analysts, and looking around at the reaction talking to press, media, and some of the others that are here, it felt like Microsoft leaped over the bar on this one.

Patrick Moorhead: Yeah, I think they did. They came out with that early lead, and I always say, “Hey, this is a marathon, not a sprint.” And I got to tell you, Microsoft’s on fire with AI, not just for consumers, but also for folks that do work. But hey, none of these amazing magic tricks with software happen without deep architects, super smart people architecting the future, not just for today, but an architecture that works well into the future. And with that, Stevie, welcome to the show. First time on the Six Five. Thank you for coming on.

Stevie Bathiche: My pleasure. Thank you for having me. Appreciate it.

Patrick Moorhead: Yeah, absolutely.

Stevie Bathiche: Yeah.

Daniel Newman: Stevie, we’ve seen you demoing Windows Studio Effects on stage in the past. You’ve got a big job of influencing how AI and the user experiences are all threaded together. But here you are at the event, both as a contributor and a participant on the show. Give us a little bit of the rundown on what got you really fired up about today’s slate of announcements.

Stevie Bathiche: Oh, absolutely. So I’ve been at Microsoft for almost a quarter of a century, 25 years. I’m a technical fellow.

Patrick Moorhead: New guy.

Stevie Bathiche: New guy. Yeah, I just started. I mean, I kinda look it.

Patrick Moorhead: Is that Windows 3.11?

Stevie Bathiche: Almost. No, not quite.

Patrick Moorhead: Okay. All right.

Stevie Bathiche: That’d be kind of cool though. I’m a technical fellow who works across both Windows and devices, and that’s really important because the things that we’re working on really require optimization across hardware and software to deliver the best experience that you see today. We’re really excited about the hardware capability that you saw, that we’re delivering, as well as the new Copilot experiences that are changing the computing industry today.

Patrick Moorhead: Yeah, so Microsoft is in a very unique position, and not only does it do consumer, it also does business, but it also does devices too with Surface as we’ve seen for a very long time. And that also gives you the opportunity to architect the AI in the most efficient and powerful way. And maybe that’s in the cloud, maybe that’s in the device. And in fact, you showed today a device that had a very high performance NPU that was actually doing some of the AI magic tricks native on the device, but working with the cloud. So what’s the value proposition to developers? Why should they be excited about this moving forward?

Stevie Bathiche: I mean, I think the fundamental tenet in computing, period, is that I want to have the ability to run really heavy workloads, but I also need to do it in an efficient manner. And that’s really important. And the phrase that we have is TOPS per watt: tera-ops, trillions of operations per second, per watt. And that’s a really important metric. Those are the things that we’re driving, and the things that you’re seeing here today. Even if you stretch back to what we’re doing with the Surface Pro 9 with 5G, and even further back, we’re changing the silicon landscape for Windows and for the industry. And it’s exciting because we’re giving developers and also customers this compute capability in a small package, but it can do a lot.
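As a rough illustration of the metric Stevie describes here, efficiency is simply sustained throughput divided by power draw. This is a minimal sketch with invented numbers, not Microsoft specifications:

```python
# Illustrative sketch of the TOPS-per-watt efficiency metric.
# The figures below are hypothetical examples, not published specs.

def tops_per_watt(tera_ops_per_second: float, watts: float) -> float:
    """Trillions of operations per second delivered per watt of power."""
    return tera_ops_per_second / watts

# A hypothetical NPU sustaining 45 TOPS at a 5 W power draw:
print(tops_per_watt(45.0, 5.0))  # 9.0 TOPS per watt
```

The point of the metric is that raw TOPS alone is misleading for battery-powered devices; dividing by watts captures how much AI work a device can do within its thermal and power budget.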

Daniel Newman: It’s a lot of fun to watch the evolution of these devices. Pat, we were time traveling this week, and we were at the Intel Innovation event, and Pat Gelsinger, the CEO, talked about the IPC and the NPU that you’ll be using in the new Studio. And as I was following the story along, you saw the whole story come to life this week, from him talking about it at the silicon layer to the NP… It’s very exciting. There does seem to be a new era, Stevie, that is going to be the AI PC. So all this together starts to create this new trajectory that brings together work and our personal lives. And to me, it seems like the blur is only going to get bigger, and we’re going to have this sort of ubiquity of our existence, and AI is going to drive a lot of this. No?

Stevie Bathiche: You know what, your comment about the AI PC is super important. I do want to comment on that.

Daniel Newman: Yeah, please.

Stevie Bathiche: It’s not really a specific thing. It’s a system, and I think Satya alluded to it today here. It’s a combination of compute capability in the cloud along with what you have in hand. And that orchestration of compute, what’s happening in the cloud, what needs to happen in the cloud, and what happens essentially on your device for privacy, security, and efficiency, is what we’re building and what we’re moving towards at Microsoft. And we call this hybrid AI, a hybrid system. And this is what’s going to enable the next generation of Copilot experiences. This is what will allow the Copilot to understand more contextually what you’re trying to do and essentially help you do it.

Patrick Moorhead: In the end, the user really doesn’t care where it’s happening. They just want the best experience and for some experiences lower latency. And if they don’t want to share certain types of information, they can do it on the system.

Stevie Bathiche: Privacy.

Patrick Moorhead: And there’s another set of experiences-

Stevie Bathiche: Cost.

Patrick Moorhead: … that if they have a decent connection, maybe they’re willing to wait a little. And I think this hybrid type of architecture that brings together the cloud, the client, and everything in between is just a smart way to go. Look at how even a lot of smartphone applications were designed to be able to take advantage of the device. Otherwise, I think I heard a theory 30 years ago that we were just going to be streaming the entire time. Well, that didn’t happen. So a ton of work has been done, and again, not just in the last six months, but the previous 10 years in research. But we’re coming to the point where you’re delivering tangible benefits today, and that’s exciting. My question for you is, what do the next three to five years look like? Put the crystal ball out there and let’s hear from Stevie. What does it look like?
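The hybrid cloud/client trade-offs the two of them just walked through, privacy keeps work on the device, cost and latency decide the rest, can be sketched as a toy routing policy. The names and rules below are invented for illustration; this is not any real Windows or Azure orchestration API:

```python
# Hypothetical sketch of hybrid AI dispatch: route a workload to the local
# NPU or the cloud based on privacy, latency, and connectivity. The policy
# and types are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    privacy_sensitive: bool   # must the data stay on the device?
    latency_budget_ms: int    # how long the user is willing to wait
    estimated_cloud_ms: int   # round-trip plus inference time in the cloud

def route(task: Workload, connected: bool) -> str:
    """Pick an execution target for one AI task."""
    if task.privacy_sensitive:
        return "device-npu"   # data never leaves the machine
    if not connected:
        return "device-npu"   # no choice when offline
    if task.estimated_cloud_ms <= task.latency_budget_ms:
        return "cloud"        # bigger models, acceptable wait
    return "device-npu"       # cloud round-trip too slow, keep it local

print(route(Workload("background blur", True, 16, 120), connected=True))     # device-npu
print(route(Workload("document summary", False, 2000, 600), connected=True)) # cloud
```

The design point the conversation makes is exactly this kind of per-task decision: real-time camera effects stay on the NPU for latency and privacy, while heavier, less time-sensitive generation can run in the cloud.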

Daniel Newman: Let’s use a little Bing image creator and you can paint a picture. You use words and then we’ll Picasso this thing.

Stevie Bathiche: It’s a great question. For a lot of us who’ve been in the computing industry for so long, this is really the second wind that we’re entering. It is as significant as anything previously up to this point. And Satya talked about Doug Engelbart and the work that happened, the paradigm that was established over 50 years ago with the mouse: point and click, very specific things. If you think about it, that’s the same paradigm we’ve been in for the past 50 years. And it’s been very explicit, very down to the pixel, very exact. But this new era that we’re entering, whether you see it enabled by hybrid AI and the new silicon that we’re creating both in Azure and on your device, along with an operating system and a system of experiences.

Patrick Moorhead: System of systems.

Stevie Bathiche: Yeah, a system of systems that’s really going to, essentially, change everything. It’s going to change the application model. It’s going to change how people use their applications. And I have a little bit of a frame that might help people think about it. I call it: where is the AI relative to your app? Is the AI beside your application? Is it inside, or is it outside going across? And you can see that happening even today. You see a lot of our Copilot experiences essentially being complementary, sitting beside the existing application, helping you along on the side, going back and forth and doing things. But then you see applications evolving, like a clip [inaudible 00:09:22] and Designer, where the AI is basically going in and infusing the application from the inside out, changing the interaction model completely.

Now you don’t have to do these complicated filters and go to school for four years to figure out how to do that. You’re a professional automatically because the AI’s inside the app. And then the final thing, what I’m most excited about, and per your question where I see things going, is where the AI is orchestrating not just the individual task that you’re working on, but your jobs: what you need to do across multiple things, across multiple devices, across multiple applications. And the best orchestrator of all is Windows itself, isn’t it?

Patrick Moorhead: Well, for billions of people, it is.

Stevie Bathiche: For billions of people it is. Right? The thing that is sitting there that allows a vessel for work to be done across many, many applications, both the amazing first party apps that we’re developing along with third party apps. That’s the future.

Patrick Moorhead: Well, I do love, in Windows, how you have enabled it to bring that data and usability in from the smartphone. You take a picture on your smartphone, and in about 15 seconds it magically shows up inside of Windows Photos. Or I can even use the Phone Link app, which allows me to send texts and images back and forth. And on certain phones, I can even run applications that are on my smartphone, on my Surface or any Windows PC. Only now do I see how important and strategic it was to do that. Because quite frankly, and Dan, you said this before, people fundamentally don’t want 17 different experiences across multiple devices. Now the reality is, just like we’ve seen in the enterprise, where there is no single pane of glass, there are going to be primary copilots and specialty copilots that do that.

Daniel Newman: The skills. And I think ultimately, with the intelligence Stevie’s talking about, we can get to the point where we interact in natural language with maybe as few as a single digital assistant that will eventually be able to talk to all the systems. And I do think that’s where it ends up. I think there’s a lot of schema, organization, and experience work someone like him is going to be thinking about to get us there. But I mean, why would we not want to seek ubiquity? Why would I not want to use… And Satya really did a good job in the keynote today of pointing this out, the fundamentals… And you did this too, the fundamental change is not…

You went from a mouse for 50 years to literally interacting with these things like we would with each other. So we basically have the empathy and natural human interactions that we crave, and then we allow the machine to do what it does best through that interface, as opposed to the point-and-click interface, which we don’t naturally do well. We’ve been trained, like Homo erectus, we’ve stood up over 50 years to learn how to do that, but now we’re back to doing it the way we do it. We talk to it and it does what it does.

Stevie Bathiche: We go from explicit to implicit.

Daniel Newman: I love it.

Stevie Bathiche: From being direct exact to being fuzzy but intuitive.

Daniel Newman: But that’s how we are. That’s how humans are.

Stevie Bathiche: That’s how humans are.

Daniel Newman: And that’s why there’s so much brilliance here.

Patrick Moorhead: Yeah, I mean, I’ve been in this industry way too long. Windows 3.11, I think 1990, and this is the first time, I think, along that continuum where the vision that was set up in the early ’90s and the late ’80s is actually what we’re going to deliver. I have line of sight as to where these shoes were and why they’re there. And I’m super excited about enabling so many folks. I mean, even folks that have a hard time using devices, more people can get involved, more inclusivity. I’m super excited about that type of future.

Stevie Bathiche: So right. I mean, I think the big opportunity for software and these copilots, to digress a little bit: we’re always on the hunt for the next computing form factor. What is the next thing? And I think what we understand at Microsoft is that the next computing form factor isn’t a specific thing, it’s a system. Software will stitch together your experiences across anything that you want, anything that you might own, across all your devices, whether it’s on your lap, in your pocket, or on your wrist. And that’s the big opportunity for software in front of us: connecting those experiences, connecting that data, connecting what you need to do. And you have an agent essentially alongside you to help you complete your work. That’s the vision.

Patrick Moorhead: Love it.

Daniel Newman: And that is the future. Stevie, we could talk all day. This is a lot of fun.

Patrick Moorhead: I know.

Daniel Newman: You give a couple of analysts someone who actually theorizes this at another level, and this could end up being a two-day-long conversation.

Stevie Bathiche: It was fun.

Daniel Newman: But it was a lot of fun. Thanks for joining us. Let’s have you back sometime soon.

Stevie Bathiche: Appreciate it. Thank you.

Patrick Moorhead: Love that.

Daniel Newman: All right, everybody, hit that subscribe button. Join us for all of The Six Five episodes here at the Microsoft AI event in New York City. Pat, it’s been a lot of fun having these conversations, but we’ve got to go.

Patrick Moorhead: So much fun, thanks.

Daniel Newman: See you later.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A seven-time best-selling author, his most recent book is “Human/Machine.” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
