On this episode of The Six Five – On The Road, hosts Daniel Newman and Patrick Moorhead welcome Lisa Su, Chair and Chief Executive Officer at AMD, for a conversation on AI’s transformation of the industry and the market, as well as the ways AMD has advanced AI adoption through its ecosystem of products.
Their discussion covers:
- The market shifts around AI
- AMD’s progress on MI300X and its general availability, and what makes MI300X different from other AI accelerators in the market
- How AMD is approaching the software ecosystem
- The role AI plays in AMD’s data center training and inference, as well as the role Ryzen AI plays in end user devices and AI PCs
Be sure to subscribe to The Six Five Webcast, so you never miss an episode.
Disclaimer: The Six Five webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.
Transcript:
Patrick Moorhead: The Six Five is on the road at AMD’s Advancing AI event, and it has been an energetic one. I mean, people were talking about this event in the run-up for quite frankly months because of this insatiable appetite for AI. And that’s in the data center and that’s on the edge and everything in between. Dan, it’s been a great day.
Daniel Newman: It has. It’s been the year really. We’re going on, what, about a year and seven or eight days since AI revolutionized our entire world. And the funny thing is there are companies that have been doing it a lot longer than that. And you and I have been covering it for a lot longer than that. But the truth is that the market always does tend to drive that demand. And it’s been a bit of a halo for the year. Markets have been tough, but for companies that are in the AI space, they’ve had a bit of a reprieve and there’s so much energy, Pat. And this was a day, not only was everyone else excited about, but I’ll be honest, I was really excited about.
Patrick Moorhead: Yeah, absolutely. And we’re pleased to announce Lisa Su here back for The Six Five. Again, Lisa, we really appreciate you coming on the show multiple times as we’ve chronicled your journey that quite frankly has been awesome.
Lisa Su: Thank you so much.
Patrick Moorhead: Thank you.
Lisa Su: Thanks, Pat. Thanks, Dan. It’s great to be here with you guys always. And thanks for spending the day with us. It’s been an exciting day.
Daniel Newman: Thanks for keeping the pace too. You came out really energetic. You flew through because I know you had a big list. You had partners, you had announcements, you had things. We’re going to talk about that in a minute. But you know Pat and I, we also do like to think about the broader market, what’s going on in the industry as a whole. You heard my somewhat monologue in the beginning about what’s going on in the market. I’d love to get your take though. I heard you say something like this is the fastest you’ve ever seen it accelerate or fastest you’ve ever seen transformation. Maybe start there and just how you’re overall seeing the market and seeing the opportunity and what’s going on with AI.
Lisa Su: Yeah, I think this is such an amazing moment. I mean, if you think about it, and you said it, it’s been about a year since ChatGPT came out. I think it just changed the way we think about technology. It’s so easy to use. The idea that you can ask your computer what you should be doing on a trip next week. I think this has really opened up people’s imaginations. And so when we look at the market today, a year ago we thought that in 2027, the market for data center AI accelerators would be $150 billion. And frankly, that seemed like a really large number. And I’ve spent the last 12 months talking extensively to our customers, our partners, the ecosystem. And what we’re seeing now is I think the market’s much, much larger. We talked about a $400 billion TAM in 2027. And even that seems humongous, but we also see that there’s so much demand out there for more AI. Yeah, it’s a pretty exciting market. I think the technology is changing quickly. The applications are getting better very, very quickly, and we’re trying to drive widespread AI adoption.
Patrick Moorhead: Lisa, you made a ton of announcements today. And I want to start with the one that I think was the most anticipated, that everybody wanted to know more about, and that was the MI300X. And one of the things that surprised me, first of all, was training and inference. We knew you were coming in with great numbers on inference performance, and you showed training as well. And I really appreciated that. It really, I think, opened up the aperture for people to see the AMD opportunity right now. And then there was the progress on the software and ROCm 6. And then the cadre of partners, and that’s the grand verifier: your customers. I mean, we always like to say there’s the analyst take, there’s the tech company take, and then there’s the customer take. And I saw a lot of support today for that. Can you give us the highlights, maybe even how it’s differentiated from what else is out there?
Lisa Su: Yeah, so it’s been a huge day for us. I’m so proud of the team. It’s really been many years in the making to come up with MI300X. It’s an amazing product, 153 billion transistors. It has all of the latest generation everything. It’s 12 chiplets in five and six nanometer, with 2.5D and 3D packaging, high bandwidth memory, all of these things. But most importantly, it runs workloads really well. We’ve talked about inference and how important inference is. But yes, we did show some training results today as well, because, look, it’s the complete package. It’s the right product at the right time for this gen AI world. Yeah, it’s been a lot of fun today.
Daniel Newman: On the other end though, hardware is the rage of the market. Who’s got the GPU, or is everything going to be an ASIC? And there’s a lot of back and forth in those conversations. Of course, we’re seeing homegrown silicon from the hyperscalers, Lisa. But one of the things that probably needs to be spoken about more, that you’re very focused on, is the software side. The market is thirsting for this competitive hardware offering, but for that to really work, it’s all about how you make the transition as seamless as possible. You guys keep speaking to higher levels of abstraction in software, and we see these more open source tools. And it seems with what you’re doing with ROCm, you’re really focused on making it easy for people to choose your hardware, but also on opening up the whole community of engineers, the bigger ecosystem, to deliver on the promise of AI. Talk about the vision around software, because I think that may be the part of the AMD story that needs to be told even more often.
Lisa Su: Yeah, no, you’re absolutely right. Great hardware has to be enabled by software. I think our thought process with ROCm is, look, we’re not just trying to create another software environment. There are proprietary software environments out there. They’re very, very good, they’re very capable. But actually, our thinking with ROCm was, yes, we want the software to be very, very capable, but we also want to innovate together with our partners on this open ecosystem. Now, what’s different today about AI and innovation is that the latest generation of software developers actually want to develop on these higher-level frameworks. And it’s not because they like AMD or anyone else, it’s because they want to operate as fast as possible. Who wants to tune to hardware? I mean, that’s hard. That takes many years of investment. Whereas if you write at the higher levels, you can actually innovate much faster. And so, that’s what we’ve been really focused on. Things like PyTorch, the PyTorch capability. And what we’ve done is we have basically optimizations on a nightly basis to the PyTorch framework, so that if you’re running on PyTorch, AMD is going to run out of the box. We’ve had a lot of opportunities to work with some of the large hyperscalers as well. Microsoft was here today, Meta, Oracle. And we’ve also learned a lot about how to optimize to make it easier for them to adopt. You’re absolutely right, software is super important. And I think today we can say for sure, with ROCm 6, we’re absolutely ready for the software environment that AI developers need.
Patrick Moorhead: Yeah. Lisa, we’ve talked macro-environment, gone into the data center, talked a little bit about the software support. And by the way, the sixth generation of ROCm, which I’ve chronicled since the beginning. And what I heard today, the inflection point for me as an analyst, was that this is a turning point for the company, because you have brought to bear a lot of competitive hardware in this space, but, based on what the partners are saying, the software as well. I mean, I heard a lot of messages from Meta, one of the biggest software developers on the planet. They talked about that being the biggest deployment… the fastest deployment that they’ve had. The work that the two of you did on PyTorch, and the performance benchmarking and improvements for ROCm 6. For me it was a turning point, which I think is pretty cool. The one thing we haven’t talked about yet, and I alluded to it a little bit in the run-up, is that your strategy is really end-to-end AI. Okay. It’s the data center, it’s the client device, and everything in between. Can you talk a little bit about what you announced today? I also saw you flash a roadmap up there about the future. And maybe talk about the multiple phases of, we’ll call it, the AI PC.
Lisa Su: Yeah, no. For sure, Pat. I think you summarized it well in the data center. For us, it’s been about: we have great hardware, let’s make sure the software is ready. That’s ROCm 6. And frankly, I also want to point out, we’ve made sure the platform is ready. It’s not just about one GPU. It’s about what we can do to get multiple GPUs, and frankly thousands of GPUs, up and running. And we’ve really been able to put that whole capability together. But as you said, as much as I love the cloud and I love the data center and enterprise, I believe AI is going to be everywhere. And the key is how we get the right technology in each form factor. Yes, we started with Microsoft, Kevin Scott at the beginning of the day, and then we had Pavan Davuluri at the end of the day talk about the Windows and client ecosystem. I’m excited about AI PCs.
I’m excited about this idea that the PC can truly become the personal productivity engine, where we all have our own data that we want to be able to leverage, but we also want to leverage all of the models in the cloud. We talked about the importance of NPUs and Ryzen AI. We’ve already shipped millions of processors with Ryzen AI today. We’re now shipping our next generation Ryzen 8040. And yes, we previewed our next, next generation, which is called Strix Point, which will have the next generation NPU. And Pat, this is really just the beginning. I mean, I view the AI PC very much as a continuum of what we can do when we get the true technology in the hands of lots of developers. We’ll be able to unlock productivity that we haven’t imagined before. Lots of stuff going on in AI. I think it’s been just a wonderful opportunity for us to put it all together for people.
Patrick Moorhead: Awesome.
Daniel Newman: As we wrap up, and there’s a lot of ground to cover, a couple of things I really loved that you talked about. Anytime you go into the economics, and Lisa, I thought you did a really nice job of talking about that, especially on the data center. Fewer servers, fewer GPUs, better economics, because this is very expensive. And so getting there is very expensive. And then obviously the killer workload is inference. And so consuming it, you want to make it as… You want to democratize it as much as possible. And then you do it from the cloud all the way to the edge. But the interesting thing is, we started talking about how fast it’s gone, and I think it’s going to get faster. And I would just love to get your take on that, because what we saw in the last 12 months, are we going to see turns like that in six months? Have you thought about it? Because that puts an immense amount of pressure on you, but also a huge opportunity in front of you. How do you keep up the pace of innovation with the demand that AI has created?
Lisa Su: Well, I actually view it as just a huge opportunity. And what we see is the opportunity multiplies when you bring partners together. And that’s, hopefully, what we got across. As much as we love our technology and the product capabilities that we have, we actually believe the way you accelerate the pace of innovation is that you co-innovate. That we work really closely with our largest customers, our most important ecosystem partners, so that we can bring that innovation together. That’s what we’re doing. In the past, people did things serially. With AI there’s none of that. I mean, this is the fastest moving technology that I’ve ever seen. I actually do agree with you that it is going to accelerate in pace. And I think that actually plays well for those who have a complete portfolio. That’s the piece that I love about it: I can see the synergy between what’s important in the data center and what’s important in the client and edge. And then we see all the embedded capabilities as well.
Patrick Moorhead: Yeah. Oh, go ahead.
Daniel Newman: I was going to say, there’s a tech joke in there, Pat, but about parallel computing. Like you’re saying how it’s serial versus parallel.
Lisa Su: Oh, you got that. I didn’t get that one. But yes. Yes, I got that. Yes.
Patrick Moorhead: It’s good. I mean, it’s technology improving the next generation of technology. Lisa, it’s been a big day. I mean, we’ve covered the gamut. There’s never enough time, but I’d like to ask if you have any parting thoughts as we start to wrap up this interview and this day?
Lisa Su: Well, I would say that there’s one key theme, and that is AI is the most important technology really of the last 50 years and certainly the next many years. AI is our number one priority at AMD. I mean we’ve made a ton of progress. We love the partnerships that we have. We love the fact that the ecosystem is really coming together. And we intend to accelerate the rate and pace of AI in the industry.
Daniel Newman: Well, Lisa, thanks so much for joining us here on The Six Five.
Lisa Su: Super. Thank you.
Daniel Newman: All right, everyone. We’re here at the Advancing AI event for AMD in beautiful San Jose, California. It’s been a big day for AI. It’s been a big day for the industry. And we hope that you have learned a lot here with Patrick and myself. Hit that subscribe button, join us for all the episodes here. But we got to say goodbye.
Author Information
Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.
From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.
A 7x best-selling author whose most recent book is “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.
An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.