AMD in the AI Era: CEO Dr. Lisa Su on Product Innovation, Leadership, and the Future – Six Five Media at AMD Advancing AI

Six Five Media is bringing you the latest from AMD’s top executives at their second annual Advancing AI event.

AMD’s CEO Dr. Lisa Su sat down with our hosts, Daniel Newman and Patrick Moorhead, for a conversation on AMD’s advancements in AI and the technology industry’s future. Dr. Su shares valuable insights from AMD, highlighting its strides in AI and reflecting on her decade of leadership at the company.

Their discussion covers:

  • The distinctive features of the 5th Gen AMD EPYC CPU codenamed “Turin,” and its impact on the data center CPU market
  • The significance of the new AMD Instinct MI325X GPU for generative AI model optimization
  • Updates on ROCm, AMD’s open software approach, and what it means for customers and developers
  • AMD’s progress in comprehensive AI infrastructure, including AI PCs and edge AI portfolio
  • Dr. Su’s reflection on her 10 years as CEO of AMD, the company’s remarkable growth, and her outlook for the future

Learn more at AMD and AMD Advancing AI.

Watch the video below, and be sure to catch our full coverage at https://sixfivemedia.com/amd-advancing-ai/

Or listen to the audio here:

Disclaimer: Six Five On the Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On the Road here in San Francisco at the second annual AMD Advancing AI event. It has been an incredible event. We’ve seen AI everywhere from the hyperscaler data center, with CPUs and GPUs, all the way to AI PCs. Dan, we love AI here on The Six Five, right?

Daniel Newman: Well, it’s been the topic of the year Pat, and I think that’s for good reason. We’re seeing it grow. In fact, today, I know we have a special guest joining us for this one, but we saw a new TAM number. You and I love sizing markets.

Patrick Moorhead: That’s right.

Daniel Newman: There’s so much speculation out there about just how big this opportunity is, and if the infrastructure investments that are going on, and that are going to go on over the next five years, have anything to say about it, I don’t know if people even appreciate how big it actually is just yet.

Patrick Moorhead: That’s right. And I can’t imagine a better guest, by the way. She doesn’t need an introduction, AMD CEO Lisa Su. Lisa, welcome back to The Six Five. Thanks for coming on.

Dr. Lisa Su: Absolutely. Thank you, Pat. It’s great to be here with you.

Patrick Moorhead: Yes.

Daniel Newman: It is good to have you. We have a lot of ground to cover because you made announcements across the entire portfolio. Let’s start with EPYC. I know we’re going to end up there, people are going to want to talk about AI chips, but you can’t do this without some serious compute, and you’ve been winning in the cloud. I mean, the numbers have been pretty remarkable, and you really did march out pretty much a star-studded cast. Fifth generation EPYC, talk a little bit about what’s going on. Why has this been so successful in the cloud, and what else did you announce around EPYC today?

Dr. Lisa Su: Yeah, absolutely. Well, first of all, thank you again for having me. Super excited about fifth gen EPYC. The thing that we really have found about the data center is it’s not just about the product that you launch today. It’s really about all the work that we’ve done over the last seven, eight years to really build the trust, the capability, the partnership with the largest cloud hyperscalers. So, very excited today to launch fifth gen EPYC. It’s a super product, very strong performance, 17% average IPC improvement. When you look across it, it’s probably our most comprehensive product portfolio. We really have optimized for both cloud-native environments, which want socket-level performance, and enterprise environments that really want the highest per-core performance, and we’ve seen fantastic results. So, it was really exciting to have Google with us, Meta with us, Oracle, a number of folks, and all of our OEM partners were also talking about what they’re doing with EPYC. Turin is on its way to being just a fantastic product for us.

Patrick Moorhead: Every time I tell people about AMD’s hyperscaler CPU share, I get a double take. I’m like, no, it’s 50 to 60%, and with some it’s as high as 80%. And I’m glad you started talking about this. Even though some of the secondary information sources have been talking about that, I think they way undercounted it because they don’t know what’s going on in places like SaaS and elements like that. Super impressive. Now, EPYC’s had quite the ride, but actually the fastest growing product in your history is MI300, data center GPUs. It has been amazing. I remember the number in December, and you keep revising it up. It’s been pretty awesome. You brought out, announced a new one, MI325X. Can you talk about what that means for the future of generative AI models?

Dr. Lisa Su: Yeah, absolutely, Pat. So, look, lots of interest around the data center accelerator market. We look at it as by far the single largest growth opportunity for the industry-

Patrick Moorhead: Right.

Dr. Lisa Su: … as well as for AMD. We talked about a new TAM number. Last year we thought the TAM out through 2027 would be about 400 billion or so, which seemed really large-

Patrick Moorhead: Big.

Dr. Lisa Su: … at the time. And frankly, what’s happened in the last 12 months is people have moved closer to our number. Now, as we extend it out through 2028, we think it’s a $500 billion TAM. We’re super excited about the technology that we’re bringing to market. We talk a lot about how, from a hardware standpoint, we’ve always had fantastic hardware, but we’ve spent a tremendous amount of time really optimizing the ROCm software stack, and with that, we’re just seeing great inferencing and training performance. So, MI325 is our newest product. It will start in production later this quarter. And again, it’s industry leading in terms of memory capacity and memory bandwidth, and it really allows us to take the next big step for genAI workloads. So, you would expect larger models, you would expect faster inferencing capability. You would expect to be able to really train in a very competitive environment. It’s really just the next step, but we have the MI350 series, which is-

Patrick Moorhead: That’s right.

Dr. Lisa Su: … also coming later in 2025. Then we have the MI400 series. This market is moving faster than anything that I’ve seen before, and it’s because everybody wants AI to be more fully deployed throughout their enterprises and businesses.

Patrick Moorhead: Absolutely.

Daniel Newman: You also mentioned a pretty interesting data point that I know I shared and people were eating this up. But 405B, this massive Meta open source model, exclusively running on AMD. Yes? Was that the right announcement that-

Dr. Lisa Su: It absolutely was. So, first of all, we love the work with Meta. Kevin and his team are fantastic. He said two things that hopefully people paid attention to. One was that he’s ramped one and a half million EPYC CPUs, which is truly data center at scale. So very, very proud of that. Then what happened with MI300, this is the way things play out with our largest customers. We’ve been working with them on GPUs for quite some time, and we’ve really learned how to really co-develop and co-optimize. We spent a tremendous amount of time on the software and the overall infrastructure environment, so when the new Llama 3.1 model came out at 405 billion parameters, it’s a sweet spot for us because we-

Daniel Newman: Sure.

Dr. Lisa Su: … have the largest memory capacity. And so, for Kevin to say that it’s being serviced from an inference standpoint exclusively on MI300, that is absolutely our honor.
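
For readers wondering why a 405-billion-parameter model is a memory-capacity “sweet spot,” a rough back-of-envelope sketch helps. This is our own illustrative arithmetic using publicly stated figures, not numbers from this conversation: the FP16 weights alone come to roughly 810 GB, which fits within the HBM of a single eight-GPU MI300X node.

```python
# Illustrative, back-of-envelope estimate only; approximate public figures, not AMD data.

params = 405e9           # Llama 3.1 405B parameter count
bytes_per_param = 2      # FP16/BF16 weights

weight_gb = params * bytes_per_param / 1e9
print(f"Weights alone: ~{weight_gb:.0f} GB")      # ~810 GB

hbm_per_gpu_gb = 192     # MI300X HBM3 capacity (publicly stated)
gpus_per_node = 8
node_hbm_gb = hbm_per_gpu_gb * gpus_per_node
print(f"HBM per 8-GPU node: {node_hbm_gb} GB")    # 1536 GB

# With ~1.5 TB of HBM in one node, the FP16 weights fit on a single node,
# leaving headroom for activations and the KV cache during inference.
```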

Patrick Moorhead: I took a picture of that for sure, and I got the 1.5 million number out, and our audiences, again, just on social, are eating this thing up. It’s a huge shock to them, right? It’s clear to me you have a close relationship with Meta; literally, they give you feedback and you respond and you co-develop. That was obvious to me from the stage conversation.

Dr. Lisa Su: That is really what we’ve learned: nobody has all the good ideas, right? Nobody knows everything to look around the corner on. I think we are exceptionally focused on ensuring that we continue to push the envelope on the roadmap, but we do that with our partners telling us what they’re watching out for. So, we’re doing that on our CPU roadmap. We’re certainly doing that on our GPU roadmap, and it will ensure that we have better products as well as a better understanding of what’s the next tech that we need to bring out.

Patrick Moorhead: Yeah.

Daniel Newman: I think as the market’s trying to digest this, though, these kinds of wins are really important, because there’s sometimes this narrative that something that big can only be done on one certain architecture, one technology. I know we saw that when Google talked about how it trained Gemini: just getting out there that there are different approaches, that there are different companies that can play and take care of these difficult technical needs that these companies have. Another big part of this win, though, or to win, is going to be developers, Lisa. I see it, I sense it, I hear it from the stage. I think you’ve made a lot of progress with ROCm, but that’s what everyone’s waiting on next. Okay, this hardware’s really competitive. It’s got this great memory, it’s got all this training capacity and inference capacity. Can they win the developers? Talk about ROCm and how that’s progressing.

Dr. Lisa Su: I have to say, if there’s one thing I’m most proud of over the last year, it’s the work that we’ve done on the ROCm software stack and really courting developers to want to optimize for AMD. Now, what does that mean? We’ve put a tremendous amount of resources into it, so let me be clear, the heavy lifting is on our part-

Patrick Moorhead: Yes.

Dr. Lisa Su: … to make sure that the libraries are there, that we do all the tuning, that frankly we want to do 95% of the work. But the fact is, this is a mission, or this is one of those objectives, that everybody can get their head around. Nobody wants to be locked into a proprietary ecosystem; everybody wants choice. And so the fact is, with, let’s call it, very little modification, you can design your software, you can build your software and build your models and build your applications such that they are quite hardware agnostic. So, when you think about things like PyTorch, that’s exactly what we’re trying to do. TensorFlow, ONNX, OpenAI’s Triton, all of these guys are really trying to make it easier for you to write at a higher level of abstraction. And we’ve shown them that you can do it on MI300 and our MI3XX roadmap with very little perturbation to what they would normally do.

So, super happy about ROCm. We now run more than 1 million models out of the box, so that’s an incredibly large number. We’ve also seen substantial performance improvements on the newest versions of ROCm, and we have a whole bunch of developers here today who are talking about what the next important things are for software innovation. So yeah, this is a huge focus for us and we’re making great progress.
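
To make the “very little modification” point concrete, here is a minimal, hypothetical PyTorch sketch (not AMD sample code): ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda device interface, so typical model code written at this level of abstraction runs unchanged on NVIDIA GPUs, AMD Instinct GPUs, or the CPU.

```python
import torch
import torch.nn as nn

# Hardware-agnostic device selection: on ROCm builds of PyTorch, AMD GPUs are
# reported through torch.cuda, so the same line targets CUDA or ROCm devices.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A small toy model; nothing here is specific to any one vendor's GPU.
model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.GELU(),
    nn.Linear(4096, 1024),
).to(device)

x = torch.randn(8, 1024, device=device)

with torch.no_grad():
    y = model(x)

print(y.shape, "ran on", device)  # identical source on CUDA, ROCm, or CPU builds
```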

Patrick Moorhead: Yeah, I knew ROCm had changed last year. Again, I know we talk a lot about Meta, but Meta literally got up on stage and had nice things to say, and I think they’re probably the hardest graders of this, given that they essentially created PyTorch and they know this very well, and I knew it had turned a corner. I mean, ROCm has always been really good at HPC. When it comes to AI, it’s just different.

Dr. Lisa Su: Yeah.

Patrick Moorhead: So, I knew there was something different there. With all of the investment that’s required in software, hardware’s hard enough, but then adding software on top of that, again, it’s great to see the success there. So, a lot of the keynote today really focused on the data center, but you’re doing more than data center with AI. You’re doing AI PCs, you’re doing AI on the edge. Can you talk a little bit about your progress there?

Dr. Lisa Su: Yeah, absolutely. So, the way I think about AI is really as an end-to-end story. The data center, the infrastructure, gets a lot of attention just given the sheer amount of investment that’s going in there. But frankly, we’re all going to experience AI on devices and in the edge, and so that part of our roadmap is also really important. We talked about enterprise-class Copilot Plus PCs today.

Patrick Moorhead: Yes.

Dr. Lisa Su: AI PCs continue to be a category that I’m very excited about. I know you guys write about it quite often. I think it’s making fast progress. It’s still early in the AI PC cycle, but I think some of the things that Pavan showed on stage today in terms of the new capabilities with Copilot Plus are going to be very interesting and really resonate with enterprise customers. We have a number of new platforms coming out with Strix. We have newer products that are coming out as we go over the next couple of months, and at the end of the day, we have one goal in life, which is to bring, let’s call it, leading-edge AI PC capability, but we also want to do everything else that PCs normally do. So, it’s-

Patrick Moorhead: Sure.

Dr. Lisa Su: … really an and function with all of the things that you do today plus AI capability. And yeah, I think this is going to be an exciting part of the roadmap.

Patrick Moorhead: Excellent.

Daniel Newman: Yeah, I continue to look at this cycle for AI PCs, and even for these devices as a whole, as just going to be more elongated. I know everybody wants this to be the immediate turn of the page and boom. I think people are going to see now that it’s so software-driven, they’re going to see a new feature, a new capability,-

Dr. Lisa Su: Yes.

Daniel Newman: … and then they’re going to say, “Now I got to go get this because I couldn’t do this with the device I have now.” And that’s different because in the past it’s always been like, “Oh, it’s got battery,” it’s been a thing. Now, it’s every day new software, new ISVs can create something that can create a new surge-

Dr. Lisa Su: That’s right.

Daniel Newman: … of demand and that’s really exciting.

Dr. Lisa Su: Yeah, and I think the other thing to keep in mind is people are going to want to future-proof their technology,-

Patrick Moorhead: That’s right.

Dr. Lisa Su: … right? So, when I talk to CIOs these days, they’re not exactly sure what they’re going to need, but certainly, if they’re going to make an investment now they want to invest in the capability that can scale going forward.

Patrick Moorhead: Yeah, they have bought in fundamentally to let’s do it more privately, let’s do it more securely, and in some cases save money, because they can run the inference on the device itself. It was interesting to hear Pavan talk about the hybrid future. Microsoft, the company with a very large data center estate, is also espousing AI on the edge. I think that, categorically, it is going to happen, and the only question is when. I’m seeing everybody line up behind that for a lot of different reasons. So, I’m very optimistic about the rollout.

Daniel Newman: All the while maybe getting 18 to 20% more productivity. I’m seeing numbers like that-

Patrick Moorhead: I know, I know.

Dr. Lisa Su: Yes.

Daniel Newman: … out of employees, which is remarkable. You had a bit of news you shared from the stage; I think both Pat and I shared it socially. You’ve had such a tremendous run, now 10 years. You took the company from a market cap of around 2 billion to now somewhere around 265 billion. That’s roughly a 130x return. Pretty remarkable. Give us a quick recap on the 10 years, and what are you most excited about going forward?

Dr. Lisa Su: Well, first of all, it’s really one of those things where you look back and you say, “Man, it’s been such an amazing 10 years.” I mean, I’ve loved every minute of it. I think, what I love most about AMD is that we are in a place where the technology that we’re building, the customers that we’re partnering with, this is technology that matters.

Patrick Moorhead: Yeah.

Dr. Lisa Su: This is technology that changes the world. That’s what I’ve always wanted to do. That’s the engineer in me. But what’s really most exciting is we are actually at a place where tech is becoming even more exciting. Computing is becoming even more important. Who would’ve imagined even 10 years ago that every country needs to have its own computing capability? Every government is talking about it, every enterprise. All of those opportunities are opportunities for us to take high-performance computing and AI to many, many more places. So yeah, it’s a fun week. I would say the most fun times are always when we’re launching products, so to be able to launch products this week is pretty special.

Patrick Moorhead: Lisa, if I can share my thoughts when you got the CEO role, I think the first thing that you said, or at least the first thing that I heard was, “We’re going to bring out great products.” And I don’t know why this was such an amazing revelation because of course it’s great products, but I felt like AMD maybe had some other focuses. When you said that it was so meaningful to me, and I think I remember thinking, “AMD is so back.” And by the way, nobody thought you were back. And then I remember the Zen disclosure over at the St. Regis where at that point I literally said, “Yeah, AMD is absolutely back. Watch this company.” People still thought I was crazy, but I appreciate you making a lot of people look very smart. And then for the naysayers, well, we don’t talk about them, but anyways, that really made an impact on me.

Dr. Lisa Su: Well, thank you, Pat. I actually remember that day pretty well because it was our first reveal of Zen One, and you came up to me and asked me, “Is it really that good?” And I said, “Yeah, it is.”

Patrick Moorhead: I did. I asked you three questions and I’m like,-

Dr. Lisa Su: So, you actually did ask me that.

Patrick Moorhead: … “This is it, AMD is back. AMD is back. This is going to be amazing.”

Daniel Newman: It’s funny, they never see it though-

Patrick Moorhead: They don’t.

Daniel Newman: … until all the, it takes some risk to call it early.

Dr. Lisa Su: It takes a while. I mean, look, at the end of the day, nothing in our industry moves overnight. We always believed on the CPU side, it would take three generations to really have people believe that, “Hey, this is really leadership in the industry,” and I’m super happy with what we’ve done there. But frankly, when you look at this AI opportunity, it’s a replay of this notion of you take great technology and then you have to make sure that people know how to use it and take advantage of it and then really count on it.

Patrick Moorhead: Right, yeah. It’s been great to be part of this. So, I want to dial out a little. This is probably going to be the last question or I could sit up here for hours, but I know you have things to do. Busy person, companies to run. How do you view AMD’s role in the AI era? It’s clear what you’ve done, but maybe let’s talk a little bit about the future here. When people look back in five years or so, where do you want to go?

Dr. Lisa Su: Well, there are a couple of things, maybe if I can talk about where I see AI going, and then our role in AI. First of all, I really believe AI is end-to-end. So, we are all going to consume AI in our lives in different ways and we need to have AI across all of the technology that we touch. So that’s a foundational element for it. The other piece of it is, yes, you need great technology, but I also think it’s super important that it be done in an open ecosystem. I’m a big believer in this idea that one plus one is going to be much, much greater than three.

Patrick Moorhead: Right.

Dr. Lisa Su: And if you put all of the smart people in the industry together, if you put all of the capability of the industry ecosystem, what you get is the best answer. And so, I think, part of our role is to provide an open AI ecosystem that allows everybody in the industry to innovate on top of that. We’re pushing the bleeding edge of tech, I mean, hardware, software, systems. We’re doing all of those things. You expect us to do all those things.

Patrick Moorhead: Sure.

Dr. Lisa Su: But I think this is an opportunity for AMD to really lead the industry, hold the flag for an open ecosystem that will be able to have the most innovation out there. At the end of the day, we want technology to be used for good. We want to be able to accelerate that, and that’s our mission.

Patrick Moorhead: Satisfying for sure. And bigger than, I mean, running a company is big, but, I mean, doing something to benefit society is a whole different ball game. It’s exciting.

Dr. Lisa Su: It’s pretty cool.

Patrick Moorhead: Yeah.

Daniel Newman: Lisa, I want to thank you so much for joining us. We know you’ve got a busy few hours ahead to finish out this Advancing AI event. It’s always great to have you on the show. Look forward to having you back. Let’s do it again soon.

Dr. Lisa Su: Wonderful. Thank you so much for having me.

Patrick Moorhead: Thanks.

Daniel Newman: And thanks, everyone, for tuning into The Six Five. We’re On the Road here at AMD’s Advancing AI Event 2024. A big day across the spectrum, lots of news, lots of announcements, and a lot more analysis coming from Patrick and me. Stay part of our community. Hit that subscribe button. Join us again soon. But for now, we’ve got to say goodbye. See you later.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people, and tech that are required for companies to benefit most from their technology investments. Daniel is a top-five globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances and coverage by CNBC, Bloomberg, The Wall Street Journal, and hundreds of other outlets around the world.

A seven-time best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas, transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
