AMD Advancing AI Event

The Six Five team discusses AMD Advancing AI Event

If you are interested in watching the full episode you can check it out here.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: He really is. Hey, let’s dive in. Dan, you and I went to the event out in San Francisco at Moscone. Big intersection. I mean, everybody wants to talk about data center AI, we got that and a lot more.

Daniel Newman: Yeah, I think there’s so much here. I’ll chat a little bit about some things that caught my attention, and I’m sure there’s going to be a lot of oxygen left in the room for The Six Five interviews, so look out for all of those to drop, because we talked to the heads of the PC business, the client business. We talked to the heads of the data center business, the GPU business, and of course, as we mentioned, we talked to Lisa Su, so there’ll be a lot of in-depth stuff here. But let me back off and give a little bit more of a broad observation. Coming into this Advancing AI event, the second year, you could absolutely be certain that all eyes were on the GPU. Everybody wanted to know what’s coming, which had already kind of been out there; it’s been talked about for several months. But are there any changes? Are there any material updates, any new cloud partner wins? The stock went a little negative yesterday. I think people, and Pat, I think you said this very astutely, “People wanted to maybe hear about a big AWS win.”

Of course, Lisa came right out, and this was the tweet. The tweet was, “$500 billion TAM.” She just went from, I think, $400 billion, which everybody has said was sort of the most bullish forecast on Wall Street, to $500 billion, and now a 60% CAGR from ’23 to ’28 for AI accelerators. And this is just chips, people. This is the volume. So when you hear things like TSMC is sold out for two years, Blackwell is sold out for all of 2025, there’s no HBM3 or HBM3E memory out there, this is what’s going on. There is an insatiable amount of demand, but there’s also still this kind of, well, who’s going to get market share? Because we all kind of know right now where Nvidia sits, and Nvidia’s got 90% to 93%, depending on which data set you look at.

And there’s even some speculation that with Blackwell they’re gaining market share, at least as a matter of revenue, because of pricing. We know Intel is starting to ship Gaudi 3 now. So AMD had a lot of success with their MI300, and in fact, she even broke some very interesting news about what AMD has been able to accomplish with Meta running the 405B model on the MI300 exclusively, by the way, another big breaking moment from the event. But as the event went on, what it was all about was the MI325X and then the upcoming MI355X, the new architecture, how this is basically providing more memory, ’cause memory is the big need, especially for training.

And then the comparisons came out, which of course everybody’s eyes were on too. And this was a really interesting juxtaposition for AMD, because you’ve got the Hopper series, two years old, the H200 announced a year ago and released earlier this year, and you’ve got this new part which is just starting to ship in Q4, and at the same time you’ve got Blackwell coming out. So the comparison question right now is, should they compare it to the part that came out earlier this year? This is such a fast-moving market, Pat, with the annual cadences now, is a part from early this year the right thing to compare to? You and I both had some pretty blown-up tweets that got a lot of responses. I was getting a lot of pushback for making that comparison. But also, at the same time, what else can they do?

Right now, that is what is out there. That is what is being used. So that’s interesting. But I think, Pat, this leaves a lot of work for Signal65 over the next year to start doing some really significant assessments. The MI part was really interesting. Pat, I’m going to just talk about one more thing and I’m going to leave the rest. I’m going to leave networking and client to you, plus anything else you want to cover. But I thought the EPYC announcements were really promising. The company was able to really make clear that they have been the undisputed winner in cloud. I mean, they just marched up Google, they marched up Microsoft, they had Meta up on stage, and these companies were up there talking about how they’ve gone all in on AMD. I mean, Meta is probably the most symbiotic partnership of all of them. They seem to be co-developing, working very closely, and with Meta, I think I’m hearing as well, maybe as high as, this is speculation y’all, but maybe as high as 80% of the data center CPU footprint at Meta is AMD.

Huge wins, but of course just having that overall hyperscaler business. Satya was incredibly bullish about the company. He came on in a remote interview. I don’t know if he ever shows up anywhere. He’s the anti-Jensen. He shows up everywhere, but never in person. But the EPYC business has just been really, really strong, and some of the things I learned yesterday about the head node, about the impact it can have on higher throughput and efficiency, that was really interesting too. I mean, we heard double-digit performance gains from the EPYC head node on some of these AI systems, so that was great. There’s a ton more, but they’re at 34% market share at this point, which basically means they’re crushing it in the cloud, single digit in the enterprise. So I see a big opportunity in enterprise for AMD, but that’s a totally different muscle, so we’ll need to see. But I’ll pass this over to you, ’cause like I said, there was way too much news to try to do in a five-minute bit.

Patrick Moorhead: Yeah, this will be our longest topic and I think it deserves it. So let me fill in some of the cracks here. So AMD came out with CPU, GPU, and AI client solutions for business, but they also entered a new market, and that’s the back-end network. So there’s the front-end network and the back-end network. The back-end network is connecting all of the GPU nodes together, and you need a different type of performance and different types of protocols to make that happen. One of the biggest reasons training runs don’t complete, and that’s bleeding over into inference latency, is a breakdown in the network. By the way, the second reason stuff doesn’t work is GPUs burning up. And what’s interesting, this market has been traditionally dominated by companies like Broadcom. Interestingly enough, the tweet that got the most views was the tweet about the AI networking card that they brought out.

I didn’t expect this one. Even though I know they have Pensando with the DPU, that’s for front-end networks, and I didn’t think that type of architecture could be high-performance enough. Of course it’s programmable, that’s what it is from the start, but it’s an ASIC-based design, so you’re going to get performance. We’re going to have to see how this one pans out. This could be a sleeper, but look at, first of all, the market need for more reliable solutions, and the fact that the biggest reason training runs fail is a breakdown in networking. There are known-knowns and known-unknowns that you try to solve for clients. This is a known-known issue, and AMD is coming into that market. We had a very interesting conversation about that on The Six Five, so we’re going to have to see how that pans out. I’m going to talk a little bit competitively.

What do we know? What don’t we know? Let’s do AMD versus Intel in data center. So like you said, AMD has rocked in at 34%. That’s a peak; when I was at AMD, Opteron was 27%, and that was the peak. There really wasn’t a cloud business; there was a web business. In fact, Google was our biggest customer for Opteron, followed by HPE, Cisco, and a little bit of Dell at the end. So I believe when the smoke clears, AMD will have put a little bit more distance between itself and Intel. Now, when it comes to, let’s say, doing inference, Intel has the clear lead with its AMX accelerators. AMD’s biggest challenge is going to be in the enterprise. They have single-digit enterprise share, so they’re doing really well in the hyperscalers, they’re doing really well in enterprise SaaS, they’re doing really well in tier-two CSPs.

But when it comes to enterprises choosing instances in cloud, there’s not a ton of that. And with single-digit market share in the enterprise with what looks like a vastly superior product, AMD has to get on the stick and invest in enterprise sales and enterprise marketing, in terms of collateral, in terms of POCs, because if they give Intel some oxygen, Intel is going to come roaring back in enterprise; they have, what, 91% market share there? But it establishes them, gives them a point to pivot off of. And, Dan, we haven’t seen the lift in enterprise from AI yet; when that hits, and if Intel has an even more competitive product going into Aetina, it could spell an issue for AMD. On AMD GPU versus Nvidia GPU, it’s murky. I mean, Nvidia came out with new numbers for more finely tuned software stacks that they dropped. Nvidia came out with their numbers, and their numbers were not done by a third party, no third-party attribution, by the way, that is Nvidia. So I don’t know, right? I have no idea. What I can say is what AMD showed with its MI355X, which would be out closer to the second half of ’25, is a vastly superior product to its predecessors in the MI300 series, including the MI325X.

So first of all, it’s higher efficiency, it’s higher performance, and it supports lower-bit-rate models a lot better. There are some pretty impressive four-bit and six-bit numbers that came out of there. One thing I think AMD did a great job on is showing the type of scale that it operates at. One of the head-turners for me, and I had 45,000 people tune into this on X, was that there are 1.5 million EPYC processors inside of Meta, and Meta Llama 405B runs exclusively on MI300X for all live traffic. And I think what they mean by live traffic is not training; that would be inference. So it makes sense, but it shows the scale. I was on Yahoo Finance yesterday, and the first question they asked me was, “Why did AMD’s stock go down?” You had addressed this a bit; people were looking for a knockout kill. They were looking for a new customer like an AWS or Google. First of all, AWS, if it’s going to do anything with AMD, it would be at re:Invent in December, which you and I are going to be-

Daniel Newman: I agree.

Patrick Moorhead: … attending. And Google, they’re putting a ton of effort into TPU and NVIDIA GPU, they might not have the resources. But I think at the end of the day, all the hyperscalers will be an MI customer. Nobody wants-

Daniel Newman: If nothing else, for merchant silicon, right? Just to have the offering, ’cause if AMD starts to get traction, don’t they want to make it available? I mean, that would be my take. Plus-

Patrick Moorhead: Yeah, and also there’s concern about the power that NVIDIA has, and NVIDIA is making all the money here. Now, the hyperscalers are the second-biggest beneficiaries: for every dollar they invest in GPU, they can charge $8, compared to a dollar in for CPU, where they can charge $3. So they are making money, but they’re not making it hand over fist like NVIDIA is. So anyways, a lot of conversations. Check out the videos; they come out today and next week. We think you’ll enjoy them. We asked some pretty tough questions and I feel like we got some really good answers.

Daniel Newman: Just before we jump back, ’cause like you said, I wanted to put this on the record: I think there is a significant opportunity for AMD just based on what we heard yesterday about the CPU-GPU combo. And I think you said something really poignant there, “Hedging is going to be a trend in ’25,” and these companies are hedging in two ways. They either hedge with Gaudi or AMD, or they hedge with building their own. And frankly, my take is they’re going to do both. There’s absolutely no way this sort of monopolistic AI control is going to remain. That doesn’t mean Nvidia’s not doing great things. It just means these companies have to diversify, the same way they did with compute. It’s going to be the same. It’s just going to move faster with AI. Sorry, I had to put that on the record on 10/11, my mom’s birthday, happy birthday mom, because I think that’s what a lot of people might be missing about why this is an opportunity.

Patrick Moorhead: Yeah. And by the way, Nvidia needs to be very careful in the way it handles its customers and its ecosystem. The industry is hoovering up evidence against Nvidia and shoveling it into the DOJ as we speak. I mean, I haven’t personally seen the threats. I mean, nobody from Nvidia has ever said, “You better not say that,” or anything like that. But Nvidia needs to be very careful in how it’s handling its competitors and its customers even more.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances on CNBC and Bloomberg, in the Wall Street Journal, and by hundreds of other outlets around the world.

A 7x best-selling author whose most recent book is “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
