Intel Announces Sapphire Rapids

The Six Five team discusses Intel announces Sapphire Rapids.

If you are interested in watching the full episode you can check it out here.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.


Daniel Newman: Sapphire Rapids, next generation Intel, chips for data center, Pat, announced live, came out. You wrote a great piece on Forbes. This one’s yours. So I’ll let you go first, but big moment for Intel.

Patrick Moorhead: Yeah, big moment for Intel, a couple of years late. We’ve both had the opportunity to talk to Pat Gelsinger, and I only needed a five-minute conversation with him to understand why this is late. Intel took on a lot. Typically, you don’t change the node and radically change the design in the same generation. Intel did both of those, right? They radically changed the design from a monolithic one to a distributed one. That’s the first thing they did. They also made one of the biggest node changes out there.

The reason this was late was the confluence of those two things, but primarily they had to do a lot of backporting from the node they thought they were going to be on, something more like an Intel 5 than an Intel 7. But the big takeaways for me were, first of all, it’s here. This has been shipping for months. It might be late, but it’s already here.

It’s all about acceleration performance. Intel outlined eight different accelerator engines, and these run pieces of code that don’t run on the CPU cores; they run on these accelerators, just like we’ve fallen in love with GPUs that do acceleration. They range from accelerating data streaming to AI, to analytics, to load balancing, to vRAN, to QuickAssist for encryption, decryption, and crypto acceleration. So there are a lot of these different sub-components that come together, not just for AI, which, let’s say, NVIDIA does, but for a lot of other types of capabilities.

I think the second big-picture point is that it’s not only all about acceleration; it’s also not at all about the CPU. Intel’s announcement did not talk about the CPU a single time, which is very different from, let’s say, what Ampere or AMD has been talking about. It’s a positioning move, and I think the degree of success, Daniel, is going to come down to, A, the software that can take advantage of that acceleration, customers wanting to use the software that uses those optimizations, and a heck of a lot of sales and marketing. They’re not going to do this on raw CPU performance. They’re not going to do this on cost. So that’s where they’re headed, which I think is a very valid strategy.

Now, they had a who’s who show up on their stage, which is an indication of the type of support they’re getting. Heck, who was the first non-Intel person on their virtual stage? Our year-one Six Five Summit keynote speaker, Michael S. Dell. You had Antonio Neri. You had YY from Lenovo. You had Jensen. Dave Brown from AWS, another Six Five guest. Arvind Krishna was on there.

Daniel Newman: I think they’ve all been Six Five guests.

Patrick Moorhead: They have, except for … Well, actually, YY has not, but the other four, yes. Raghu was there from VMware. So Intel rolled out the digital red carpet. I think that’s really a plus, but a word of caution: I wouldn’t directly equate what people are saying behind the scenes all the time with the big veeps that go on stage, but they do know that Intel is going to continue to be a major force. They have the dominant market share in server parts today, between 70% and 80%, depending on who you count.

Guess what? With the next generation, they no longer have to apologize for what node they’re on. I believe Intel is going to be, first of all, on their second generation of distributed architecture, and they’re going to be on a much more competitive node, which means the amount of area they can devote to certain subsystems will go up while keeping the chip the same size. So I’m optimistic. We’ll see.

Daniel Newman: Yeah. So you hit it on the head. I mean, let bygones be bygones. Intel was going to be late. Nobody’s surprised by this. It’s here. The future is really what Pat Gelsinger and the team can control. They’re very ambitious. Was it four and three?

Patrick Moorhead: Five and four, baby. By the way, I never get that right, Daniel.

Daniel Newman: Five and four.

Patrick Moorhead: Five nodes in four years, and Pat affirmed-

Daniel Newman: More than one a year, meaning it’s a really consolidated, compressed timeline, but something that, if Pat can get it done, could get Intel back into the driver’s seat. I think what you called out deserves a double click, and that’s that Intel is alluding, to some extent, that some of the traditional computing workloads on these servers are becoming table stakes, that in this generation everyone can do it, meaning the versions built on Arm, the versions being built by AMD and, of course, the versions being built by Intel, and they’re really leaning into what accelerated computing is going to be.

Futurum research analyst Ron Westfall did a really good breakdown on this, and when he came back and said, “Hey, what was special about this?” it was really just that. This is all about innovation, it’s all about accelerated compute, and that’s what Intel is focused on. You look down the list: AMX, DSA, DLB, QAT, AVX. Now, again, nobody knows what that means.

Patrick Moorhead: I love it when you talk like that.

Daniel Newman: I knew you would like it. Now, the other 93% of listeners have no idea what I’m talking about, but you’ve got Advanced Matrix Extensions, you’ve got Data Streaming Accelerators, you’ve got load balancing. These are all things that make servers work better. This is accelerating workloads in a way that’s going to make people’s day-to-day interactions with software better.
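As an aside for readers who want to see what those acronyms look like from software: on a Linux host, the instruction-set accelerators discussed here (AMX, AVX-512) show up as CPU feature flags, while DSA, DLB, and QAT are separate on-die devices enumerated over PCI. The following is a minimal, hypothetical sketch of checking the flag-based ones; the flag names are the standard Linux kernel feature names, but the helper function and sample string are illustrative assumptions, not anything Intel announced.

```python
# Hypothetical sketch: which Sapphire Rapids ISA accelerators does a host expose?
# AMX and AVX-512 appear as CPU feature flags (e.g. in /proc/cpuinfo on Linux);
# DSA, DLB, and QAT are discrete on-die devices and are NOT visible this way.

ACCELERATOR_FLAGS = {
    "amx_tile": "Advanced Matrix Extensions (AMX) tile support",
    "amx_bf16": "AMX bfloat16 matrix math",
    "amx_int8": "AMX int8 matrix math",
    "avx512f": "AVX-512 foundation vector instructions",
}

def detect_accelerators(cpuinfo_flags: str) -> dict:
    """Map each accelerator flag name to whether it appears in a flags string."""
    present = set(cpuinfo_flags.split())
    return {flag: flag in present for flag in ACCELERATOR_FLAGS}

# Illustrative flags line rather than reading /proc/cpuinfo directly:
sample = "fpu sse2 avx2 avx512f amx_tile amx_bf16 amx_int8"
print(detect_accelerators(sample))
```

In practice, libraries and runtimes (math kernels, AI frameworks) do this kind of detection themselves and dispatch to accelerated code paths automatically, which is exactly the software-enablement point being made above.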

So Intel is saying, “Look, some of the things that used to be benchmarks, where everybody would run numbers next to each other, some of that’s becoming table stakes. Let’s talk about the things that we’re building into our next-generation technology that’s going to make your systems work better.” For Intel, my opinion is it’s all about keeping the customers they have. For the last few years, it’s been a bit of a shedding of market share as delays have crept in and opened the door. You’ve got to give credit to AMD. You’ve got to give credit to Arm for enabling new companies to enter the server market, but at the same time, Intel has given up market share because it hasn’t been able to compete.

So now, the question is whether, with products that are more competitive, they can protect what they still have, and correct me, Pat, I know you keep tabs on this too, around 80% of the server market share.

Patrick Moorhead: That’s right, 78%. That’s right.

Daniel Newman: So it’s still a very good number. If you and I had 80% of the analyst market, we’d be doing very well. Sometimes I think people forget that Intel is still doing very well. My take, though, is that the five-in-four plan and incredible discipline in winning and keeping its existing customers for as long as possible are going to be the critical things. So it was good, of course, to get the who’s who of OEMs and the who’s who of cloud providers up on that stage.

We all know that all those companies are hedging more and more, and they’re going to continue to hedge, but if Intel gives them products that perform, Intel can tap into its long-term and deep relationships with pretty much the CIO offices and the cloud companies all over the planet to win more business and keep more workloads. Let’s remember, computing is going to grow. There’s going to be more demand, and that’s not going to change anytime soon. So congratulations, Intel. It’s one checkbox down, many, many more to go.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A seven-time best-selling author, Daniel’s most recent book is “Human/Machine.” He is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
