Optimizing Connectivity in the Data Center, Server-to-NIC Using LPO – Six Five On The Road at Dell Tech World 2025

LPO vs. CPO: What’s the future of data center optics? 🤔

At Dell Tech World 2025, hosts Patrick Moorhead and Daniel Newman speak with James Wynia, Director of Product Management, Networking, ESG at Dell Technologies. The focus? A new innovation for networking: Linear Pluggable Optics (LPO). Listen in on Dell’s approach to redefining data center connectivity with LPO, which cuts power, latency, and cost relative to conventional DSP-based pluggable optics, and how it compares with Co-Packaged Optics (CPO). 🏋️

Key takeaways include:

🔹LPO: The Next Frontier in Connectivity: LPO is emerging as a pivotal technology, significantly reducing power consumption, latency, and cost in data centers, crucial for scaling AI infrastructure.

🔹World’s First General Release LPO Solution: Dell Technologies announced the launch of the world’s first general release LPO solution, Switch-to-NIC, at DTW, underscoring its commitment to innovation and delivering tangible customer value.

🔹LPO vs. CPO: A Complementary Future: The discussion explored the relationship between LPO and the upcoming Co-Packaged Optics (CPO), highlighting their distinct benefits and how they can complement each other for optimal data center performance.

🔹Experience LPO in Action: Attendees at Dell Technologies World are invited to witness live demonstrations of Dell’s LPO solutions, integrated into AI Factory offerings, showcasing real-world applications and benefits.

Learn more at Dell Technologies.

Watch the full video at Six Five Media, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Six Five On The Road at Dell Tech World 2025 is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is On The Road here in Las Vegas, Nevada. We’re at Dell Tech World 2025 and it has been a great event so far. A lot of enterprises are talking about the value they’re getting out of AI. And of course I love products. We’re hearing new hardware, new software, new services to wrap all those together.

Daniel Newman: Yeah, I love what we’re seeing in terms of pulling it together and really bringing the value propositions to life. Keynote day one was really the beginning of a story about reimagining the future, where Michael Dell took the stage and brought the big customers out. And then today, Jeff Clarke brought those generative AI applications, those proofs of concept, to life. And I think that was just a great continuation. The story here is definitely about bringing it all together, bringing it all to life.

Patrick Moorhead: So let’s dig a little deeper here. We love infrastructure. Infrastructure is sexy. We’ve been saying that forever, even before everybody jumped on the bandwagon. There have been a ton of discussions. You know, generally people talk about GPU compute, now CPUs, they talk about HBM memory, they talk about storage, but what about the connective tissue that pulls all of that together? And that is networking. And networking can either be your friend or your foe based on the quality and the way that it is set up. So I can’t imagine a better person to talk about this and optics than James from Dell. James, great to see you.

James Wynia: Thank you, Pat. So happy to be here and talk about this topic. I’ve been with Dell for 25 years and living in the networking arena that entire time. Finding a better way, a better mousetrap as it were, is always exciting. So today I’m here to talk about LPO. You know, we’re announcing a new solution, and LPO really is important for finding how we get better benefits, right?

Patrick Moorhead: Yeah.

James Wynia: And we’re looking for how we remove components that are, you know, maybe not necessarily needed in every circumstance, to get us there quicker. And so an LPO, a linear pluggable optic, helps us remove the DSP from that solution. So, not to get into the technical weeds, but.

Patrick Moorhead: Well, maybe we can do this. Maybe if you can talk about, okay, there’s copper and there’s optics.

James Wynia: Yep.

Patrick Moorhead: Tell us about why you might need something more than copper. I thought copper was going to solve everything. I mean, we have scale-up networks that are copper.

James Wynia: Yes. Never bet against copper, for sure, but with copper, I mean, physics gets in the way. And on a 400 or 800 gig data rate transceiver, you can go at most 4 meters on copper, and that’s if everything is pristinely orchestrated. Usually it’s only two. Okay. So in a data center the size of a football field, you’re going to need more than 4 meters. So yeah, that’s where fiber comes in. And as you get into these high powered 400, 800 gig transceivers, all of a sudden they’re consuming a lot of power, they’re adding a little bit of latency, and they’re generating heat like crazy. So.

Daniel Newman: It sounds like, you know, LPO offers a number of benefits. Obviously the distance is going to be one of the big reasons that we look to light instead of copper to solve this problem. But are there other benefits? Because I think a lot of companies are always weighing optionality right now.

James Wynia: Right, right.

Daniel Newman: There are some different cost impacts, right, related to optics. I mean, beyond just the distance, what do you share with your customers about some of the key benefits of LPO?

James Wynia: So great question, and it’s the money question. This is the important one right here. It’s really three things: it’s power, it’s cost, and it’s latency. On power, as I mentioned, we can remove the DSP. The DSP is the largest consumer of power in that transceiver device. Okay. And so you remove that, and the power is going to come down. So your TCO, your total cost of ownership over the years, is going to be better with less power. The second is the cost of that device. The DSP is not only the most power hungry, it’s the most expensive element in that transceiver, so removing it helps with that too. Then third, every time you have a bump in the wire and you go through an ASIC, it’s another roughly 50 nanoseconds. So 50 nanoseconds on one side, 50 nanoseconds here, it starts to add up. So those are the primary reasons.
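
To make that latency arithmetic concrete, here is a minimal back-of-envelope sketch in Python. The roughly 50 nanoseconds per DSP pass-through comes from the conversation; the number of links on the path and the transceiver count per link are illustrative assumptions, not Dell figures.

```python
# Rough added latency from DSP-based transceivers along a network path.
# ~50 ns per DSP traversal (figure from the conversation); everything else
# here is an illustrative assumption.

DSP_LATENCY_NS = 50        # approximate latency added per DSP pass-through
TRANSCEIVERS_PER_LINK = 2  # one pluggable transceiver on each end of a link

def dsp_latency_overhead_ns(links_on_path: int) -> int:
    """Total latency (ns) added if every transceiver on the path has a DSP."""
    return DSP_LATENCY_NS * TRANSCEIVERS_PER_LINK * links_on_path

# Example: a two-link path (NIC to leaf, leaf to spine) -- illustrative only.
print(dsp_latency_overhead_ns(links_on_path=2), "ns that an LPO path would avoid")
```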

Patrick Moorhead: Yeah. So we are here at Dell Tech World and all its splendor and glory. There were a ton of announcements made across storage, across compute and around networking, including software and the services that go on top of that. What announcements did you make here at the show?

James Wynia: Thanks for leaning into that. So we’re announcing the industry’s first switch-to-NIC LPO solution. And we’re in a unique position as Dell because we make all of those components. We’re not relying on a partner to give us the server. We’re the number one server company and we sell our own switches as well. The reason that’s important is that LPO is essentially removing the DSP, and the DSP is the lipstick, you know, that allows you to get past a myriad of problems. It’s the gearbox: if you need to change speeds from 100 gig to 2 by 50 gig, it takes care of that. If you need some smoothing in there, it’ll do it. The DSP is magic. Okay. The problem is that the thing that could do all that for you, that could clean up a noisy link, is gone. So you have to have a very cleanly engineered link, essentially. But since we are selling everything in this picture, the servers, the NICs, the transceivers, we can characterize that on all of our equipment right up front, and then we can sell in volume. Whereas a peer networking company doesn’t have the server; they have to make assumptions or go with partnerships, and things change and they don’t know. So that’s a huge reason why we can actually introduce this solution, and you haven’t seen it anywhere else.

Patrick Moorhead: Glad you hit on that.

Daniel Newman: Yeah. So James, as we’ve moved into this rapid, warp speed build-out of AI, we hear a lot about co-packaged optics, CPO. And just because we don’t have enough acronyms already, it doesn’t even exactly parallel LPO. But the bottom line is we’re hearing about it, and it’s seemingly coming. Can you share a little bit with the audience about how CPO and LPO compare? Is it a competitor? Do these things work in conjunction? Because I think this is going to be a very popular topic over the next few years as we see, you know, compute designs look to increase scale for connectivity.

James Wynia: I thought this question might come up, so I brought my boxing gloves. Seriously though, CPO is like a competing solution, okay? They both head in the same direction of lowering the power, reducing the latency, and changing the cost; both lower the cost in some ways. With CPO, co-packaged optics, they move the lasers and some of the components inside the switch, right on the layer next to the NPU. And so you have nice, clean, easy connectivity there; you don’t have to go all the way out to the transceiver. Okay. So it’s kind of obvious that this is going to be less expensive. There are other complications, though. Now you have to use all the ports in order to really get the cost savings that you need. Also, if there’s a problem, now you potentially have to replace the whole switch instead of just a transceiver. So I mean, there are definitely places where it makes sense. I’m not taking a position that they’re bad; they have their place and they will hit their marketplace. But I think they coexist with LPO for different customers and different use cases. CPO is definitely coming, but it’s not quite proven yet. I don’t have any doubt that it will be, but the way I like to describe it is that LPO is CPO without the baggage.

Patrick Moorhead: Yeah. And so doing a double click on that. You know, I’ve had a lot of conversations, even with the hyperscalers, and a lot of the questions I get are around reliability, questions about, okay, what happens if something goes wrong with my CPO? First of all, if I have LPO, I can replace the module and we’re good to go.

James Wynia: Right.

Patrick Moorhead: And a lot of questions on, well, if the CPO goes down, am I not only going to take down the whole switch, but if I take down the whole switch, do I take down an entire node of GPUs or XPUs?

James Wynia: It’s a risk.

Patrick Moorhead: And I know the industry is trying to respond with maybe a modular CPO architecture.

James Wynia: Yeah.

Patrick Moorhead: But anyways, it’s a lot more complex, and the industry has to put time in to figure that out. And I don’t think you want to dive right into that right now unless you’re a gigantic hyperscaler, you know the risks going in, and you control every part of that. I think in total maybe 100,000 have shipped at this point. I do believe that at some point it is the future. We’ll probably see CPO to the XPU or CPO to the GPU, but that is a long way away from here.

James Wynia: Yeah, I agree. And I think it’s really, again, solution based. If you have super high density solutions, which, yeah, we’re in the world of AI so we see that a lot, CPO potentially has more appeal, versus the flexibility of an LPO. If I want to use an LPO, great; if I want to use a classic transceiver, I can do that too, you know. So I have that flexibility. So we’ll see how they both exist and live next to each other.

Patrick Moorhead: Hey, I wanted to do a double click on the power savings. And this is power savings versus copper, okay? Like you talked about, are the power savings there an order of magnitude? Just to give the audience a sense of how much power you can save.

James Wynia: So let’s talk about a 400 gig transceiver. With a classic DAC cable, there’s basically no power. It’s amazing, it’s so little; the electrons are just flying through. There’s no tax, no electrical devices in the cable itself. Versus a transceiver: a classic single mode today is anywhere from, what, seven to nine watts, and an LPO would be like four and a half to six watts. So you can basically cut the power in half, typically. But compared to copper? Oh, copper is going to win every time.
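
For a rough sense of scale, here is a minimal sketch that turns those per-port figures into per-switch numbers. The wattage ranges are the ones cited above; the port count is an illustrative assumption, not a Dell specification.

```python
# Per-port power figures quoted in the conversation; the port count below
# is an illustrative assumption only.

PORT_POWER_WATTS = {
    "DAC (passive copper)":      (0.0, 0.1),  # effectively no power draw
    "Classic single-mode (DSP)": (7.0, 9.0),
    "LPO single-mode":           (4.5, 6.0),
}

PORTS_PER_SWITCH = 64  # assumed port count per switch, for illustration

for name, (low, high) in PORT_POWER_WATTS.items():
    total_low, total_high = low * PORTS_PER_SWITCH, high * PORTS_PER_SWITCH
    print(f"{name:27s} {total_low:5.0f} - {total_high:4.0f} W across {PORTS_PER_SWITCH} ports")
```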

Patrick Moorhead: Yeah, I got you. Yeah, sorry, I meant performance.

James Wynia: Performance.

Patrick Moorhead: Sorry about that.

James Wynia: Yes. So performance wise, on copper, again, it’s flying along, so it’s about the same as a fiber solution. The only difference with a fiber solution is going to be: do you have the DSP that adds that extra 50 nanoseconds, or not? And so LPO performance compared to copper is going to be right next to each other, almost the same.

Patrick Moorhead: Appreciate that.

Daniel Newman: So, James, as we tie off the conversation, thank you so much for giving us the rundown. The AI factories here get a lot of attention, and everybody wants to pick up a new AI PC. But for the people working in the racks and building this stuff out, this stuff’s cool.

James Wynia: Yeah.

Daniel Newman: So for all the technical folks here who are thinking about their future and their designs and want to get to know these solutions a little better, what are you all showing off here at Dell Technologies World?

James Wynia: Yes. So we’re not here to talk, we’re here to show. If you visit the CMON booth, number 465, we actually have this running live in a rack. We have an 800 gig switch and two servers, and we’re running both the single mode option and the multimode option through a real world scenario, through a full structured cable layout. Not just a demo, a lab thing where you’ve got a loopback between two ports, which is okay, cool, but not very impactful. So yeah, it’s the full thing. So CMON booth 465, please come check it out.

Daniel Newman: All right. Well, James, thanks so much for joining us here on Six Five On The Road at Dell Technologies World 2025. I look forward to hearing more from you soon and tracking all that’s going on with LPO, CPO, networking, and all things that connect the future.

James Wynia: The future looks bright. Thank you.

Daniel Newman: Thank you everybody for tuning in. This is the Six Five On The Road at Dell Technologies World 2025. We’re going to step away and take a break. We’ll see you all back here very soon.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, including his most recent book “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
