Helping Telcos Deliver New Edge AI Use Cases

On this episode of The Six Five On the Road, host Patrick Moorhead welcomes Charles Ferland, VP & GM, Edge Computing & Communication Services Providers (Telco), Infrastructure Solutions Group at Lenovo at MWC 2024. Their conversation dives into the forefront of edge computing innovations, particularly focusing on Edge AI and its implications for telecommunications companies.

The discussion covers:

  • An introduction to edge computing and Lenovo’s pioneering role in Edge AI.
  • The uniqueness and recent advancements in edge computing technology.
  • How edge computing enables Communication Service Providers (CSPs) to expand their service offerings.
  • Lenovo’s partnership with Telefonica: achievements and insights.
  • The challenges faced and benefits realized through the deployment of edge computing solutions.

Learn more at Lenovo’s Edge Computing Page.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: The Six Five webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: The Six Five is on the road in Barcelona at Mobile World Congress 2024. We are here in the Lenovo booth. It’s been an incredible experience here at Mobile World Congress and, as you would expect, there are a whole lot of Edge use cases, from the Edge network to the core network and everything in between. It’s very, very exciting. One of the key trends that we’ve seen over the last five years, which by the way is ironic, because Edge computing has been around pretty much forever at retailers and manufacturers, but it really was never connected with the rest of the systems of the company.

So there’s a lot of talk about Edge computing, and the reason for Edge computing is very simple. It’s putting compute where the data is to get faster responses, better responses, without constantly having to go back and forth to the cloud or your on-prem data center, colo, wherever your infrastructure sits. And I am really pleased to have a many-time guest. I was trying to guess how many times. Five times, I think.

Charles Ferland: I think so.

Patrick Moorhead: Charles Ferland, who runs the Edge and Telco biz with Lenovo. Great to see you.

Charles Ferland: Thank you, Pat. Great to be here. And you’re absolutely right that Edge computing has been around, but I think the innovation now is, well, first of all, the compute has always followed the data, and the center of gravity has been the data. That was the case with the mainframe, with client-server architecture, and with the cloud. Now we’re realizing that the center of gravity of the data is actually at the Edge.

The massive amount of data being generated by video cameras, for example, requires us to bring the compute to the Edge. And the problem we were facing before is that we had PCs that were underpowered, or small IoT gateways unable to process AI. And the biggest innovation we have is that we were able to package the compute capability of the cloud, the same performance, the same AI capability, but put it in a much smaller, ruggedized, secure, quiet form factor that can operate at the Edge in retail stores, hospitals, schools, ambulances, whatever.

Patrick Moorhead: Yeah, I’m glad you gave that background. And I remember Edge computing 30 years ago, it was either a PC, or maybe a minicomputer somewhere, or a raised tile, which, I don’t know, could we call raised-tile flooring with a rack Edge computing? I don’t know. Maybe?

Charles Ferland: Perhaps we can. However, that’s very impractical. If you think about a retail store or a convenience store, they don’t have the physical space for it. They don’t have the cooling, the electricity; they don’t have filtered air and so on. So that’s one of the challenges of Edge computing, and that’s what we’re addressing by having these components as powerful as the cloud or a data center, but packaged into something that can operate mounted on a wall in a janitor’s closet, in a gas station, for example.

Patrick Moorhead: Let’s do the double click on Edge AI. What is Lenovo doing? You’re not new to Edge AI, and I know GPUs don’t define AI because, actually, most AI still gets done on the CPU, but what is new with Lenovo Edge AI here at the show?

Charles Ferland: So, that’s a good question, and you’re right to point out that GPUs are an important aspect of the AI strategy. However, the CPU itself, with OpenVINO from Intel, for example, is quite powerful as well. What is new is simply this: the vast amount of data generated by the cameras I was talking about a few minutes ago, what do we do about it? How do we process it? We need to have the compute capability that will find insight. And that insight might very well be how many people are in the store at any time of day, or how long they are staying in the store.

Or if I put an advert of a specific brand at the gas station, at the pump, when the customer walks into the store, are they buying that brand? Being able to correlate the advertisement with the shopping experience requires processing video images. And this is why AI is becoming critical to many of these smart retail use cases. AI and Edge are extremely important because, while many of our customers are investing in large language models and learning technology in the data center, we believe the next three years are going to be defined by AI at the Edge. AI inferencing is going to be key, and we expect most of our projects to deploy in Edge environments using AI inferencing.
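For readers who want a concrete feel for the CPU-based inferencing Charles mentions, here is a minimal sketch using Intel’s OpenVINO Python API to run a vision model on a CPU at an edge site. The model file, input shape, and frame data are illustrative placeholders, not a specific Lenovo or Intel deployment.

```python
# Minimal sketch: CPU inference with OpenVINO at an edge site.
# The model path and input shape below are hypothetical placeholders.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("people-counting.xml")            # hypothetical IR model file
compiled = core.compile_model(model, device_name="CPU")   # no GPU required
output_layer = compiled.output(0)

frame = np.zeros((1, 3, 320, 544), dtype=np.float32)      # stand-in for a camera frame
detections = compiled([frame])[output_layer]              # run inference locally
print("detections shape:", detections.shape)
```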

Patrick Moorhead: And it’s interesting, in technology it’s more about “and” versus “or.” You’re going to keep doing analytics at the Edge, you’re going to keep doing machine learning, deep learning, and generative AI, right? I mean, you can’t have any conversation here on the floor without talking about generative AI, but is that the next frontier of the Edge?

Charles Ferland: That is correct. Right. And this is actually, if I step back a little bit, many of our customers are looking at the Edge, looking at their sites, and saying, “I have a PC, I have an IoT gateway, I have a media server for the in-store music, and whatever.” They have multiple devices, each representing a different management challenge, each representing a different security risk, because you have to maintain them individually.

So actually the strategy that many are taking is extending their public or private cloud approach, whatever they’re using in the data center, and stretching it with the same administrative tools to these Edge locations. So the Edge is seen as an extension of their cloud, consolidating the applications, using VMs, using containers, and eventually building upon that infrastructure, which is now powerful enough to host generative AI and other smart applications for their businesses.
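In practice, stretching the same administrative tools to the Edge is often done with a container orchestrator such as Kubernetes. The sketch below, using the Kubernetes Python client, schedules a containerized store application onto nodes labelled as edge sites; it is a generic illustration, not Lenovo’s specific stack, and the node label, image, and namespace are hypothetical.

```python
# Sketch: manage an edge workload with the same cluster tooling used in the data center.
# The node label, container image, and namespace are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # assumes a kubeconfig for a cluster that includes edge nodes

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="store-analytics"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "store-analytics"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "store-analytics"}),
            spec=client.V1PodSpec(
                # Pin the workload to nodes labelled as edge sites (hypothetical label).
                node_selector={"site-type": "edge"},
                containers=[
                    client.V1Container(
                        name="analytics",
                        image="registry.example.com/store-analytics:1.0",  # hypothetical image
                    )
                ],
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="edge", body=deployment)
```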

Patrick Moorhead: Yeah, I’m glad you talked about the symmetry between the public or the private cloud and the Edge, because that was really one thing that was missing in the past: we had OT, operational technology, systems and we had IT systems, and the two really didn’t cross, maybe aside from shuttling data back and forth. And a lot of these OT systems were very rigid, very difficult to layer new applications onto. And I’m also glad that you were very clear about the future capabilities, probably even today, of smaller generative AI models, because we’re hearing a ton about generative AI in the cloud. We’re hearing a ton about generative AI on the smartphone and the PC, but obviously the technology is going to be there, it just takes a little bit longer to roll that out with, let’s say, smaller models.

You probably don’t need a 70 billion parameter model on the Edge, maybe 10 or 15. I know a smartphone is going to be running 10 billion parameter models to be able to get that extra oomph out of what it can do. And my short explanation, or education, about what’s different about generative AI is the ability to mix different types of data. And that could be security camera data with POS data or even CX data. And I think that’s super, super exciting. Indeed. So, another big topic, and this is always a discussion here at Mobile World Congress, I don’t know how many years you’ve come here.

Charles Ferland: Too many, but it’s exciting.

Patrick Moorhead: Yeah, it is exciting. But to keep all of this going, we have to talk about the business models of the CSPs. And I’m curious, what’s your thesis, or what examples do you have, where Edge computing can help CSPs offer more services and drive more revenue?

Charles Ferland: Well, for the communication service provider, it’s often about how do I use my network infrastructure? And so we worked with Telefonica here in Spain to build a proof of concept that demonstrates how we can improve public safety and first-responder action. Using the cameras that already exist in the city, and even drones mounted with cameras, we’re able to capture a vast amount of data. The challenge I set was, how do I detect if there’s an early sign of fire? How can I detect smoke somewhere in the city, somewhere in a park or in a forest? You have the data being generated and captured by the cameras; however, if you back-haul all of this to a central cloud for processing, it’s simply too much. It’ll overwhelm your network.

Patrick Moorhead: Too much data.

Charles Ferland: Too much data. Therefore, we need to train the system to recognize early signs of smoke, for example, and distribute the compute across a vast geographic area, so that the system is trained and able to detect an early sign of smoke and notify the authorities, the firefighters in this case, to take action sooner rather than later, basically. That’s one example of how we can enhance the quality of life of the residents of the city by using AI to process the existing data. I think that’s very important. Many of our customers don’t realize it, but the data is already being captured by video; only 2% of it is processed. So by bringing Edge computing to those sites, we’re processing a vast amount of data to extract the insight.
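The pattern Charles describes, running inference at the camera site so that only the insight crosses the network, can be sketched roughly as follows. The smoke classifier, camera URL, and alert endpoint are hypothetical placeholders, not details of the Telefonica proof of concept.

```python
# Rough sketch of "process locally, send only the insight": raw video stays on site;
# only small alert payloads traverse the backhaul. Endpoints and classifier are hypothetical.
import cv2
import requests

ALERT_URL = "https://dispatch.example.org/alerts"         # hypothetical dispatch API

def smoke_probability(frame) -> float:
    """Placeholder for an on-device smoke/fire classifier (e.g. a small CNN)."""
    return 0.0  # replace with local model inference

camera = cv2.VideoCapture("rtsp://camera.local/park-07")  # hypothetical city camera feed
while True:
    ok, frame = camera.read()
    if not ok:
        break
    if smoke_probability(frame) > 0.9:
        # Only this small JSON payload leaves the site; the video stream never does.
        requests.post(ALERT_URL,
                      json={"camera": "park-07", "event": "possible_smoke"},
                      timeout=5)
```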

Patrick Moorhead: That’s a great example. We’ve heard of earthquake detection in earthquake country. There’s tidal wave detection, fire detection. That’s interesting. And I love it. It saves lives. Sometimes, when I’m working on the weekend, I’m wondering, “Hey, am I actually saving lives here?” I’m like, “No, this is not mission-critical,” but then again, there are examples when technology is truly mission-critical.

And I was in another one of your partners’ booths where they had Edge computing inside an ambulance, being able to treat patients when the ambulance crew didn’t have the skills, by going directly to the emergency room to get help. And the other one was a cataloging system for equipment, because sometimes you forget to bring the right equipment on the ambulance to be able to treat the patient. And this was a way, with RFID tracking, to make sure that that equipment was inside of that ambulance. But again…

Charles Ferland: There are numerous examples. I mean, I’m blessed to have a job where I get to talk to these customers all over the world. I talked to fishermen in the Gulf of Mexico who have Edge servers on their boats measuring the sizes of the fish using AI. We have remote islands in the Pacific using Edge AI to detect invasive species and prevent the spread of those species on the island. I’m talking to farmers in the UK that are using AI to identify the behavior of their animals and detect early signs of a disease, instead of giving antibiotics to the entire farm.

So there are numerous use cases that, as technologists, the team and I had never thought about, but it’s actually exploding and growing exponentially fast. It’s extremely interesting to learn about these use cases. And at Lenovo, we have a vast ecosystem of independent software vendor solutions, probably over 50 of them right now, designed for smart retail, manufacturing, logistics and transport, quick-serve restaurants. It’s quite exciting to be talking to all those customers.

Patrick Moorhead: Yeah, I was going to ask, with all this opportunity and people needing your help, where do you start? And I guess you’re customer-led here, and incrementally, where can you provide a unique benefit to them that others can’t? Is that a good answer? I just answered my own question.

Charles Ferland: We start with a conversation. I guess we are blessed to operate in 180 markets. We do have a very comprehensive sales force and an even more impressive channel. We’re channel-led, so we have partners all over the world that are able to have that conversation about Edge computing.

Patrick Moorhead: And I think I may have said on the record, and using absolutes as an analyst is always a dangerous thing, but you still have the most comprehensive Edge platform of anybody out there that we research. And I think it matters because the Edge is very different.

Charles Ferland: It is.

Patrick Moorhead: It can be in an ambulance, it can be in an elevator shaft, it can be behind the scenes, screwed to a wall at a retailer or a fast-food restaurant. And I’m sure you’ve got a hundred other places, out on an energy platform.

Charles Ferland: Absolutely.

Patrick Moorhead: Somewhere. Sometimes you don’t have full connectivity. Sometimes you do, but you can’t rely on it and bank on it.

Charles Ferland: And thank you for the comment on the portfolio. I do believe it is the broadest and the most comprehensive Edge portfolio. It’s no coincidence, though: we took engineering teams from the laptop division, for example. We’re number one in ThinkPads, and we have engineers that know how to build very small, very compact, very ruggedized devices. We take our laptops, we spill coffee on them, unfortunately we drop them on the floor, and they keep running for years.

So we took engineering and know-how from that team, we went to our mobile colleagues in Motorola and said, help us build more efficient wireless communication out of an Edge server. As you mentioned, sometimes you do not have an internet cable and have to rely on Wi-Fi. And finally, we worked with the infrastructure group, which builds the fastest supercomputer in the world and the most resilient servers on the market, to make these Edge servers performant, able to operate at the Edge, and not needing maintenance for several years.

Patrick Moorhead: So I have to ask, because this has been all rainbows and unicorns so far in this conversation, which I think is always a good place to start, but let’s talk about the challenges. I mean, what are the challenges of getting this out? Because, again, we’ve had the Edge for, I don’t know, as long as retail has existed, I would say even going back to mechanical cash registers or maybe abacuses, I don’t know where it started, but electronics have been in banks since computing started. So what are the challenges of moving this forward? Because the benefits are there, so why is this web 4.0 thing not immediately happening? Like a breaker switch?

Charles Ferland: There are a couple of challenges, but the number one thing that I’ll say, Pat, is every time we meet a customer, they test this in the lab, fantastic. They select 10 sites and they do a pilot. Brilliant. It always works. Now you deploy 8,000 sites across 14 countries, and that’s where the challenge is, right? So the scalability, and this is where we need to help. And Lenovo looked at this a couple of years ago and said, “Well, this is not going to scale fast enough to reach that level of deployment.”

So, we developed a tool called Lenovo Open Cloud Automation, which basically allows the technician who goes to the Edge site to simply mount the device on the wall, do the physical connectivity, and, using a very simple app on a tablet or a phone, securely activate the device. Once a device is securely activated, everything is automated. We treat the infrastructure as code. You don’t have to think about it. You just walk to the next site and do the next physical installation; within a few minutes, a few hours, the entire site will be automated and up and running. That’s Lenovo Open Cloud Automation. And if you don’t automate the deployment process, it’s an execution nightmare to try to deploy this at scale.
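To make the “infrastructure as code” idea more concrete, here is a generic sketch of zero-touch activation in the spirit of what Charles describes. It is not Lenovo Open Cloud Automation’s actual API, which isn’t detailed in this conversation; the control-plane URL, endpoint path, token, and configuration schema are all hypothetical.

```python
# Generic sketch of zero-touch activation: the technician only does the physical
# install; the device exchanges a one-time token for its declared configuration
# and converges on it. All names and endpoints are hypothetical.
import json
import urllib.request

CONTROL_PLANE = "https://provisioning.example.com"  # hypothetical control plane

def activate(device_serial: str, activation_token: str) -> dict:
    """Exchange a one-time activation token for the site's declared configuration."""
    req = urllib.request.Request(
        f"{CONTROL_PLANE}/v1/activate",
        data=json.dumps({"serial": device_serial, "token": activation_token}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # desired state: network, cluster join, workloads

def converge(desired_state: dict) -> None:
    """Apply the declared state ('infrastructure as code'), no per-site hand-tuning."""
    for step in desired_state.get("steps", []):
        print(f"applying {step['name']}")  # placeholder for the real apply logic

if __name__ == "__main__":
    state = activate("SN-EDGE-0001", "one-time-token-from-tablet-app")
    converge(state)
```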

Patrick Moorhead: Right. I can’t help but think, too, my company covers not only Edge compute and data center, but also PCs and devices. It sounds a whole lot like zero-touch provisioning.

Charles Ferland: It is.

Patrick Moorhead: On a system, and Lenovo might know a little bit about that.

Charles Ferland: We have a couple million devices to practice on ourselves.

Patrick Moorhead: Exactly, exactly. No, this is great. And by the way, when I talk to enterprises and I ask them this question, these are also very complex changes. When you’re talking about OT, you can’t bring down your retail store, you can’t bring down your manufacturing plant. Sometimes we think everything’s a new installation; it’s not. There’s actually business going on at that facility, and they not only have to run the facility on the current infrastructure, but do so as you’re deploying new infrastructure.

Charles Ferland: Correct.

Patrick Moorhead: So it’s very complex, and it typically involves a lot of different vendors, consultants, a multi-vendor solution. So it is a little bit more complex. I don’t always ask a question when I know the answer, but for the last few years I’ve been scratching my head at this, even though I see the benefit of it. And by the way, while we’re at it, are there any core business benefits to deploying that? We talked about better information, but what do better and quicker answers actually mean to a business?

Charles Ferland: Well, I’ll give you an example of self-checkout. We’ve seen, during the pandemic, many retailers deploying these self-checkout kiosks. It turns out that many people forget to scan an object as they check out. Intentionally or not, the fact is that there’s some lost business there, right? Quite significant. Now, these retail environments already have cameras, right? By introducing Edge computing at that site, with powerful GPUs to process the images and recognize the hand movement, we’re now able to recognize if somebody forgets to scan an object or if there are two objects, one on top of another, right?

Retailers in this case are able to cut down their shrinkage, or their loss, by 80%. So the payback for these retailers is less than two months, typically, when we do the business case. So when you talk about benefits, yes, benefit number one is you’re able to improve your bottom line. However, once you have that compute on the premises, you’re now able to track people walking in the store, see how long they’re staying in the store, which areas they’re interested in, which areas they’re not. So it’s all these additional benefits that can come out of Edge computing. But the core use case in that scenario is, how do I reduce my shrinkage? How do I improve my bottom line?
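The shrinkage use case ultimately comes down to correlating what the vision system saw at the kiosk with what the POS actually scanned. Here is a minimal sketch of that correlation step, with illustrative item names rather than any real retailer’s data or system.

```python
# Minimal sketch: flag items the vision pipeline observed at a self-checkout kiosk
# but the POS never scanned. Item names and data sources are illustrative only.
from collections import Counter

def missed_scans(detected_items: list[str], scanned_items: list[str]) -> Counter:
    """Items observed by the vision system but absent from the POS transaction."""
    return Counter(detected_items) - Counter(scanned_items)

# Example: the on-site vision inference saw three items, the POS recorded two.
detected = ["cereal", "cereal", "coffee"]  # from the video-analytics pipeline
scanned = ["cereal", "coffee"]             # from the POS transaction log

missed = missed_scans(detected, scanned)
if missed:
    print("attendant alert:", dict(missed))
```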

Patrick Moorhead: I love it. Charles, this has been a great conversation.

Charles Ferland: Thank you.

Patrick Moorhead: I could keep this going, but I know you have important people to meet with, and there’s always next time. And I want to have you back on the show in the future to do the double click on generative AI. What’s your learning, what do the deployments look like, what’s the state of that? I’d love to have you back.

Charles Ferland: I’ll always have time with you, Pat. I appreciate the opportunity. Thank you.

Patrick Moorhead: Thanks very much. This is Charles Ferland, Lenovo’s Edge Computing and carrier services lead here. It’s been a great discussion here so far. I love the Edge, I’m passionate about the Edge, and I hope you enjoyed it too. Hang in there for more Lenovo coverage at Mobile World Congress 2024. Check out all of the coverage. Thanks, and take care.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, his most recent book being “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
