
AI Rules at 5G Americas; T-Mobile, Verizon, Nokia Make Key Moves – Six Five Webcast: The 5G Factor

On this episode of the Six Five Webcast – The 5G Factor, hosts Ron Westfall and Tom Hollingsworth provide in-depth analysis on how AI emerged as the dominant theme at the recent 5G Americas conference, showcasing its critical role in the advancement and deployment of 5G networks.

Their discussion covers:

  • T-Mobile’s innovative use of AI to optimize network upgrades and expansion
  • Verizon’s strategy to monetize computing infrastructure tailored for AI operations
  • The potential of GenAI-enabled platforms like Nokia’s Network Services Platform to revolutionize network design and management
  • AI’s capacity to correlate data in novel ways, enhancing network operations’ efficiency and problem-solving capabilities
  • Foresight on AI’s evolving role in the telecom industry, particularly in driving 5G development

Learn more about the Six Five Webcast – The 5G Factor at The Futurum Group.

Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Six Five Webcast – The 5G Factor is for information and entertainment purposes only. Please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors.

Transcript:

Ron Westfall: Hello and welcome everyone to The 5G Factor. I’m Ron Westfall, Research Director here at The Futurum Group. And I’m joined here today by my distinguished colleague, Tom Hollingsworth, the Networking Nerd and event lead at Tech Field Day here at The Futurum Group. In fact, I believe we are coming off a successful string of Tech Field Days, and we’ll touch on that in a moment. But first, Tom, welcome back to The 5G Factor once again, I trust you’ve been bearing up since our last episode pretty decently.

Tom Hollingsworth: Yeah, I have. Spooky season is almost over, but I’m on the lookout for all of those things that are still hiding out there. It’s a world of security breaches and all kinds of other stuff, so I’m kind of excited to be talking about 5G and some cool things for once.

Ron Westfall: Right on. And I think nobody’s going to be surprised if we inject AI into our conversation about 5G. And before we dive right into The 5G Factor, I just want to let folks know, speaking of Tech Field Day, there’s an upcoming Networking Field Day event on November 6th and 7th. With a host of innovative maverick companies, at least I believe so, including Arista, Meter, Itential, Path Solutions, Elisity, and Aviz. And Tom, do you have anything to add to the upcoming Networking Field Day we’ll be participating in early November?

Tom Hollingsworth: I bet you you’re going to hear some topics around AI, a couple of these companies are using it in their platforms and a couple of them are using their platform to help enable it. So if that’s something that’s in your wheelhouse, make sure you head over to techfieldday.com and set your calendars because we have the presentation schedule lined up. You might even hear the dulcet tones of Ron’s voice in the audience.

Ron Westfall: Well, thanks for the shout-out, Tom. I’m confident we’ll be hearing your voice as the event lead, so this is all, I think, kismet. And that’s also, I think, a great segue into diving into 5G matters, after all it’s all about the ecosystem and they’re all interrelated. And recently I was at the annual 5G Americas Analysts event, and lo and behold, AI was the hottest topic. Nobody’s surprised. But what I think is going to be interesting is what the operators are thinking about, potentially, in terms of how they can play an integral role in the evolution of AI and hopefully monetizing it to customers out there, both consumers and naturally enterprises. And to start off, T-Mobile is a member of 5G Americas and was a participant there. And they also certainly spotlighted the fact that they’re using AI technology as well as billions of data points to determine exactly how to upgrade and expand their network.

And basically, it’s an effort that could be described as customer-driven coverage. And it’s a strategy that T-Mobile has put into action after developing it for more than a year. So already we’re almost two years into this whole advent of the AI era, as it could be aptly described, I believe, and so we’re seeing T-Mobile as a player, as an operator specifically, that is implementing this to really determine how their network is going to be built. That is, they’re taking new AI capabilities and just making the network better. After all, many of these companies have been using AI to some degree, at least machine learning. And now what we’re seeing is, I think, the taking off of these capabilities. And to dive down a bit more, T-Mobile is correlating these data points with business data and with real customer outcomes. So this is really about the bottom line now, this is connecting, okay, here’s AI, the technology, the capabilities, and how is it going to do just that, improve business outcomes as well as customer experience.

In other words, we’re seeing that the customer is integral to how they’re going to build their network. And how’s that coming about? Well, T-Mobile is assigning a customer lifetime value, or CLV, to a grid across the country. And now that’s more than 4 million hexagons, as they describe it, that have been created across the country here in the US. And T-Mobile is using these 165-meter-wide hexagons to assign those values relative to the competition, to allow it to know exactly where the company can build to best satisfy customers, naturally, but also to expand its presence and win mindshare, win market share, of course. Now, T-Mobile has tens of thousands of future projects that get ranked based on some practical issues that we’re all familiar with, certainly in the operator space: zoning, permitting. And a lot of that’s based on outputs from its AI-driven algorithmic model now.
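To make the hexagon idea more concrete, here is a minimal, hypothetical sketch of how a customer-driven coverage model could score and rank build projects over a grid of hexagons. Every name, number, and formula below is invented for illustration; this is not T-Mobile’s actual model, just the general shape of the approach Ron describes.

```python
# Hypothetical sketch: assign a customer lifetime value (CLV) to each grid
# hexagon, then rank candidate build projects by the value they could unlock.
from dataclasses import dataclass

@dataclass
class Hexagon:
    hex_id: str              # e.g., an H3-style geospatial index (illustrative)
    clv: float               # aggregate customer lifetime value in the cell
    competitor_share: float  # 0..1, share of subscribers held by rivals here
    covered: bool            # do we already have good coverage here?

@dataclass
class BuildProject:
    name: str
    hexagons: list           # hexagons a new site would cover
    permit_risk: float       # 0..1 penalty for zoning/permitting friction

def project_score(p: BuildProject) -> float:
    # Value of a project: CLV we could win in uncovered, contested hexagons,
    # discounted by the practical build friction Ron mentions (zoning, permits).
    winnable = sum(h.clv * h.competitor_share for h in p.hexagons if not h.covered)
    return winnable * (1.0 - p.permit_risk)

hexes = [
    Hexagon("8928308280fffff", clv=120_000, competitor_share=0.6, covered=False),
    Hexagon("8928308280bffff", clv=80_000, competitor_share=0.2, covered=True),
]
projects = [BuildProject("candidate-site-A", hexes, permit_risk=0.3)]
for p in sorted(projects, key=project_score, reverse=True):
    print(f"{p.name}: score {project_score(p):,.0f}")
```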

And again, it’s about this customer-driven coverage that I believe can enable them to stay ahead of the demand curve at the very least, let alone maintain a competitive edge. Also, T-Mobile is not just using AI on the algorithmic side, but also on the GenAI side. And what I thought has been somewhat underappreciated, across all of these AI announcements by T-Mobile but also by all the operators, is the fact that T-Mobile is engaged in an alliance with OpenAI. That alliance basically combines T-Mobile’s expertise in cultivating customer relationships with OpenAI’s AI technology, both the knowledge as well as the research and development experts, to basically custom build an innovative intent-driven AI decisioning platform. And for now, they are calling it Intent CX, or Intent Customer Experience.

Now, I think this is significant because, with the secure access to T-Mobile data and the ability to comprehend customer intent and sentiment in real time, a capability that will be available starting in 2025, Intent CX will, I think, have the ability to apply meaningful understanding and knowledge of the customer to everyday interactions. So it’s not just about customer support, it’s about the entire experience whenever you have to use a T-Mobile network or interact with T-Mobile at any level potentially. And that I think is something that is going to be a difference maker. T-Mobile’s proactive approach to AI I think has been a bit more creative, and as a result, I think it can allow it to maintain its competitive edge against major rivals, AT&T and Verizon, at least judging by the most recent Q3 results that have come out. And with that, Tom, what do you see that is just making a difference, that’s even potentially groundbreaking in terms of how operators can use AI, well, to make a big difference?

Tom Hollingsworth: It’s the data, that’s the part that I don’t think they’re really getting yet. And T-Mobile finally figured it out. They have access to all the data they could ever need, they have all these handsets that are out there sending back telemetry and helping them understand coverage patterns and growth patterns and travel patterns. And up until now, they really haven’t been doing anything with it, because the problem is it’s a massive amount of data and what am I supposed to do with it? Well, you’re supposed to sift through it, that’s what we’ve always done with analytics. And having an AI platform that can go out there and can say, okay, we’ve noticed these kinds of use patterns or we’ve noticed that when these things happen, this happens. That is hugely valuable. But another thing that they’ve done that I really wish other people would do, is they’ve broken it down into those hexagons.

They’re not taking these wide coverage areas of a tower and saying, oh, we know we’ve got coverage over here and we’ve got coverage over here. If you’re an enterprise Wi-Fi person, you know what this feels like. Oh, well, we’re just going to put an access point right here, and as long as everything’s green, we’re good, right? No. No, we’re not. Because at the edges of those cells is when weirdness starts happening, right, and even if it looks like you’ve got good coverage, what happens when your handsets flap back and forth? How do I improve that capability? How do I tell my handset, you need to start being more selective about sticking to a tower and maybe dropping when your noise floor hits a higher threshold? And that’s what AI will do, is it will analyze all of these data points in these areas, whether they’re directly underneath the tower or at the fringe, and then they’ll push those recommendations back to T-Mobile.

And it’ll say, okay, for these areas, you need to change your thresholds, for these other areas, it looks like you’re getting a big growth pattern in this direction and you need to be able to build more towers in this area in order to cover all those potential users. And when you think about the way that these companies market their networks, I remember a time when having a cell phone was like, oh, well, you get coverage along this interstate because that’s where our towers are, but as soon as you get off of there, good luck, because we don’t really have any coverage.

Now, people expect to have coverage everywhere. It is an odd situation when you don’t. And for T-Mobile to basically kind of have a chance to move into that number two spot, or at least contend to be in that number two spot, they’ve got to have better coverage and happier customers. But going along with that, a lot of the ways that companies have historically used this data is they’ve relied on their customers to tell them when something’s not working right. That doesn’t always work. I mean, one thing is people tend to bias in the wrong direction. If they’re having a bad day or they’re having a particularly bad incident, they’re going to give it the lowest possible rating, even though the call completed and you could kind of hear what was going on on the other side.

Likewise, they tend to over-bias toward being more permissive when they really shouldn’t be, it’s like, oh, well, I heard robot voice once, but how bad of a deal can that be? So by allowing the handsets to give them backup data on this and being able to pull quantitative analysis off of it, what they’re really allowing is an unbiased opinion. And the consumer may not understand how valuable that is, but they’ll understand how valuable it can be later on when T-Mobile puts the right assets in place to make their experience a whole lot better, and the customers didn’t have to do anything to get there.
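To picture the telemetry side of Tom’s point, here is a small, hypothetical sketch of turning raw handset reports into an unbiased per-hexagon quality score instead of relying on subjective customer ratings. The record layout, the SINR and drop-rate thresholds, and the hexagon IDs are all invented for illustration.

```python
# Hypothetical sketch: aggregate handset telemetry per hexagon and flag cells
# whose signal quality or drop rate suggests edge-of-cell problems.
from collections import defaultdict
from statistics import mean

telemetry = [  # one record per handset report: (hex_id, sinr_db, call_dropped)
    ("hexA", 18.0, False), ("hexA", 15.5, False),
    ("hexB", 2.0, True),   ("hexB", 4.5, False), ("hexB", 1.0, True),
]

by_hex = defaultdict(list)
for hex_id, sinr, dropped in telemetry:
    by_hex[hex_id].append((sinr, dropped))

for hex_id, samples in by_hex.items():
    avg_sinr = mean(s for s, _ in samples)
    drop_rate = sum(d for _, d in samples) / len(samples)
    # Weak signal or elevated drops hint at handsets flapping between towers
    # at the fringe, exactly the case where handover thresholds need tuning.
    if avg_sinr < 5.0 or drop_rate > 0.2:
        print(f"{hex_id}: review handover thresholds "
              f"(avg SINR {avg_sinr:.1f} dB, drop rate {drop_rate:.0%})")
```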

Ron Westfall: Yeah, I think that is succinctly put. I couldn’t agree more, Tom. In fact, I think it’s showing that T-Mobile is being smart about that, how to really leverage AI and quite simply make themselves a more intelligent carrier. And we didn’t even touch on the recently formed AI-RAN Alliance along with Nvidia, naturally, as well as Ericsson and Nokia. But the idea is the same principle: how can we even further optimize RAN capabilities using AI at the cell sites themselves or where the RANs are distributed and so forth, combining AI computing with RAN optimization. And so stay tuned; that doesn’t yet have a tangible timeline like the other AI initiatives we touched on. However, I think it’s just indicating this is the smart thinking that needs to take place for the operators to really recreate how they can be much more meaningful not only to their customers, but across society. I think that’s a great point, that now people expect connectivity, at least mobile connectivity, and that’s something that is on par with utilities like electricity and water and so forth.

And with that, I think it’s a great way to now touch on what the operators are going to do to really take advantage of not just making their networks better, the customer experience better, but how they can potentially monetize serving up AI. So let’s shift it a little bit and let’s look at what Verizon came out with just within the last couple of days. And what we saw is that the Verizon executives said that they’re planning to profit from the sale of computing infrastructure for AI operations. But this is just committing to the principle of it, they haven’t come out with details yet. But I think we can bet that they are leaning toward coming out with those details in ’25 at least, or at least they really need to. And so what Verizon has done is indicated it’s getting a lot of good orders from the hyperscalers for its dark fiber or lit fiber, and it’s going to simply keep growing.

Now, what Verizon believes is that it’s more than that: it’s not just about the fiber, but also its assets in terms of power, space, and cooling resources, which are in really high demand, combined with having the resources in the right places to really assist with things like latency and other key considerations when it comes to AI workload performance or AI experience optimization. And I think this is also important because we’re seeing the hyperscalers already reaching such critical power consumption and energy demand levels that they’re investing in modular nuclear energy, that they’re investing in re-firing up nuclear plants, Three Mile Island, we saw that. So this is something that I think is also going to impact the operators in terms of how they can also get into this AI game.

Now, what I think is important though is that they’re not going to compete directly against the hyperscalers when it comes to heavy-duty, large language model training at the major data centers. And that is where the GPU clusters are heavily concentrated right now, driving up all this energy demand and also, well, quite simply, ensuring that these AI capabilities will be in place for where the next shoe will very likely drop. And that is on the AI inferencing side, which enables the resources at the edge, throughout the edge, to take advantage of the large language models and other models that have been trained, allowing customers or organizations to get intelligent AI, well, interactions, like what we just saw with the T-Mobile example. And so what I think is going to be interesting is that this is kind of the Back to the Future scenario for the operators, where the business case and monetization of AI inferencing can provide the warrant for the service providers to go ahead and take advantage of all these edge resources that not even the cloud providers have.

That is, central offices and other hubs where they have equipment already that can potentially host computing much closer to the customers. And that I think can drive offerings such as AI-as-a-service, which was specifically invoked with the AI-RAN Alliance debut. Now, telecom infrastructure can play a role in the future of AI, I believe, because as consumers and businesses use the technology, it’s going to require driving AI processing a great deal outside of those massive data centers and quite simply closer to the user, including those devices that are always accessible to the folks out there using a mobile network. Now, however, the idea of operators selling computing resources is not new, we’ve been here before. In fact, we’ve seen that operators including Verizon exited that model when they sold their data center businesses to companies such as Equinix about a decade ago.

But what’s different here is that they can be, I think, more intelligent or smart about how they distribute the AI computing resources. It doesn’t have to necessarily be newly built data centers, but using RAN locations and using other parts, other already existing resources to make this happen. And Tom, I know that’s a lot, but from your perspective, do you think that operators are up to it? Can they return to the future and start selling AI computing resources across their vastly distributed edge resources?

Tom Hollingsworth: Yes. Not only do I think they can, I think they should, and they may be the only ones that are capable of doing it right now. There was a stat that came out of the Open Compute Summit last week that said that data center power budgets are expected to triple in the next five years. Where are we going to get all that power? You mentioned restarting Three Mile Island, there’s research being done into small-scale nuclear reactors and the kinds of things that can give us dozens or even hundreds of megawatts of power in order to run these things. But companies like Verizon and others have resources that are already positioned out there. And like you said, the central office, that’s kind of what that whole edge computing thing was all about, was positioning these resources close to the edge where they can get good response times.

And I think that one of the things that Verizon really wants to focus on here is that they have more compute resources closer to the people that need them. And like you said, they’re not going to be running these gigantic Blackwell water-cooled systems that are consuming dozens of kilowatts of power with every cycle, they’re going to be doing things that are a little bit more focused. And that’s really where Verizon’s data comes in handy, is they know what their customers want, they know what their customers are using. So rather than trying to boil the ocean and create new algorithms and new LLMs that allow their customers to come up with these grand new ideas, they can literally just offer the things that their customers would’ve normally wanted in the first place. Things like voice transcription, how many times have we seen the value of being able to turn on closed captioning on something? Well, what if you could do that live?

That’s not something I necessarily want to do at a central location because every added millisecond of time creates lag in that call. So if it’s something that I can do on the edge, where the PoP is the one doing all the heavy lifting before it gets sent on through the network, that’s a huge benefit for me. And that’s just one example of the things that Verizon’s wanting to do. Because building out these massive Equinix-style data centers is going to take time, you can’t just drop a couple of extension cords in there and Bob’s your uncle. You’ve got to put in massive new resources, massive new cooling capabilities, and if Verizon can offer a stopgap for people that need to ramp up quickly, then that’s valuable for them. And you talked about the fact that 10 years ago they sold off all their data center assets. 10 years ago we thought the cloud was the way to go, right? Everything was going to be based in Corvallis and Reston and that was that, right?
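Tom’s millisecond point is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below uses the rough rule of thumb that light in fiber travels at about 200,000 km/s, roughly 5 microseconds per kilometer one way; the distances are illustrative, not any operator’s actual topology.

```python
# Rough latency budget for live tasks like call transcription: propagation
# delay alone, before any queuing, processing, or inference time.
FIBER_US_PER_KM = 5.0  # ~200,000 km/s in fiber => ~5 microseconds per km, one way

def network_rtt_ms(distance_km: float) -> float:
    return 2 * distance_km * FIBER_US_PER_KM / 1000.0

for label, km in [("edge PoP / central office", 30),
                  ("regional data center", 500),
                  ("distant hyperscale region", 2500)]:
    print(f"{label:26s} ~{network_rtt_ms(km):4.1f} ms round trip")
```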

We didn’t need all of these other data center assets, colos were on death’s door. I can remember that: cloud’s going to take my job. And it wasn’t until AI became a thing 18 months ago that people really started looking at the possibility of needing to host their own stuff again. And so I think Verizon has really picked that up, and they were already headed in the edge computing direction to begin with. I think it was more, at the time, a solution in search of a real problem to utilize it. I think AI is that problem, for lack of a better term, and the resources are available and ready to be consumed if you’re intelligent about how you consume them. And you’re not going to be dropping these heavy LLMs on things; what you’re probably going to be doing is offering this as a cluster to tenants who want to be able to distribute this workload, maybe run it overnight and get the results back the next day, as opposed to, I have to have this in the next hour.

Ron Westfall: Right on. And I think that is something that is going to happen. I think the operators are going to already have the plans in place and make these steps to figure out how they can quite simply take advantage of it. And they need to, because as we know, the 5G space hasn’t exactly been a huge revenue generator for the operators. And now with AI, it can not only fire up 5G and the next iterations of 5G, as well as 6G, in terms of revenue diversification and potential, but also any connectivity, fiber and so forth. All of these assets can take advantage of this AI potentiality, and I think this is something that is going to create new competitive dynamics. And it’s just an interesting time to be watching all this. And this is also bringing to mind that, as impressive as AI has been in some areas in terms of large language model training and the capabilities that these GPU clusters can enable, at the edge we can use CPU clusters.

In fact, Nvidia offers CPU clusters. People sometimes might not know that. But I think it’s interesting that when it comes to the edge, this is where we can see, again, the small language model approach and these other right-sized approaches that the operators can sell to the customers out there. But what I was getting to in terms of what is somewhat limiting AI’s capabilities is the fact that when you’re looking at AI workload performance, it is slowed or delayed by 30% by the network. I think we’ve seen data points to that effect and similar ones. And so while we might have some very impressive heavy-duty performance going on inside the data center, it’s also the network within the data center, as well as across the WAN and other parts, that quite simply is going to be important to take full advantage of these AI capabilities. And so this is where I think AI, or GenAI specifically, can make a difference, and that is in the area of advancing network automation.

I know we just recently had an exclusive on Tech Field Day with Nokia that focused on this very vital issue, and I think we’re seeing solutions and advances that can make a big difference here. So in essence, when we are looking at AI, it’s about semantics and data, which together can bring about new ways to express disruptive insights, sharper actions, and so forth. And so what I think is going to happen is that semantics is about picking up everything that can be said amongst stakeholders, that is certainly amongst network planners, network engineers, as well as the other teams that are going to be important in terms of making the network itself more performance-capable, that is, all the more automated. And there are existing barriers out there that are significant. There’s been some incremental progress, but I think this is where we can see this making a catalyzing difference, that is, using GenAI natural language prompts to have intent-based networking become basically the norm, if you will.

And so we’re seeing a lot of research that is being conducted around the idea of semantic spaces. And what this is linking to is that it’s enabling what can be quite simply described in terms of network automation, the autonomous network. And as we’ve seen with the TM Forum’s vision of an autonomous network, it’s really the path to zero wait, zero touch, zero trouble, et cetera. That is, a network that performs just as simply and as easily as, say, a local ATM, except writ large. And we know that it’s not going to happen overnight, but I think this is an interesting approach and interesting solution that can make a difference. That is, through network automation, all the more impressive SLAs can be rolled out and fulfilled for these AI workloads and all these other workloads that are also expanding dramatically.

And so, what we’re seeing is vendors like Nokia coming out with how they can make this happen. And already we see existing solutions such as the Network Services Platform, NSP, by Nokia offering these automation capabilities, already being used by over a thousand network operators out there. So this is something that’s in place and I believe can make a difference in terms of enabling that end-to-end IP network that allows the automation to be extended throughout the network, across the customer’s premises, over aggregation and xhaul domains, as well as gateways of various kinds. So this is something that’s important. It has to be organization-wide, and it has to be something that is enduring, that is going to enable just that, the entire automation of the network, not just portions of it or at the individual level.

Now the good news is that the addition of GenAI capabilities allows the Nokia Network Services Platform to correlate these data streams that I touched on in new ways, because it can provide reasoning that helps the platform understand what it sees, discover new relations across the networks, and identify root causes of events, getting closer and closer to that idea of an intent-driven network. Ultimately, I believe we’re getting closer to using GenAI-enabled platforms such as Nokia NSP that can act as an assistant to the customer’s network designers, planners, engineers, operations staff, and so forth, and quite simply evolve the platform’s language model toward a network language model specific to that service provider organization, exclusive to them, enabling these possibilities. And so, Tom, with that, from your view, I know that these barriers are real, but do you think this type of capability, this type of approach, can actually move the needle more in terms of making network automation happen?

Tom Hollingsworth: Yes. And part of the reason why we run into the network automation problems that we have comes courtesy of our friend Mike Bushong at Nokia, who was talking about this during that exclusive event, where he said, “A lot of people have these concepts of the way that things just should be done.” And so they go out and they try to automate data entry tasks, the easy things, I don’t want to have to check this thing to program a VLAN or whatever, and they eventually hit a barrier because once they’ve solved all the easy problems, the only things left are hard problems. And you have to pick the hard problem that you want to tackle, and eventually people just kind of give up. Think about the way that some systems are basically forced into automation now. An API is nice, but what if you don’t have one? Well, the system has to log in, dump the config, analyze the config, figure out what needs to change, it has to go back in, it has to rewrite the config.

It’s basically doing human things superfast without human interaction. And you’re probably sitting there thinking to yourself, well, that’s dumb. Well, it is dumb. And the reason why is that the first step away from that is an API, an application programming interface. It would be like if I was going to paste something into a Word document, I have to pull up the thing over here and I have to copy the text out and I have to paste it back over here and I have to change formatting. You don’t have to do that, right? You can just hit share and insert it into the document. And that’s an API, right? We’re sending information to another program through a call and it shows up over there. That’s how humans interact with it. What Nokia is proposing is they want to extend that, they want to create more things that allow the automation operations to run even faster.
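To illustrate the contrast Tom is drawing, here is a hypothetical sketch of the two patterns side by side: the screen-scraping approach that does “human things superfast,” and a single API call that states the change once. The CLI commands and the REST endpoint are invented stand-ins, not any specific vendor’s interface.

```python
import json
import urllib.request

class CLISession:
    """Stand-in for an SSH session to a switch that has no API."""
    def run(self, command: str) -> str:
        print(f"(cli) {command}")
        return ""  # a real session would return device output

def add_vlan_by_scraping(session: CLISession, vlan_id: int, name: str) -> None:
    # Old pattern: log in, dump the config, analyze it, rewrite it.
    config = session.run("show running-config")  # dump
    if f"vlan {vlan_id}" in config:              # analyze
        return                                   # already present, nothing to do
    session.run("configure terminal")            # rewrite, line by line
    session.run(f"vlan {vlan_id}")
    session.run(f"name {name}")
    session.run("end")

def add_vlan_by_api(base_url: str, token: str, vlan_id: int, name: str):
    # API pattern: state the change once and let the system apply it.
    req = urllib.request.Request(
        f"{base_url}/api/v1/vlans",  # hypothetical endpoint
        data=json.dumps({"id": vlan_id, "name": name}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

add_vlan_by_scraping(CLISession(), 120, "pos-terminals")
# add_vlan_by_api("https://controller.example", "token", 120, "pos-terminals")
```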

So that as we enable these things, as we create these new languages to do that or create new metadata that allows us to understand it, that the automations don’t need human interaction. It goes back to that whole thing of, think about using a Rube Goldberg machine to automate a door lock, right, I have the bar that opens it up and closes it up and things like that. Well, the next phase is to have a door lock that doesn’t have a handle on it, because if it’s all automated, why do I put a handle there? Other than for emergencies. And so we’re going to slowly start removing these capabilities that we don’t need anymore, like screen scraping, and we’re going to get to a point where the systems can run on their own, and it works better that way. The idea of having an operating system that I don’t have console access to scares me.

But there are a lot of people that don’t know there’s a terminal window in macOS or Windows or anywhere else because they’re just used to the operating system working the way that it operates. Or moving into cloud networking, why do I ever need to console into this switch? I can do everything through the web, I can click here and program this stuff and take care of that. And once I know how to do that, then I can have a system replicate that for me, I can feed it scripts, I can give it intent. And that’s the whole purpose behind intent-based networking, is not to remember, oh, this is a Juniper switch, I need to use Junos syntax. It’s to say, I would like to configure this application to use these ports and these networks and do this thing, and then the system figures out how to do that for you. And that’s true automation in my mind.
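Here is a toy sketch of that intent idea: the operator declares what they want as data, and renderer functions, standing in for the system, emit two different vendor dialects from the same intent. The intent schema is invented, and both renderers only loosely imitate real CLI syntax (one of them set-style, as on a Juniper switch).

```python
# Declare *what* you want once; let the system work out *how* per device.
intent = {
    "application": "pos-terminals",
    "vlan": 120,
    "ports": ["eth1/1", "eth1/2"],
}

def render_vendor_a(i: dict) -> str:
    # One vendor's CLI dialect for the intent.
    lines = [f"vlan {i['vlan']}", f" name {i['application']}"]
    for port in i["ports"]:
        lines += [f"interface {port}", f" switchport access vlan {i['vlan']}"]
    return "\n".join(lines)

def render_vendor_b(i: dict) -> str:
    # A different, set-style dialect; the operator never types either one.
    return "\n".join(
        f"set interfaces {p} unit 0 family ethernet-switching "
        f"vlan members {i['vlan']}"
        for p in i["ports"]
    )

print(render_vendor_a(intent))
print(render_vendor_b(intent))
```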

It’s not making my typing day a little bit easier type of thing, it’s me making a decision based on business process that executes technical capabilities on the back end. That’s how cloud functions, right, nobody goes in and you’re like, okay, what VLAN am I supposed to use? No. They say, I need to deploy this application to support this many users in this location, and then the rest of it gets taken care of in the background, and they don’t know that. And I think that that’s one of the reasons why using GenAI to do these things is so important now, because for better or for worse, AI is this magical idea that things can just happen, and I don’t need to deal with it. And you’ve seen the way people ask ChatGPT to give them summaries of things and stuff like that; it just works.

Well, if we can program the GenAI things to make other stuff just work, then we get over those humps that we get into in these automation journeys where people get stuck on problems that shouldn’t be as hard to solve as they are, but it ends up derailing the whole project. So I applaud Nokia for doing this, I hope that there’s more traction for this beyond just their operating platforms and their operating systems, I hope that they can bring this to market enough that other people start to embrace it and that it really becomes a game changer.

Ron Westfall: Yeah, no, I think those are very important points. I think one thing that’s a difference maker here is using an existing platform, applying GenAI capabilities, and it’s already distributed amongst a thousand-plus operators. So that I think can in itself be a difference maker. But I think another key takeaway that was emphasized, or talked about a good deal, at 5G Americas was the fact that, because AI is such a strategic business priority for almost all organizations, there’s a broadening in the decision making. And that is, hey, we see CXOs and other key decision makers saying, well, GenAI is making a difference in things like ChatGPT prompts that can help summarize, say, a webcast script, and other capabilities, and it’s helping with customer service, lowering coding requirements, improving field tech performance. Then, why not other important areas such as network automation?

And I think with Agentic AI, tying together all of these domains, all the knowledge and data from these different domains, into a natural language interface that allows the decision maker to take advantage of the capability can be just that massive breakthrough, where even massive networks can be a great deal more automated and more intuitive and easier to support and build and so forth. So I think that is actually good news. And with that, speaking of good news, don’t forget, we have our Networking Field Day event coming up November 6th and 7th. And again, we’ll be diving into how AI can even be all the better applied toward improving networking. Tom, now that we’re getting closer to Halloween and all that good stuff, any thoughts to add? Should AI be something that we should fear or something that we should embrace?

Tom Hollingsworth: I don’t fear it, because I have a non-conducting blade right under my desk so that if I have to cut it off, then I’m totally fine. I think that just like any other tool, right, a weed burner is a valuable tool for people who need to work outside, it’s a terrible tool for people who want to use it as a flamethrower, you’ve got to figure out how to apply the AI properly in order to get the outcome that you want. And of course, the most important thing is do not ever let it become self-aware. And if it ever does, then Crystal Palace in the mountains is a good place for me to hide out. And Ron, I’ll save you a seat.

Ron Westfall: Shout out to The Terminator movie series. Fair enough. Well, with that, thank you everyone again for joining us. Again, be sure to bookmark The 5G Factor, it’s on The Futurum Group website as well as the Tech Field Day site, it’s all interlinked with The Futurum Group. And with that, thank you again, Tom. And everybody have a great AI-enabled 5G day.

Author Information

Ron is an experienced, customer-focused research expert and analyst, with over 20 years of experience in the digital and IT transformation markets, working with businesses to drive consistent revenue and sales growth.

He is a recognized authority at tracking the evolution of and identifying the key disruptive trends within the service enablement ecosystem, including a wide range of topics across software and services, infrastructure, 5G communications, Internet of Things (IoT), Artificial Intelligence (AI), analytics, security, cloud computing, revenue management, and regulatory issues.

Prior to his work with The Futurum Group, Ron worked with GlobalData Technology creating syndicated and custom research across a wide variety of technical fields. His work with Current Analysis focused on the broadband and service provider infrastructure markets.

Ron holds a Master of Arts in Public Policy from University of Nevada — Las Vegas and a Bachelor of Arts in political science/government from William and Mary.
