5G Factor Video Research Note: NVIDIA Softbank Collaborate on Gen AI and 5G/6G Apps

In this vignette of The 5G Factor, Ron Westfall and Clint Wheelock provide their perspective on the 5G ecosystem dimensions of the NVIDIA and SoftBank collaboration on a platform for generative AI and 5G/6G apps based on the NVIDIA GH200 Grace Hopper Superchip.

The discussion spotlighted:

NVIDIA Softbank Collaborate on Gen AI and 5G/6G Apps. NVIDIA and SoftBank are collaborating on a platform for generative AI and 5G/6G apps based on the NVIDIA GH200 Grace Hopper Superchip, in support of SoftBank’s plans to roll these applications out at new, distributed AI data centers across Japan. The platform will use the new NVIDIA MGX reference architecture with Arm Neoverse-based GH200 Superchips, and is expected to improve the performance, scalability, and resource utilization of application workloads. We evaluate the prospects of the NVIDIA MGX architecture in helping the mobile ecosystem scale and broaden adoption of a wide array of apps, such as AI, HPC, and NVIDIA Omniverse.

Watch The 5G Factor show here:

Or, you can watch the full episode here, and while you’re there, subscribe to our YouTube channel.

Listen to the full episode here:

If you’ve not yet subscribed to The 5G Factor, hit the ‘subscribe’ button while you’re there and you won’t miss an episode.

 

Disclaimer: The Futurum Tech Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Ron Westfall: And with that, Clint, did you see any other key developments that demonstrate AI and 5G are coming together and making a big impact?

Clint Wheelock: Well, I mean, I tell you it’s hard to avoid the intersection of AI and 5G, just like it’s hard to avoid the intersection of AI and anything in the market right now. So in any case, absolutely, there are a number of stories we could pick from, but I thought one of the more interesting ones was the news from Nvidia and SoftBank about their collaboration on powering SoftBank’s data center build using the Nvidia Grace Hopper Superchip, which is utilized for generative AI and 5G, and I think ultimately for 6G as well. And Ron, I know you’ve been following this story too and I’m very interested in your thoughts, but a couple of initial impressions on my end and then it’d be great to dive into your perspective too. I know SoftBank has been building data centers that host generative AI and wireless applications on a multi-tenant common server platform all across Japan, with the goal of reducing costs and improving fabric-wide energy efficiency. And this all includes a top priority of continuing to advance SoftBank’s infrastructure to attain greater performance using AI, including optimization of the radio access networks themselves.

And in addition, SoftBank has signaled that they’re expecting AI to help reduce energy consumption, going back to our earlier theme, and to create a network of interconnected data centers that could be used to share resources and host a rapidly expanding range of generative AI applications.
So I know you took a look at this one too, Ron. I mean, what are your thoughts? What do you think are some of the key implications with the Nvidia and SoftBank news here?

Ron Westfall: Yeah, right on, Clint. And I think, first of all, we can anticipate that what SoftBank is doing is going to be emulated in other parts of the world. And when you drill down here, what we’re seeing is that the platform will use the Nvidia MGX reference architecture, leveraging the Arm Neoverse-based GH200 Superchips. And this is designed specifically to improve performance as well as scalability and resource use for emerging application workloads.

Now when you look specifically at Nvidia Grace Hopper, along with the Nvidia BlueField-3 data processing units, they are designed to accelerate software-defined 5G vRAN as well as generative AI applications without having to use bespoke hardware accelerators or specialized 5G CPUs. And in addition, the Nvidia Spectrum Ethernet switch with BlueField-3 is set to deliver highly precise timing protocols for 5G implementations. So as a result, the solution is designed to improve 5G speed on an Nvidia-accelerated 1U MGX-based server design, and it can also deliver 36 Gbps of downlink capacity. And what this means is that Nvidia is asserting competitive differentiation. Naturally, the shot at CPUs is targeted at players like Intel and AMD. But what they’re claiming is that when you look at the available data across 5G accelerators, they now come out with a competitive advantage at this juncture. But stay tuned, we know that this is a back-and-forth battle. So this is just a way for Nvidia to gain more attention for its 5G proposition.

However, from my view, I believe operators have struggled somewhat in delivering high-speed downlink capacity using, at least, today’s industry-standard servers. So now we’re seeing advances in server design to accompany advances in the design and architecture of the chips themselves, and of course the systems that are implementing them. So again, this is pointing to the 5G realm becoming more interesting here in the near future.

And I think one thing that’s also important to note is that Nvidia MGX is a modular reference architecture that is aimed at enabling system manufacturers and hyperscale customers to build over 100 different server variations to suit their needs for AI, HPC, and Nvidia Omniverse applications. So this is basically catering to reality. We know it’s complex out there, there are many different vendor-supplied solutions, but that, I think, is intriguing: Nvidia designed something specifically adapted to being able to run on any server implementation. So we’ll see how that plays out, naturally. That’s something I think will gain some attention.

Also, by incorporating Nvidia’s Aerial software for cloud-native 5G networks, the 5G base stations can allow operators to dynamically allocate compute resources. And what that means is they can potentially double power efficiency, basically a 2x gain. And so naturally, we know that there are sustainability initiatives out there, as well as even sustainability mandates. And this is something that the operators, as well as the cloud providers and everybody else in the 5G ecosystem, are keeping a close eye on: improving 5G performance while also meeting sustainability goals.
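To make that dynamic-allocation idea a bit more concrete, here is a minimal conceptual sketch in Python of how an orchestrator might split a shared pool of accelerator capacity between vRAN processing and AI workloads as radio traffic rises and falls. The function name, the 20% headroom factor, and the abstract capacity units are illustrative assumptions only; this is not the NVIDIA Aerial API or SoftBank's actual scheduler.

    # Conceptual sketch only: dynamically sharing accelerator capacity
    # between 5G vRAN processing and AI workloads. All names and numbers
    # are hypothetical and do not represent the NVIDIA Aerial API.

    def allocate_gpu_share(ran_load: float, total_units: int = 100) -> dict:
        """Split a fixed pool of GPU capacity based on current radio traffic.

        ran_load: current RAN utilization as a fraction between 0.0 and 1.0.
        total_units: abstract units of accelerator capacity in the pool.
        """
        # Reserve enough capacity for current radio traffic, plus 20% headroom.
        vran_units = min(total_units, int(total_units * ran_load * 1.2))
        # Whatever is left over can host AI jobs instead of sitting idle.
        ai_units = total_units - vran_units
        return {"vran_units": vran_units, "ai_units": ai_units}

    # Quiet period: most capacity is loaned to AI workloads.
    print(allocate_gpu_share(ran_load=0.2))  # {'vran_units': 24, 'ai_units': 76}
    # Busy hour: capacity shifts back toward the radio access network.
    print(allocate_gpu_share(ran_load=0.8))  # {'vran_units': 96, 'ai_units': 4}

In broad terms, that reuse of otherwise idle capacity is where the claimed power efficiency gains would come from: the same accelerators stay productive whether they are processing radio signals or serving AI workloads.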

Other insights from The Futurum Group:

NVIDIA & Snowflake

5G Factor: Chips Ahoy! How Chips are Integral to 5G Transformation with Qualcomm Promoting 5G Standalone, Juniper Beyond Labs Using Intel Xeon/FlexRAN, and Samsung MediaTek Boosting 3Tx Antennas

5G Factor Video Research Note: Red Hat Becomes the Primary Infrastructure Platform for Nokia’s Core Network Applications

Author Information

Ron is an experienced, customer-focused research expert and analyst, with over 20 years of experience in the digital and IT transformation markets, working with businesses to drive consistent revenue and sales growth.

He is a recognized authority at tracking the evolution of and identifying the key disruptive trends within the service enablement ecosystem, including a wide range of topics across software and services, infrastructure, 5G communications, Internet of Things (IoT), Artificial Intelligence (AI), analytics, security, cloud computing, revenue management, and regulatory issues.

Prior to his work with The Futurum Group, Ron worked with GlobalData Technology creating syndicated and custom research across a wide variety of technical fields. His work with Current Analysis focused on the broadband and service provider infrastructure markets.

Ron holds a Master of Arts in Public Policy from the University of Nevada, Las Vegas and a Bachelor of Arts in political science/government from William and Mary.

Clint brings over 20 years of market research and consulting experience, focused on emerging technology markets. He was co-founder and CEO of Dash Network, an integrated research and digital media firm focused on the CX market, which was acquired by The Futurum Group in 2022. He previously founded Tractica with a focus on human interaction with technology, including coverage of AI, user interface technologies, advanced computing, and other emerging sectors. After Tractica was acquired by Informa Group, Clint served as Chief Research Officer for Informa’s research division, Omdia, formed by the combination of Tractica, Ovum, IHS Markit Technology, and Heavy Reading, with management and content strategy responsibility.
Clint was previously the founder and President of Pike Research, a leading market intelligence firm focused on the global clean technology industry, which was acquired by Navigant Consulting where he was Managing Director of the Navigant Research business.

Prior to Pike Research, Clint was Chief Research Officer at ABI Research, a New York-based industry analyst firm concentrating on the impact of emerging technologies on global consumer and business markets.

Clint holds a Master of Business Administration in Telecommunications Management from the University of Dallas and a Bachelor of Arts in History from Washington & Lee University.
