In this edition of the Futurum Tech Webcast, my colleague Ron Westfall and I take a deep dive into the collaborative world of virtual reality and 3D simulations offered by NVIDIA’s Omniverse. First announced as a beta version last fall, NVIDIA’s Omniverse Enterprise is set to hit general availability this summer and has the ability to reshape the future of manufacturing and design, and perhaps much more.
It’s not unusual for technology companies to offer tools to help their customers get the most value out of their products or services. But these tools are generally limited to configuration and optimization. NVIDIA’s Omniverse suffers from none of those limitations, and while it may help manufacturing and design teams better utilize NVIDIA’s chip technologies, the real value of the Omniverse is in helping manufacturers collaboratively design, update, and improve their manufacturing operations.
In this webcast we discuss:
- How the NVIDIA Omniverse looks a lot like the Marvel Cinematic Universe.
- The role of Pixar’s open-source Universal Scene Description (USD) software in the Omniverse.
- How the Internet of Things (IoT) and a concept known as a Digital Twin are helping organizations create real-time virtual versions of their real-world, physical operations.
- How the Omniverse can help organizations improve factory and manufacturing efficiency and safety.
- The benefit of running simulations in the Omniverse and leveraging machine learning and predictive analytics to improve asset maintenance and availability.
- The five different components that make up the Omniverse Platform (the Nucleus, Connect, Kit, Simulation, and RTX Renderer).
- How the Omniverse allows real-time updates and collaborative changes to be distributed throughout an organization’s supply chain and distribution ecosystems.
As we wrap up our conversation, Ron highlights some of the core RTX technology that makes the Omniverse possible and discusses how NVIDIA is working with partners like Microsoft and how competitors like AMD might need to respond. And finally, we ponder the ultimate question: when will tools like the Omniverse allow us to not just model manufacturing but extend to the operations of an entire business, from the first supplier to the last customer?
Want to watch the video of our conversation? You can grab it here:
Or grab the audio here:
Disclaimer: The Futurum Tech Podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.
More Insights from Futurum Research:
NVIDIA GTC 21 Brings Key Innovation to Enterprise AI
NVIDIA’s Future is Bright as its Many Bets are Paying off: Q4 Earnings Update
NVIDIA’s DRIVE Platform to Power Hyundai’s Newly Launched Connected Car OS Across Entire Fleet
Transcript:
Fred McClimans: Welcome to this edition of the Futurum Tech Webcast. I’m Fred McClimans joined here by my colleague, Ron Westfall, and we’re going to dive into NVIDIA and the Omniverse today. So before we get into that, though, I do want to make it clear that we may talk a little bit about NVIDIA and other companies out here. We’re trying to inform and educate and just have a conversation about what they’re doing. Please do not accept anything that we offer here as any kind of investment advice in any way. We’re here for infotainment? Infotainment. To educate, to have a good conversation. But again, please don’t take anything that we’re talking about here as any type of advice for investing in NVIDIA or any other company that we may mention today.
So Ron, last week, big event for NVIDIA, their big annual event, all virtual, and they just nailed it, in my opinion. The most amazing thing that I saw in that entire session was the Omniverse. And it’s sort of an interesting thing for somebody like NVIDIA to get into, because the Omniverse, it’s not a chip, it’s not a piece of hardware, it’s software. And in fact, it’s software that’s based on open source software, the open USD toolkit. But what they’ve done here with it is NVIDIA has basically taken, in my opinion, sort of the same model that they’ve had in the past, where they look at industries and they really kind of break it down and say, “How can we help this industry grow? What can we do to further our customers’ efforts moving forward?” They did that in gaming. They did it perhaps accidentally with Bitcoin mining. Certainly they’ve been one of the pioneers moving into artificial intelligence and autonomous vehicles with their technology. And now with the Omniverse, they’re doing it again.
So what I’d like to do, Ron, is just I’ll explain a little bit about the Omniverse and why I think it’s really cool, and then I’m going to ask you to talk a little bit about the technology side of that, and perhaps what some of the competitors out there are doing with this. So let’s step back for a moment, and when we think about Omniverse, it’s really easy to think about the multiverse or the Marvel Cinematic Universe, and actually they’re all really kind of tied together. So what we’re talking about with the Omniverse is the ability to create a virtual 3D environment that, in this case, is designed to mimic real-world scenarios, real-world operations out there. Your business, your company, your manufacturing plant, your chip designs, something that’s real world.
Now the cool thing about the Omniverse, is the Omniverse is actually based on the USD software that Pixar developed way back as a way for them to create a virtual 3D world in which their movies would exist. So what does this have to do with a chip company like NVIDIA? Well, what we see happening here in the real world is the internet of things or the industrial internet of things. Sensors being put on everything that you can imagine, from individuals working on a factory floor, to the robotics, to the assembly line devices. Everything now has the ability to be monitored in real time and to create so much data that we can create what we call a digital twin, that is essentially a virtual rendition of what’s going on in the real world.
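At its core, the digital twin idea Fred describes here is just a live virtual state continuously updated from physical sensor readings. As a loose conceptual sketch only (this is not NVIDIA's or any vendor's actual API, and the sensor names are made up for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """A toy digital twin: mirrors the latest reading from each physical sensor."""
    state: dict = field(default_factory=dict)

    def ingest(self, sensor_id: str, reading: float) -> None:
        # In a real IoT pipeline these updates would arrive continuously
        # over a message bus; here we simply apply them directly.
        self.state[sensor_id] = reading

# Hypothetical sensors on a factory floor feeding the twin in real time.
twin = DigitalTwin()
twin.ingest("robot_arm_3/temperature_c", 41.7)
twin.ingest("conveyor_1/speed_m_s", 0.85)
print(twin.state["robot_arm_3/temperature_c"])  # → 41.7
```

The real systems layer rendering, physics, and collaboration on top of this, but the basic loop of "sensor reading in, virtual state updated" is the foundation.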
So what NVIDIA has done is they’ve taken this concept and they’ve blown it up and said, “What we really want to do is we want to marry that with the capabilities of the USD technology and we want to create an Omniverse that allows a factory plant,” in this case, some of the examples they’ve talked about recently, the Bentley engineering work, they’ve talked about BMW, which is actually taking all of their manufacturing facilities and monitoring everything in real time. And that allows them to build a true working 3D version of their entire operations. And then the cool thing here is the open USD tools that they’re using for this: essentially because they’re open, they allow other developers to plug in other assets, other 3D modeling tools, simulation tools, predictive analytics tools. All sorts of really cool things that we’ve been seeing in different industries now all coming together with the ability for somebody to collaborate in real time.
And I’ll give you a real quick example before Ron, I want to get your take on the tech underneath this, because it’s really cool. But imagine you are modeling an assembly line, and in real time, you’re watching in this 3D world and you want to make an adjustment. What if we move this portion of the assembly line over here, and what if we changed the assets that we’re using from this shelf to that shelf? What would that look like? Well, you can now simulate that in this digital twin in real time, and you can gather information from the plant floor, and you can even have somebody wearing a suit with sensors on it and have them walk through the manufacturing assembly process. How they’re picking things off the shelf, where they’re using them, and you can see does this really work?
And then you can take that and you can model that out and go, “Well, based on this, what can we do to reconfigure our plant operations to be more efficient, to be more productive? What can we learn from this model that we’ve created, this digital twin of our real world operations to identify problem areas? Where a particular part shortage may occur in that assembly line process. Or where certain robotic devices may require maintenance down the road.” So it’s just an incredibly rich, incredibly beautiful system that NVIDIA has come up with here. And again, like I said earlier, for me, it’s just really cool that NVIDIA thinks about furthering the industry as much as they do their own technology.
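The predictive-maintenance scenario Fred sketches, spotting when a robotic device will need service before it fails, can be boiled down to watching for drift in sensor data. A minimal illustrative sketch, assuming hypothetical vibration readings and a made-up threshold rule (real deployments would use far richer models):

```python
from statistics import mean

def needs_maintenance(vibration_history, window=5, threshold=1.5):
    """Flag a machine for maintenance when its recent average vibration
    drifts well above its earlier baseline. A stand-in for the kind of
    predictive-analytics model a digital twin could feed."""
    if len(vibration_history) < 2 * window:
        return False  # not enough data to judge
    baseline = mean(vibration_history[:-window])
    recent = mean(vibration_history[-window:])
    return recent > threshold * baseline

steady = [1.0] * 10
degrading = [1.0] * 10 + [1.9, 2.0, 2.1, 2.2, 2.3]
print(needs_maintenance(steady))     # → False
print(needs_maintenance(degrading))  # → True
```

The point of running this inside a digital twin is that the same model can also be run against simulated "what if" reconfigurations before anything changes on the physical floor.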
Now this is a service. You do subscribe to it. It’s not something that I expect NVIDIA will make a lot of money off. I think it’s about $1,800 a seat for an individual user, and there’s the $25,000 a year fee on top of that for you to be able to build your virtual digital twin of your business. It’s really cool. That’s all I can say about it. It’s really cool. So Ron, tell me about the technology that’s underpinning all of this. I mean, for this to work, it actually does need a lot of NVIDIA’s chips. What’s going on there?
Ron Westfall: Yeah, right on Fred, and it really was remarkable. NVIDIA’s GTC 2021 exhibition conference event, with naturally CEO Jensen Huang leading the way. And I think you really hit on some of the key points about the Omniverse technology. It’s really fundamentally a platform that combines, for example, Pixar’s Universal Scene Description technology with NVIDIA’s RTX technology, or more precisely, real-time ray tracing technology. And what I think is important to know about the Omniverse platform is that it consists of five parts: the Nucleus component, the Connect component, Kit, Simulation, and finally the RTX Renderer. And these components, along with the connected third-party digital content creation tools you referred to, Fred, in addition to other Omniverse microservices capabilities, constitute what NVIDIA is promoting as the Omniverse ecosystem. And that’s enabling these fascinating, inspiring digital twin demonstrations that we saw during the event.
And out of those five components I’d like to emphasize two of them. The first one actually is Nucleus, which is the server that really manages the database sharing amongst all the clients and collaborators. And it operates under a publish/subscribe model, and as a result is also subject to secure access controls. And what that means is that Omniverse clients can publish modifications to the digital assets and virtual worlds to the Nucleus database, or subscribe to those changes. And what’s really interesting is that these changes are transmitted in real time between the connected applications, and examples that we saw included geometry, lights, textures, materials, and other data that describe the virtual worlds and their evolution. So in essence it’s like Minecraft for real-world business and industrial applications. Pretty amazing.
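The publish/subscribe pattern Ron describes can be sketched in a few lines. To be clear, this is a generic conceptual illustration of pub/sub, not the actual Nucleus API, and the topic names and change payloads are invented for the example:

```python
from collections import defaultdict
from typing import Callable

class SceneHub:
    """A toy publish/subscribe hub, loosely analogous to how a server like
    Omniverse Nucleus fans scene changes out to connected clients."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, change: dict) -> None:
        # Every client subscribed to this asset sees the change
        # as soon as it is published.
        for callback in self._subscribers[topic]:
            callback(change)

hub = SceneHub()
received = []
hub.subscribe("factory/line_2/geometry", received.append)
hub.publish("factory/line_2/geometry", {"op": "move", "dx_m": 1.5})
print(received)  # → [{'op': 'move', 'dx_m': 1.5}]
```

The value of the pattern is exactly what Fred picks up on next: any number of collaborators, inside or outside the organization, can subscribe to the same stream of changes and stay in sync in real time.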
Fred McClimans: You mentioned something there, Ron, real quick. I just want to jump on this for a second. The ability for collaborators to subscribe to certain data sets in this model. And I want to make sure that we don’t overlook that there, because if you think about the modeling and simulation capability here, it’s one thing to say, “We in our own organization are going to create a digital twin of this facility and have access to it.” But in this model here, being collaborative, what you can do is you can bring in your entire extended ecosystem, your parts suppliers, your distributors, all the people that are around you in your organization. And when you make changes to the system, they can subscribe to those and receive those changes in real time, so they know exactly what you’re doing in that facility. And that’s a huge leap.
Ron Westfall: Yes. Now it definitely has impact on the market. And that really does crystallize the point about the Omniverse Connectors. These are the plugins to in-demand design applications, and it really does bring in that collaboration. It allows the ecosystem to, in essence, optimize their ability to use the NVIDIA capabilities as well as the Pixar capabilities. So this is something that will have impact really on the competitive terrain. Specifically, I see AMD in particular having to respond to NVIDIA’s announcements. They recently came out with their own announcement, the Radeon RX 6000 line, their attempt to target that real-time ray tracing segment of the GPU market. And that, I think, is something that will oblige AMD to have to update their portfolio development efforts in this area, as well as really specifically market to this.
And what I think is important about NVIDIA’s RTX technology is that it leverages the Ampere, Volta, and Turing-based GPUs within their portfolio family. And that allows them to use the Tensor cores as well as the real-time ray tracing cores on the Turing GPUs to really drive the architecture here, to really make these breakthroughs. And it’s really innovative. It’s the fact that NVIDIA is leveraging, again, the Ampere architecture, that’s rather distinct from my perspective. And plus these heavy-hitting partners that include Microsoft. And Microsoft is already looking to integrate RTX support into their DirectX ray tracing API. So this is really coming together. NVIDIA is really solidifying and maintaining their market leadership within this specific segment of the GPU market. So watch out AMD or any other competitors. This is something that is going to have impact through the rest of this year.
Fred McClimans: It’s going to be a lot of fun to watch, and we’ve been watching it for a while. The whole Omniverse went into beta, I think it was October of last year, when that started to come out there. I think the really cool part here, though, is if you take the different pieces that NVIDIA has, just the graphics capabilities and the 3D capabilities feed directly into this model. The performance that allows them to do some really intense machine learning and predictive analytics on all this data, it just fits in perfectly.
In fact, I’ve got to kind of ask the question, at what point do we stop modeling just the manufacturing floor and step back and go, “We can model, with enough data inputs, the entire business operation. All the employees, all the customers, all the suppliers, and literally look at a tool like this to say, ‘How can we optimize and use predictive analytics machine learning, even deep learning to go in here and figure out what’s the optimal business model, the optimal IT model, the optimal operations model that really makes this work.’ And because of the collaboration capability, let’s open it up and let’s bring all our ecosystem partners in from a business perspective.” I think it’s very cool. Very cool what they’ve done here.
Ron Westfall: Yeah, I concur. In fact, this is also an opportunity for NVIDIA to bring in their AI assets to help drive that scenario, that vision. We’re not just talking about production on the assembly or manufacturing setting. We’re also talking about the end-to-end business operations, in essence really taking automation, as one example, to the next level, and really being able to scale and apply this in a more impactful business case type of way. So yeah, the possibilities aren’t far off because of the portfolio assets you pointed to, Fred, and I think this is something that is definitely going to be a game changer and something we’ll definitely be paying close attention to.
Fred McClimans: It’ll be interesting to see. I’ve always been a big fan of models. If we’re looking at things from the equity side, let’s pull out five different models from the crazy up, crazy down scenario, and let’s really see where this all goes. I’ve got to expect at some point that NVIDIA just turns this in on themselves and says, “What can we do to leverage all of this capability ourselves to improve what we’re doing moving forward?” So Ron, I really appreciate you taking the time here on this today. Again, this is just a quick wrap on the NVIDIA Omniverse announcement. Something that I think we both agree is something that will definitely have some significant impact in the marketplace.
And I’m really interested now to see where some people really start to think outside of the box in terms of leveraging this with NVIDIA’s technology, and then all of the animation, 3D modeling, the storytelling and everything else that’s out there. So definitely a big move worth following here. So for my colleagues at Futurum Research and my colleague Ron, thank you Ron, I’m Fred McClimans, signing off of this edition of the Futurum Tech Webcast.
Author Information
Fred is an experienced analyst and advisor, bringing over 30 years of knowledge and expertise in the digital and technology markets.
Prior to joining The Futurum Group, Fred worked with Samadhi Partners, launching the Digital Trust practice at HfS Research, Current Analysis, Decisys, and the Aurelian Group. He has also worked at Gartner, E&Y, Newbridge Networks’ Advanced Technology Group (now Alcatel), and DTECH LABS (now part of Cubic Corporation).
Fred studied engineering and music at Syracuse University. A frequent author and speaker, Fred has served as a guest lecturer at the George Mason University School of Business (Porter: Information Systems and Operations Management), keynoted the Colombian Asociación Nacional de Empresarios Sourcing Summit, served as an executive committee member of the Intellifest International Conference on Reasoning (AI) Technologies, and has spoken at #SxSW on trust in the digital economy.
His analysis and commentary have appeared through venues such as Cheddar TV, Adotas, CNN, Social Media Today, Seeking Alpha, Talk Markets, and Network World (IDG).