The NVIDIA GTC 2021 Event

The Six Five team discusses the recent NVIDIA GTC 2021 event.


Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.


Daniel Newman: Pat, this week’s big event was all about NVIDIA. GTC, even Twitter gets a little hashtag with a custom leather jacket. When we do a Six Five Summit, I want a little one of our little heads or something to be next to our names. Then we know we’re doing it right. But listen, there were 20-plus announcements. It was huge. The way I think we could break this down, Pat, is maybe I talk about one thing, you talk about one thing that kind of stood out to you, and then we have another topic later that’ll indirectly talk a little bit more about what’s going on. My focus kind of bounces off the last topic of Meta and Facebook and the Metaverse, and in case you weren’t aware, for some time now NVIDIA has had a technology called Omniverse. Omniverse has been all about creating more integration between our digital and physical universes through things like gaming, collaboration, and simulation, and at this particular GTC, the company further cemented its role and its commitment to playing in the verse.

And the reason I’m going to take the word out in front of it is because clearly Jensen and NVIDIA dispute that Meta or Metaverse is going to be the term of the future. The company believes it will be Omniverse. That feels a little bit, Pat, like DPU versus IPU versus XPU, which I’ll just say P-U. Listen, we’re going to get there. But what NVIDIA is doing, and what I really came out and said, is, first of all, they had two really big announcements. One was around synthetic data with its Omniverse Replicator synthetic data generation engine for training AIs. Ooh, take a breath there.

But what’s going on there is that effectively, when you’re in this Omniverse or Metaverse, when you’re in simulation or a digital twin, you’re creating all this data, and you need to be able to build models that can take that data, which is called synthetic data because it’s being created in a digital twin, a parallel universe as I like to call it, and concurrently mesh it with the data of our physical universe to use for things like simulating test drives of vehicles. When you’re going into the future of fully autonomous vehicles, you can create millions of simulations that generate useful data that would be practical in the real world, but you need to have a way to utilize and optimize that data.

Patrick Moorhead: Daniel, did you say digital twin?

Daniel Newman: Digital twin.

Patrick Moorhead: Oh okay. I got you.

Daniel Newman: Ah, there’s a digital twin. That’s the cover of my Forbes piece about it, in fact. And so the best two examples of how that’s going to end up working are really around autonomous vehicles and robots. I think those are the two early utilizations of it, but really this is all about the evolution of AI. It’s having physical and virtual environments and all that data used concurrently to create the most valuable models and frameworks to develop future environments, which could and possibly will include something like the Metaverse from Facebook or Meta, or Microsoft’s Teams workplace spaces, whatever.

Patrick Moorhead: I know what you’re saying.

Daniel Newman: The second is “your avatar is waiting,” which is what I called that little section: NVIDIA Omniverse Avatar. The company’s basically built a platform for generating interactive AI avatars. And this is what we saw Jensen show off. But in order for us to have the Omniverse or Metaverse or whatever the heck we end up wanting to call it, we’re going to need to be able to create intelligent versions of ourselves and others. Jensen showed himself, but the company, like it did with inference and Jarvis and Merlin and these frameworks, did something similar here with real-world use cases. This is what NVIDIA’s so good at, by the way: creating real-world use cases where they’re less about being a hardware company and more about being a solutions or software company.

And they came up with Project Tokkio and Project Maxine. Tokkio was all about showing examples of how customer support could be handled, how an avatar could basically be generated and created in this Meta/Omni world to enable interactive customer service. They showed an example in a restaurant, and they showed an example of a conversation with a professor or someone about climate change. You could actually have an interactive conversation with it.

The other one was Maxine, Pat. And this is a really interesting and actually practical version in this hybrid future of work: they put two people working in noisy environments across the world, speaking different languages, and showed how, using the avatar technology and Omniverse, they could instantly streamline the language and take all the background noise out so it’s like you’re having a quiet, private conversation with seamless language translation in this Omniverse. To me, that’s a really useful, practical example of bringing the world closer together, which is why I’ve sort of said I think NVIDIA may end up being more important to the future of whatever these digital and physical world creations are than Facebook Meta. Not necessarily will they get the credit for it, but that’s why, Pat, we are here.

There’s one other thing, and I know you and I both covered this, so I’m going to kick this one to you: our friends at Cloudera and NVIDIA announced a really interesting partnership as well.

Patrick Moorhead: Yeah, you want to talk about that?

Daniel Newman: I did. I could punt it to you or you want me to do it?

Patrick Moorhead: Oh, well you wrote the article on Cloudera, why don’t you do it?

Daniel Newman: Absolutely. Cloudera Machine Learning and NVIDIA basically offered what’s called the RAPIDS Edition Machine Learning Runtime, another mouthful, but it’s really all about RAPIDS Docker images. It’s about helping data scientists get up and running on GPUs with a single click of a button. And to me, this is one of those big moments: a company like Cloudera partnering with a company like NVIDIA to streamline and add capabilities for democratizing data science.

And by the way, Pat, this is a really important topic, because we have better and better compute technology with GPUs, but the ability for enterprises to utilize it comes down to software, to data management and data science. And so what I’ve really been impressed with is the Cloudera data platform, the company, and the way it’s been partnering successfully across the industry, removing barriers and enabling data scientists to adopt GPUs for workloads that are more than just deep learning. Overall, Pat, it was an announcement within an announcement within a number of announcements, but you and I, we track Cloudera very closely. And as far as I see it, it’s a great alignment for the company.
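[Editor’s note: part of why the RAPIDS runtime can make GPU adoption a “single click” is that RAPIDS cuDF deliberately mirrors the pandas API, so an existing data-science workflow ports to the GPU largely by swapping the import. A minimal sketch of that kind of workflow, using pandas as a stand-in so it runs anywhere; the toy table and column names are illustrative, not from the announcement.]

```python
# Typical dataframe workflow a data scientist might run in Cloudera
# Machine Learning. With RAPIDS installed, "import cudf as pd" would
# run the same code on the GPU; pandas is used here as a stand-in.
import pandas as pd

# Toy telemetry table (hypothetical example data).
df = pd.DataFrame({
    "sensor": ["a", "a", "b", "b"],
    "reading": [1.0, 3.0, 2.0, 6.0],
})

# Group-by aggregation -- the kind of operation RAPIDS accelerates.
means = df.groupby("sensor")["reading"].mean()
print(means.to_dict())  # {'a': 2.0, 'b': 4.0}
```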

Patrick Moorhead: Yeah, totally. Totally. One thing I’m going to hit on is NVIDIA’s auto announcement. A little bit of background: NVIDIA has about a $600 million annual business in automotive. It hasn’t moved up extraordinarily over the past couple of years, but I believe once we start seeing more L2 Plus, L3, and even L4, it’ll explode. I’d love to get a backlog number from them like we get from Qualcomm, but I have not been successful in that. But they have a tremendous amount of logos, and you can catch those logos in my Forbes article. But one thing I want to hit on is what they actually did announce at the show.

First off, they brought out a new stack called Drive Concierge, and they also brought out Drive Chauffeur. And you can imagine what those are. Should be pretty easy to figure out. With Concierge, NVIDIA creates an Omniverse avatar for you to communicate and interact with while driving. You have an emoji, a 3D emoji-like figure. It sees, it speaks, you have conversations back and forth. Unfortunately it doesn’t look like it’s the Jensen emoji. It’s more of a robotic look, but it can essentially help you do all these on-demand things like getting a reservation, easy stuff like show me where I can go, turn off the radio, or do something like that. And then the next thing they brought in was Chauffeur, and Chauffeur is exactly what you think it might be. It’s an AI-assisted driving platform, and it’s built for highway and urban traffic environments. Everything you would expect from NVIDIA. Does NVIDIA have too many names, too many brand names? Probably, but Chauffeur and Concierge are pretty easy for people to understand.

Now, this software sits right on top of a platform called Hyperion 8, and Hyperion 8 is a platform for L3 and up to L4. You can have one or two Orins, which is the name of their SoC. You can have DriveWorks, which comes along with that, DRIVE AV software, and a bunch of tools that allow OEMs to optimize for it. Hyperion 8 went GA, and part of Hyperion 8 was a full set of sensors. And what we learned at the show was that the LIDAR sensor of choice, the only LIDAR that showed up on the slide, was from a company called Luminar, and their stock price just absolutely took off. I think it went up 20% based on that announcement. I thought it was a good way for Luminar to show off their wares, in that they’re part of, I guess, arguably one of the most sophisticated driving platforms out there.

The other thing is I think it put the “you don’t need LIDAR” debate to bed. There are a lot of theories out there, and even Elon Musk thinks he doesn’t need LIDAR, but he’s three years late in getting out his fully autonomous software stack.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
