On this episode of the Futurum Tech Webcast, host David Nicholson welcomes Steen Graham, Founder at Scalers AI, and Jeremy Johnson, Technical Marketing Engineer at Dell Technologies, for a conversation on the democratization of AI through an industrial AI Metaverse solution.
Their discussion covers:
- Insights into the creation and deployment of industrial AI Metaverse solutions.
- The collaborative efforts between Scalers AI and Dell Technologies in advancing AI technologies.
- The impact of AI Metaverse solutions on industry and society.
- Future trends and developments in AI democratization.
Learn more at Dell Technologies and Scalers AI. Download our related report, Dell Enables an Industrial Digital Twin Proof of Concept with Artificial Intelligence Technology, here.
Watch the video below, and be sure to subscribe to our YouTube channel, so you never miss an episode.
Disclaimer: The Futurum Tech Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.
Transcript:
Dave Nicholson: Welcome to the Dell Experience Lounge here in Round Rock, Texas. I’m Dave Nicholson, Chief Research Officer at The Futurum Group. I am joined today by two distinguished guests. The first is Steen Graham, CEO of Scalers AI. Welcome, Steen. Also, Jeremy Johnson. Welcome, Jeremy. Jeremy is an engineer in Dell’s tech marketing. We are going to be talking about a reference implementation that Dell put together with the help of Scalers AI. Specifically, looking at deploying AI in the manufacturing space. We’re going to talk about some pretty cool stuff here. Jeremy, what is an industrial digital twin?
Jeremy Johnson: Well, in a lot of industrial environments, you’ll see full-scale mockups being built to do prototyping and to test failure scenarios. A digital twin is a way of doing this without having to incur all the expense and effort of actually building something out in the physical world.
Dave Nicholson: So this is virtual space?
Jeremy Johnson: Yes, exactly. This is already happening. That’s one of the things to highlight about this proof of concept: there’s widespread adoption of digital twins in the here and now. We’re training on synthetic data within this space to deploy AMRs, or autonomous mobile robots. In industrial environments, there’s obviously a lot of potential for calamitous events, hazards, dangers, and failures of all varieties. We’re training on synthetic data for the AMR to detect chemical spills, which are obviously potential hazards, as well as compressor failures, which can cause a huge amount of fallout and incredible expense for, in this case, the oil and gas industry. Take synthetic data, train in a virtual digital twin, and then deploy that to the real world.
Steen Graham: The scale of what we’re doing here is what’s quite unique, because we’re doing full facility-level scale. In phase one, you’re really creating a replica of the physical environment. That’s your metaverse, or your Omniverse-based implementation. In that phase, you’re trying to recreate the actual facility all the way down to the physics-based models of what’s deployed in that facility. Then you’re trying to create all the scenarios that would naturally occur within that facility to train the robot. So you’re creating all this synthetic data and these synthetic events across a range of challenges like chemical spills, because, as we all know, the number one goal in any manufacturing environment is to keep employees safe.
You don’t deploy robots in the physical world without advance training on the environments they’re going to go into. We’re also using computer vision technologies to create these hazardous events artificially, because we don’t want to spill chemicals in the real world. The other piece of it is that when you’re in production, you now have a production environment that drives massive levels of productivity, where employees who might otherwise have to be on site can now be at home, managing that robot remotely and understanding all the elements of the digital twin. What’s uniquely different here is that full-scale simulation environment, the training of robotics, running AI on synthetic data, and then production-grade deployment of digital twins. That gives you all the real-time insights on what’s happening in your facility.
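To make the synthetic-data workflow described above concrete, here is a minimal sketch of fine-tuning an off-the-shelf image classifier on simulator-rendered frames using PyTorch and torchvision. This is illustrative only, not code from the Dell and Scalers AI reference implementation; the dataset folder layout, class names, and output filename are hypothetical assumptions.

```python
# Illustrative sketch: fine-tune an off-the-shelf classifier on synthetic
# "spill" vs. "normal" frames rendered from a digital twin.
# The dataset path and class layout are hypothetical, not from the actual project.
import torch
from torch import nn, optim
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Synthetic frames exported from the simulator, one folder per class,
# e.g. synthetic_frames/spill/*.png and synthetic_frames/normal/*.png (hypothetical layout).
train_set = datasets.ImageFolder("synthetic_frames", transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(5):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")

# Weights like these could then be pushed to an edge device near the AMR.
torch.save(model.state_dict(), "spill_detector.pt")
```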
Jeremy Johnson: You can do all of this before deployment. I think that’s the big takeaway.
Steen Graham: The more you can move upstream into this metaverse environment to simulate and validate, the better, especially when you’re talking about some of the harshest conditions and most ruggedized environments on earth.
Dave Nicholson: This is a reference implementation that is available and usable and adaptable, something that people can actually take advantage of?
Steen Graham: The reference code will be published on the Dell GitHub repo. People can think about how to recreate their environment with this reference. I mean, the physical world is a very unique space. To the extent that you want to recreate physical-world spaces, you need to look at this as a template to guide you through this journey. A lot of people are intimidated by it, so we go step by step: How do you get a digital asset? How do you start with the right digital assets? How do you simulate the environment using something like Isaac Sim? How do you deploy AI and synthetic data, and run AI on synthetic data, to create accurate models?
One of the big modalities is video. Of course, what we’re doing uniquely here is running a lot of distributed video workloads at the edge, because there’s camera intake from those AMRs. That’s where Ethernet comes into play as well. We’re guiding you step by step through that process. But yeah, you’re not going to be able to deploy an AMR in your manufacturing facility from day one. You will, however, dramatically reduce the time to do that and at least overcome some of the early challenges, given how far out at the forefront of innovation we are with this technology stack.
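For the distributed video piece, a minimal sketch along these lines shows how a trained model could run against an AMR camera stream at the edge with OpenCV. The RTSP URL, model file, and class index are hypothetical placeholders and do not reflect the published reference code.

```python
# Illustrative sketch: run a trained detector on a live camera feed from an AMR
# at the edge. The stream URL, weights file, and class mapping are hypothetical.
import cv2
import torch
from torch import nn
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet18()
model.fc = nn.Linear(model.fc.in_features, 2)           # spill / normal, matching training
model.load_state_dict(torch.load("spill_detector.pt", map_location=device))
model.eval().to(device)

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

cap = cv2.VideoCapture("rtsp://amr-01.local/stream")    # hypothetical AMR camera endpoint
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    batch = preprocess(rgb).unsqueeze(0).to(device)
    with torch.no_grad():
        label = model(batch).argmax(dim=1).item()
    if label == 0:                                       # assume index 0 maps to "spill"
        print("possible spill detected; raising an alert to the digital twin")
cap.release()
```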
Dave Nicholson: Jeremy, coming into this, we all come in with expectations. Anything surprise you?
Jeremy Johnson: I think adding the AMR and showcasing all of the AI technology makes it a lot more compelling and fascinating. Especially Scalers, being a full-blown ISV, they were able to bring together all the middleware components, everything. I would encourage people to go look at the GitHub and mess around with this themselves. It’s all there to see and tinker with. It’s really awesome and extremely useful technology.
Dave Nicholson: You think it’s going to become more accessible over time and more people will embrace this?
Steen Graham: Yeah, absolutely. If you want to do something today, as Jeremy alluded to, one of the hidden gems of this work is the validation of the OPC UA northbound protocol, which is a great way to get all your industrial manufacturing field data into a PowerEdge-class device and start to create that digital twin. So you’ve got real-world insights into your facility or set of facilities, and that can unlock a whole ton of innovation. What’s really different about this is that observing what’s going on in the physical world via a digital twin is great, but actually transforming the safety of the environment and improving uptime is what’s really critical. If we can make our workers safer and drive top-line revenue, that’s really what’s unique. That’s where the robots come into play, and we can deploy them to drive that innovation.
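For readers who want a concrete picture of what OPC UA ingestion can look like, here is a minimal sketch using the open-source python-opcua client to poll field data that could feed a digital twin on a server. The endpoint URL, node IDs, and tag names are hypothetical assumptions, not details from the actual deployment.

```python
# Illustrative sketch: poll field telemetry over OPC UA for a digital twin,
# using the open-source python-opcua client. Endpoint and node IDs are hypothetical.
import time
from opcua import Client

client = Client("opc.tcp://plant-gateway.local:4840")   # hypothetical OPC UA endpoint
client.connect()
try:
    # Node IDs for a compressor's sensors (hypothetical namespace and identifiers).
    pressure_node = client.get_node("ns=2;s=Compressor1.Pressure")
    vibration_node = client.get_node("ns=2;s=Compressor1.Vibration")
    for _ in range(10):
        pressure = pressure_node.get_value()
        vibration = vibration_node.get_value()
        print(f"pressure={pressure} vibration={vibration}")
        # In practice these readings would be forwarded to the digital twin
        # or a time-series store for real-time monitoring and alerting.
        time.sleep(1)
finally:
    client.disconnect()
```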
Dave Nicholson: Great. Thank you gentlemen so much. From the beautiful Dell Experience Lounge here in Round Rock, Texas. Sadly, I just got word that my digital twin took out a home equity line of credit against my home and lost all the money in Vegas. Dave Nicholson here from Futurum Group, hope to see you again soon.
Author Information
David Nicholson is Chief Research Officer at The Futurum Group, a host and contributor for Six Five Media, and an Instructor and Success Coach at Wharton’s CTO and Digital Transformation academies, out of the University of Pennsylvania’s Wharton School of Business’s Aresty Institute for Executive Education.
David interprets the world of Information Technology from the perspective of a Chief Technology Officer mindset, answering the question, “How is the latest technology best leveraged in service of an organization’s mission?” This is the subject of much of his advisory work with clients, as well as his academic focus.
Prior to joining The Futurum Group, David held technical leadership positions at EMC, Oracle, and Dell. He is also the founder of DNA Consulting, providing actionable insights to a wide variety of clients seeking to better understand the intersection of technology and business.