AI in the Data Center — Futurum Tech Webcast

For this special episode of the Futurum Tech Webcast, Principal Analyst and host Daniel Newman welcomes Jeremy Rader, GM, Enterprise Strategy & Solutions, Data Platforms Group at Intel, to discuss the edge-to-data-center journey, deep learning, and bringing AI into the data center. Most businesses have a lot of growth potential in this area, and they’re on the cusp of realizing it.

Their conversation includes a look at:

  • Model training, which is only about 10 percent of the edge-to-data-center journey and is only one approach to machine learning.
  • Intel’s unmatched, optimized edge-to-data-center software options for AI and analytics, from ingest and cleaning to modeling and deployment.
  • How to bring AI into the data center to support priority workloads without deploying GPU-based solutions.
  • Use cases that highlight how some enterprises have been able to use AI to optimize their data centers at scale, as well as ways companies have improved their data pipelines to save time and money.

It was a fascinating discussion and we’re glad to have you as part of it. Check it out below:

Disclaimer: The Futurum Tech Podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.

Other insights from the Futurum Research team:

Microsoft Ignite 2020 Means Big Updates For Power Platform And D365 

MATRIXX Software Provides Cloud Native Keys To Innovative 5G Charging

How Tech Is Leading A Marketing Comeback Following The Pandemic Driven Slowdown

Image Credit: Tech Decisions

 

Transcript:

Daniel Newman: Welcome to Futurum Tech TV, I’m Daniel Newman, your host, Principal Analyst and Founding Partner at Futurum Research. Excited to have this conversation today with Jeremy Rader of Intel, where we’re going to be talking about AI in the Data Center. This is part of a three-part series I’ve done. If you check out our YouTube channel, you may see that there’ll be one up on AI in the Edge. There’s another great conversation on AI in the Cloud. Jeremy is my third conversation. I’d definitely love for everyone to check them all out. But Jeremy, welcome to Futurum Tech TV.

Jeremy Rader: Thank you. Yeah, awesome. Thanks for having me. Excited to be here.

Daniel Newman: Yeah, it’s great to have you here. Appreciate you taking the time. Appreciate Intel for partnering up with us on these videos. Big topic to me. AI, gosh, with everything that’s going on with COVID, has gotten a lot of headlines. It was already getting a lot of headlines, but as we all went home and quarantined and then got our favorite recommendations and our Netflix streams and our Amazon shopping carts, we saw just the way the algorithms on our social media were changing and evolving. I think the attention on AI proliferated and accelerated even further, but there’s a lot going on there. Quickly though, Jeremy, for everyone out there listening to this short little Futurum Tech TV segment, introduce yourself and just tell everyone a little bit about what you do at Intel.

Jeremy Rader: Yeah, great. I’m in the organization that focuses on our Data Center customers. The role that my team plays is really around enterprise customers, their digital transformation, the solutions they need, and the big challenges they’re facing. We look at the big workloads like analytics, AI, what’s happening in hybrid and multi-cloud, cybersecurity. In the conversations we have with our ecosystem partners or direct customers, you can’t have one anymore without talking about machine learning, deep learning, AI as a whole, or even analytics if you really look at that overall umbrella. It’s front and center for any enterprise going through any kind of transformation or trying to stay competitive in its market. It’s the top conversation.

Daniel Newman: Intel, its reputation precedes it as it pertains to machine learning and the Data Center. Xeon has been one of the most popular products from a CPU standpoint and has long been undisputed in its role in inference with Deep Learning Boost. One of the most interesting things to me is the fact that Intel is in so many companies’ data centers already, but when it comes to AI, how many companies are taking advantage of this? Are companies already taking advantage of this? Because to me, it seems like this potential has been there, but is AI really being utilized? Are companies embracing it, and how’s that going?

Jeremy Rader: Yeah, I think it depends on how you break down AI. When you look at the different aspects of it and you talk about machine learning, I think you’d be hard pressed to find a traditional large enterprise that’s not doing some level of machine learning. Sometimes they did it knowingly and sometimes it’s just through the partners, the software vendors, the system integrators that they’re leveraging, but ultimately there’s some aspect of doing that clustering, that evaluation and statistics on their data. I think where it gets a little bit less prevalent is when you get into deep learning. That’s still a newer technology, a newer capability, and there are those that are out in front.

Then I think there’s still a whole segment of the population out there that’s trying to figure out, do I need to be doing something in this space? It reminds me of the big data era a decade ago: should I be doing something here, or am I missing out? I think there are a lot of those conversations of, I have a mandate or I have a desire to get more involved with deep learning, but I don’t exactly know how to get there yet. It’s still at that early stage. Machine learning, we see that as very prevalent, very much a part of the Data Center, whether you know it or not.

Daniel Newman: Yeah, the reason I was sort of leading you with that question is, there are a lot of enterprises, and sure, there are very specific cases, some of the things I mentioned earlier in this show, like doing a lot of e-commerce recommendation, where accelerators or big GPUs make a lot of sense. But there are a lot of companies that are running SAP, or running ERPs and CRM applications, that have lots of data, customer data platforms, CX tools, and are sitting on all this and trying to figure out how to get more out of the data. Those that have invested and already have a Data Center full of Xeon Scalable are basically sitting on a toolbox. Like you said, maybe not requiring that next big investment or iteration, isn’t that a big opportunity for a lot of customers, a lot of your users right now, to actually just start utilizing all those resources for AI and machine learning?

Jeremy Rader: Yeah, that really is the playbook, and a lot of the conversation we have with enterprises is, I think in some cases, they’re trying to figure out how they get started with this dedicated infrastructure for AI. You can do that if you’re already at the stage where you know what problems you’re trying to solve, the data’s organized, you’ve gone through that data journey and you’re ready for it. That may make sense if you think it’s going to be that dial of performance tuning and whatnot. But there’s a ton of enterprises out there that are sitting on this data, and frankly, the training that they’re going to do, the model training, is probably going to be a very small piece of the overall data journey that they have to go through.

We really try and emphasize: start with what you have. You have upside potential with your infrastructure. You have the data. You’re probably working with a software vendor that is trying to capture the AI market as well, so they’re coming out with more and more tool sets that you can take advantage of. You mentioned SAP, or Microsoft, or some of the smaller disruptors that are coming into the market and trying to shake up the AI business a little bit. They’re out there and ready to sit on top of that data, whether it’s a Spark or Hadoop data center application that you’re working with. It really is about not taking these massive leaps, but taking the first step: I’ve got the data, let’s go land some things on top of it and really try and get some of that value and ROI out of it before you make some of these really big, major migration decisions.
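
To make “start with what you have” concrete, here is a minimal sketch of that first step: training a simple model with stock Spark MLlib directly against data that already lives in an existing Spark/Hadoop platform, on the CPUs already in the data center. The paths, table, and column names are hypothetical, and this illustrates the general pattern rather than any specific Intel tooling.

```python
# A minimal sketch of "start with what you have": train a model directly on
# data that already sits in an existing Spark/Hadoop platform.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("ai-on-existing-data").getOrCreate()

# Data already sitting in the data lake -- no migration required (hypothetical path).
df = spark.read.parquet("hdfs:///data/customer_events")

# Assemble existing columns into a feature vector and fit a simple model.
# Feature and label column names are hypothetical; the label is a 0/1 flag.
assembler = VectorAssembler(
    inputCols=["recency_days", "frequency", "monetary_value"],
    outputCol="features",
)
lr = LogisticRegression(featuresCol="features", labelCol="churned")
model = Pipeline(stages=[assembler, lr]).fit(df)

# Persist the fitted pipeline next to the data for later batch scoring.
model.write().overwrite().save("hdfs:///models/churn_lr")
```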

Daniel Newman: Yeah, you alluded to something that is very interesting to me, and it’s the example of big data. It seems like we’ve evolved a couple of years later to now, where I remember everyone was asking, “Where do we start with big data?” I think a lot of companies over the last two or three years have started, like you said. They’ve built something on Hadoop or they’ve built something on Spark, they’re using more big data tools. We talked about SAP, but maybe it’s Splunk, maybe it’s BlueData, maybe it’s something in the cloud, maybe it’s something on-prem.

There are a lot of different ways that they’re starting to collect, manage, enrich, visualize, and do more in real time. Of course the edge and IoT have only amplified this, but what you really are saying is that the foundation is there. The same approach applies: a few years ago it was about picking a few workloads and starting to create your big data strategy. Now you have that big data strategy in place, and it’s about accelerating those workloads using AI and ML to get more out of your data and be able to use more data more efficiently.

Jeremy Rader: Yeah, that big data platform is your inference platform of the future. That’s where your data sits, that you’re going to take that trained model and apply it to, so it’s all sitting there. Now, is it organized and is it ready? Have you gone through the steps to building that foundation? We find a lot of people haven’t. They’ve maybe built data lakes. First and foremost, it was low-cost data storage, and now they’ve realized this is their gold mine, but is it organized in a way where you can really apply AI or deep learning to it yet? Maybe, maybe not. A big step that we try and take with enterprises is: look at that whole journey. Make sure that you have a data strategy in place, because without the data you’re not doing AI, you’re not doing machine learning, you’re not doing deep learning at all. Take the time and spend it on your data strategy, because ultimately that is what’s going to enable you to evolve into this space. The best tools out there can’t overcome no data.
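
Jeremy’s point that the big data platform becomes the inference platform of the future can be sketched the same way: load a previously trained model and apply it to the data where it already sits, so scoring runs on the cluster’s existing CPU capacity. This continues the hypothetical example above; the model path and column names are assumptions.

```python
# A rough sketch of "your big data platform is your inference platform":
# apply a saved model to data in place instead of moving the data to
# dedicated AI infrastructure. Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml import PipelineModel

spark = SparkSession.builder.appName("batch-inference").getOrCreate()

model = PipelineModel.load("hdfs:///models/churn_lr")          # trained earlier
new_data = spark.read.parquet("hdfs:///data/customer_events/latest")

# transform() runs inference across the cluster's existing CPU capacity.
scored = model.transform(new_data)
scored.select("customer_id", "prediction", "probability") \
      .write.mode("overwrite").parquet("hdfs:///data/churn_scores")
```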

Daniel Newman: Yeah, and you make a great point about all that investment over the past two years. Now, there’s an example I’ve written a little bit about, and I think it’s always great to put those use cases out there; they’re always good for helping people visualize this. Talk a little bit about it. I believe it was MasterCard that had an interesting case of taking this approach, and obviously a large enterprise at scale. I’m not asking you to dive deep, but just give a quick, high-level rundown: what did MasterCard do?

Jeremy Rader: Yeah, let’s start with, we’ve done a number of these like MasterCard where it ends up sometimes being these bake-offs of infrastructure. Then you try and take a step back: what is the real problem you’re trying to solve? Where is that data sitting today? They had a big investment in Spark and Hadoop. Then it became: are there tools that you can land directly on top of that infrastructure, directly on top of that data, that will allow you to immediately start doing some of the types of analytics that they were hoping to do? The types of machine learning and deep learning. They were able to do that with Analytics Zoo, which is an open source set of libraries and capabilities that they could land right on their infrastructure. It basically enabled them to look at their overall strategy and say, “I don’t need to overcomplicate this.”

As we talked about at the beginning: I’ve already made my investment in hardware, I’ve already made my investment in the platforms for managing my data. Now I want to make sure I get the quickest value out of it. How do I do that? That was a lot of what we tried to showcase with them: based on how much model training you’re doing, based on how much data you have, here’s how you can approach it to simplify your path toward a result. We do that a lot, whether it’s looking at the medical space, at patient readmission, accelerating diagnosis. These are all areas where massive data exists. Let’s just help enterprises take that first step of getting value out of that investment.
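
The MasterCard example centers on Analytics Zoo, which lets deep learning land directly on data already in Spark and Hadoop. The sketch below does not use the Analytics Zoo API itself; it only illustrates the same general pattern with a plain pandas UDF that applies a generic, pre-trained model across a Spark DataFrame. The model file, feature columns, and predict interface are all hypothetical.

```python
# Illustrative pattern only: apply a pre-trained model to rows where they
# already live in Spark, without standing up separate AI infrastructure.
import pandas as pd
from pyspark.sql import SparkSession
from pyspark.sql.functions import pandas_udf, struct

spark = SparkSession.builder.appName("dl-on-spark-data").getOrCreate()

# Transaction data already resident in the data lake (hypothetical path).
df = spark.read.parquet("hdfs:///data/transactions")

@pandas_udf("double")
def score(features: pd.DataFrame) -> pd.Series:
    # Hypothetical pre-trained model; in a real job you would broadcast or
    # cache it instead of reloading it for every batch of rows.
    import joblib
    model = joblib.load("/models/fraud_model.joblib")
    return pd.Series(model.predict_proba(features)[:, 1])

# Apply the model batch-by-batch, right where the data lives.
scored = df.withColumn("fraud_score", score(struct("amount", "merchant_id", "hour")))
scored.write.mode("overwrite").parquet("hdfs:///data/transactions_scored")
```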

Daniel Newman: Yeah, those are some great stories, and it really wraps things up nicely. Jeremy, I think in the end, there’s a really strong story being told to a lot of companies: you’ve made a lot of the foundational investments, both architecturally and strategically, to start really applying AI in your business, in your workloads, and in creating more value from all the work that’s been done over the past few years. There will be cases where big investments may be needed, but there are a lot of companies and a lot of cases where they’re sitting on all the tools and technology needed to start getting a return. Jeremy Rader, Intel, thank you so much for spending a few minutes with me here on Futurum Tech TV.

Jeremy Rader: My pleasure, thanks for having me.

Daniel Newman: Yeah, have a great one.

Jeremy Rader: Thanks.

 

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people, and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst, and his ideas are regularly cited or shared in television appearances and by outlets including CNBC, Bloomberg, the Wall Street Journal, and hundreds of other sites around the world.

A 7x best-selling author whose most recent book is “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
