Intel and Hugging Face Discuss Compute and Ethical Issues Associated with Generative AI

The News: Last week, Hugging Face, the company behind the leading open source libraries for machine learning, reported that Intel's AI hardware accelerators outperform comparable GPUs on the market. This is critical, as generative AI tools, which continue to grow in popularity, require far more compute than other AI use cases. Thought leaders from Intel and Hugging Face joined me for a discussion of the compute and ethical issues associated with the use and growth of generative AI. Read the full release and watch the video interview here.

Analyst Take: It was an honor to sit down with Kavitha Prasad, Intel's VP and GM of the Datacenter AI, Cloud Execution and Strategy Group; Lama Nachman, Intel Fellow and director of the Intelligent Systems Research Lab; and Jeff Boudier, Product Director at Hugging Face, to discuss some of the most pressing issues associated with an increasingly trending topic: generative AI.

For those unfamiliar, Intel and Hugging Face partnered last summer to help democratize machine learning hardware acceleration. Hugging Face is a tech company committed to making good machine learning more widely available and maximizing the positive impact it can have on society and business. Intel is helping it do so by powering that acceleration on the Hugging Face website with its Intel Xeon Scalable CPU platform. On the Hugging Face Hub, users can build, train, and deploy open source ML models, growing the space thanks to Intel's sponsorship. Both companies have valuable insights into the compute and ethical issues associated with AI.

As Kavitha pointed out, AI isn't new, and the issues associated with AI aren't new. What makes it feel new is that the move from predictive to generative AI has made it easier for just about anyone to understand AI's capabilities. That, in turn, means more and more users and developers are jumping into the space, all while trying to sort out the ethical implications of making generative AI so widely available. It's a huge issue, and one with no clear resolution.

Perhaps the only thing that is clear about generative AI: the amount of compute needed to run its complex models will make it a very expensive proposition for businesses, especially as those models grow more complex. For instance, it's estimated that it could cost OpenAI $40 million to process the millions of different prompts fed into its ChatGPT software in a single month. It's also estimated that the new Bing AI chatbot, powered by OpenAI's ChatGPT, will need a minimum of $4 billion in infrastructure to answer the queries users are making. As these technologies are initially rolled out, the true costs will be somewhat obscured by free subscriptions, trials, and the like. Ultimately, however, it will be the enterprise that bears that cost and determines whether generative AI is "worth it."

Another cost we need to consider as generative AI becomes more popular: environmental cost. As demand for generative AI increases, so will the number of clouds and data centers we need to power it. All of that requires energy. And at some point, we will need to determine if the value we get from generative AI is worth the impact it could—and will—have on the environment.

Still, even with all of the gray areas, the one thing that kept me excited about these advancements in generative AI was our guests’ shared commitment to advancing and sustaining AI in an ethical way. For its part, Hugging Face has more than 150,000 AI models available free online to ensure open and democratic accessibility to the technology. And Intel has focused on ensuring that transparency is part of every step in the “AI supply chain.” This means that as developers continue to build upon one another’s code, it will always be clear how every previous step was trained—which biases it might have—and how it may need to be re-trained or un-trained to move sustainably forward.

Unlocking the full potential of generative AI will take an entire ecosystem of partners. It’s enlightening and reassuring to see that integrity and value are top of mind for powerhouses like Intel and Hugging Face as we move forward.

Disclosure: Futurum Research is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually, based on data and other information that might have been provided for validation, and are not those of Futurum Research as a whole.

Other insights from Futurum Research:

Adobe Announces Generative AI innovations Across Adobe Experience Cloud at Adobe Summit

Generative AI from IBM Comes to Golf at the Masters Tournament

Microsoft’s AI Approach Dissected: What It Plans for AI
