Is ChatGPT’s OpenAI Looking to Make Its Own AI Chips?

Of all the AI chip strategies among the world's generative AI startups, ChatGPT's OpenAI is reportedly contemplating a move into the AI chip-making business itself. According to a recent Reuters report, OpenAI, the creator of ChatGPT, is exploring the idea of building its own chips, acquiring an existing chip maker, or expanding its pool of chip suppliers beyond NVIDIA, which is its lone supplier today. Reuters said the company has been discussing various options since 2022 due to shortages of the kinds of AI chips its work requires. No decision has been made by OpenAI on the matter, according to Reuters, but the possibility certainly raises some interesting issues.

I get that OpenAI might want more control over the availability and pricing of AI chips, which are critical to the company's operations and business. That makes perfect sense. So, the Reuters story is right on point and relates an intriguing storyline that is apparently unfolding in the executive offices and hallways of OpenAI.

At first glance, it looks like creating its own AI chips could be a good idea for OpenAI to consider. Sure, build your own chips so you do not have to rely on anyone else; you can produce and secure the chip supplies that you need to serve yourself and keep your company ahead of your competitors.

On second glance, though, you must consider the significant ramifications of making such a move, and they are not small. To start, there are the immense costs of taking on the manufacturing of AI chips. You think paying someone else for their chips is expensive? Then start looking at what it will cost you to design your own chips and then build your own chip-making facilities, or line up a fab that might have the capacity to make them for you. What will it cost to develop a roadmap of new and better chips on a never-ending schedule into the future? And as if that is not enough, what about your own supply chain worries about keeping the chips flowing? There are an awful lot of zeroes in the price tags for such operations, even if you decide to acquire an existing chip maker or hire a fab to make the chips for you.

Meanwhile, let us say that even with all these challenges you still decide to pursue the idea. Where does that leave you?

Well, none of these complex processes will happen quickly, so your new AI chips will only start coming to market years after you begin the work. That means your competitors will have been advancing their core technologies all that time, while you were just getting things off the ground. In that span, those competitors will be upgrading and replacing their products with faster chips rather than spending their time and money getting started. It seems to me that you might need a very long time to catch up and make it worthwhile, if you could catch up at all.

You know, all of this makes my head spin. It makes me think of carmakers and the similar decisions they must make each year to introduce new car and truck models. Stamping machines that make body panels, engine production lines and casting systems that must be changed out, and a million other decisions and production steps are affected. And sometimes, by the time their new vehicle models are out after a few years, the market might have changed. Oops. OpenAI might not want to get into that situation at all.

My Bottom Line on the ChatGPT OpenAI Chip-Making Rumor

First, let us remember that so far, this story is just that: a rumor, based on a single Reuters report. Maybe it will not come to fruition. Maybe it is just the company floating an idea in the marketplace.
However, maybe the idea of OpenAI producing its own chips is not so crazy.

Maybe OpenAI could pull off this idea with the continuing help of Microsoft and other financial backers that might envision real benefits from such an uphill battle. Yes, I might see it as more posturing and ego than as a smart strategy at this point, but maybe OpenAI foresees something I am missing in my own crystal ball.

Certainly, AI chips are never going to be inexpensive to develop. There must be a very attractive business reason to jump into that market, given all its risks. And maybe OpenAI has such a reason in mind that is on the company’s corporate radar and that none of us yet understand.

Or perhaps all this talk is just to stir the field and make some noise. OpenAI is not shy about making noise in the AI marketplace, so that is also possible.

For now, we will have to wait and see what happens. Having followed the swift global expansion of generative AI in the marketplace over the past few years, technology watchers like me would not have expected that producing one's own AI chips would become part of the strategy.

But as I think about it, OpenAI has never looked at things the way other companies look at things. With new ideas, visions, technologies, and directions, ChatGPT’s OpenAI could well be looking at solving its own IT challenges by using what could be yet another all-new approach, by producing its own AI chips. It will be fascinating to watch how this goes and to learn whether the rumors are true or not.

Other insights from The Futurum Group:

The Ramifications of ChatGPT Going Realtime Web

OpenAI ChatGPT Enterprise: A Tall Order

Google Cloud’s TPU v5e Accelerates the AI Compute War
