The Ramifications of ChatGPT Going Realtime Web

The News: On September 27, OpenAI announced on X that ChatGPT can now browse the internet in real time. “ChatGPT can now browse the internet to provide you with current and authoritative information, complete with direct links to sources. It is no longer limited to data before September 2021.”

Browsing is available to ChatGPT Plus and Enterprise customers as of September 27; users access it by choosing “Browse with Bing” under GPT-4. OpenAI says browsing will be made available to all ChatGPT users soon.

Read OpenAI’s post about ChatGPT on X here.

Analyst Take: OpenAI’s move has implications for the evolving search space and for large language model (LLM) performance. Here is my take.

Search: Experimentation, Competition

The way we search has not really evolved much over the past 20 years – we are pointed to links, but not necessarily to the exact point in a document where our answer might lie. The potential of generative AI to transform search is very appealing – can we ask a question and get back a specific answer?

So far, not exactly. Google Bard and Microsoft Bing have launched iterations of generative AI live search that return answer summaries with sources and links. It is a work in progress, and not just a technical issue but a business one as well – how do the search players continue to monetize search? That is a particularly important question for Google.

So what does it mean for ChatGPT to join the search market? It could certainly be disruptive in the near term. Why? Because OpenAI, at this point, may not be as concerned about monetizing search through advertisers as Google is. It is a bit confusing that you can “search” with ChatGPT via Bing and also search Bing directly – is this something of a cannibalization move for Microsoft? Maybe the company does not mind which brand delivers search for it, as long as it cuts into Google’s search market share.

LLM Performance – Better With Live Web?

In a great piece from the BBC on the ChatGPT news, it was pointed out that limiting an LLM’s access to data is partly a safety measure: a model cut off from the live web is not exposed to real-time misinformation and disinformation, and its non-real-time data sets can be curated.
Check this out: “Asked why it had taken so long to allow users to search up-to-date information, the chatbot (ChatGPT) itself provided three answers:

It said developing language models took a long time and was resource-intensive, that using real-time data had the potential to introduce inaccuracies, and that there were some privacy and ethical concerns about accessing real-time information – particularly copyrighted content without permission.”

Setting that aside, in theory, with the live web ChatGPT can train on and infer from an even bigger data set, which could enable the engine to return more accurate results. Exactly how that works is unclear at this point, though we will all be working with OpenAI, Microsoft, and Google to find answers to that question. In many ways, the potential to improve LLM results is the more intriguing market question for enterprises leveraging LLMs.
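To make that question a bit more concrete, one plausible reading of “live web” is retrieval at answer time rather than retraining: the assistant searches, pulls back snippets and links, and grounds its response in them. The sketch below is purely illustrative – OpenAI has not detailed its implementation – and web_search and llm_complete are hypothetical stand-ins, not real OpenAI or Bing APIs.

```python
# Minimal, illustrative sketch of a "browse, then answer" pattern.
# The model is not retrained on live data; search results are fetched
# at inference time and supplied in the prompt as grounding context.
# web_search() and llm_complete() are hypothetical stand-ins.

from dataclasses import dataclass


@dataclass
class SearchHit:
    title: str
    url: str
    snippet: str


def web_search(query: str) -> list[SearchHit]:
    """Hypothetical search call (e.g., a Bing-style API) returning ranked hits."""
    raise NotImplementedError("swap in a real search client here")


def llm_complete(prompt: str) -> str:
    """Hypothetical call to a chat/completions endpoint."""
    raise NotImplementedError("swap in a real model client here")


def browse_and_answer(question: str, max_hits: int = 3) -> str:
    # Fetch current results, then ask the model to answer only from them,
    # citing sources by number so the reply can include direct links.
    hits = web_search(question)[:max_hits]
    context = "\n".join(
        f"[{i + 1}] {h.title} ({h.url}): {h.snippet}" for i, h in enumerate(hits)
    )
    prompt = (
        "Answer the question using only the sources below, citing them "
        "by number with their links.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return llm_complete(prompt)
```

If browsing works roughly along these lines, the gains come from fresher grounding data at inference time rather than from a larger training corpus – which is exactly why the accuracy question remains open.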

Bottom line, search via generative AI assistants such as ChatGPT, Bard, and Bing is a work in progress, and we are just getting started on understanding these tools’ impact and effectiveness. Tread carefully.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other insights from The Futurum Group:

OpenAI ChatGPT Enterprise: A Tall Order

Google Search Generative Experience: Will Gen AI Impact Search?

Google, Microsoft, OpenAI, and Anthropic Form AI Industry Group

Author Information

Based in Tampa, Florida, Mark is a veteran market research analyst with 25 years of experience interpreting the technology business, and he holds a Bachelor of Science from the University of Florida.
