OpenAI Announcements

The Six Five team discusses OpenAI Announcements.

If you are interested in watching the full episode you can check it out here.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: I don’t know if you’ve followed what happens in the average week of OpenAI, but even when you just say OpenAI announcements, there were like five: there was a developer day, there was a GPT-4 Turbo day, there was some new build-your-own-LLM adventure day. So I’m going to talk about a couple of things that went on this week and maybe you can fill some gaps.

Patrick Moorhead: Good.

Daniel Newman: Two things I really want to talk about, and one is GPT-4 Turbo. As we all know, there are some big updates that need to be made, and the company continues to make them. Now look, the growth of LLMs and FMs, foundation models that companies can use, continues to scale. And as for the idea of any one model dominating all the others, I think we’re actually seeing a fairly steep decline in that curve.

Having said that, OpenAI, is it the grandfather, grandmother of all LLMs? It’s the one that really put this whole generative AI trend on the map. And so the company’s making changes to make it better. First change: it’s moving the knowledge cutoff. One of the things a lot of people complained about with OpenAI was its older cutoff. As you know, when you asked it questions, it was working from data that stopped in 2021. A lot of information was created in the last two years, so the new model now takes us all the way through April 2023. Now between you, me and the fence post, that’s still about six months ago. Think about all the things that have taken place in the last six months. If you asked a question about, say, the conflict in the Middle East right now, six-month-old information would be very problematic.

So you can see why real time’s important, but it’s good to see them getting up to date. They’re also increasing the context length, meaning you can go with longer prompts. As you want to be more interactive and have more of a human-machine empathic relationship, which isn’t real but possible, you need to be able to talk to it much like you talk to another human. You need to be able to give it more context. So it’s adding that, and it also improved instruction following. They also, by the way, lowered their developer pricing. This isn’t as much of a thing that you and I are going to be thinking about every day, but they’re down to a penny per 1,000 prompt tokens. So for developers, it basically costs less to send inputs in and receive answers back.
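To put that penny-per-1,000-prompt-tokens figure in perspective, here is a quick back-of-the-envelope cost sketch. The $0.01 per 1,000 input tokens matches the pricing discussed above; the $0.03 per 1,000 output tokens is an assumption about the completion-side rate, so treat both numbers as placeholders to verify against current published pricing.

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  input_rate: float = 0.01, output_rate: float = 0.03) -> float:
    """Estimate the dollar cost of one API call.

    Rates are dollars per 1,000 tokens; the defaults reflect the
    GPT-4 Turbo pricing discussed above (output rate assumed).
    """
    return (prompt_tokens / 1000) * input_rate + (completion_tokens / 1000) * output_rate

# Example: a 2,000-token prompt that produces a 500-token answer.
cost = estimate_cost(2000, 500)
print(f"${cost:.4f}")  # $0.0350
```

At these rates, even a long prompt with a substantial answer costs a few cents, which is the point Daniel is making about developer economics.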

There were some other things that came out too, but the one other thing I want to mention, and I’ll use it as a handoff moment because you’ve also been talking about this pretty passionately, Pat: this week they came out with anyone-can-create-your-own-version-of-ChatGPT. After spending a little bit of money earlier this year trying to start building the future-of-AI product, regretfully, in less than six months they basically enabled every company on the planet to build their own LLM assistant, and do it in a way that supposedly gives you developer capabilities with no code: upload your own data, keep it fully grounded, and keep that data separate from the public domain data. That’s now out.

So within one year, it went from, hey, here’s one LLM model that works for everybody, where everybody gets the same and consistent experience, to everybody who subscribes for a modest fee can now build their own personal digital assistant. And Pat, every day it feels like OpenAI and these other companies come out with a new feature that basically ends an entire venture cycle. How many companies do you think were building a build-your-own-digital-assistant product that just woke up one morning and said, “Well, crap, our company is no longer relevant”? Now again, I realize that’s not always true. Competition’s good; we just said that in the Arm segment. But if you’re a small startup, we all know, I think Rob Thomas at IBM Day talked about something like $500 million being the least you could invest if you were serious about building your own foundational models.

To know that overnight they’re offering basically a build-your-own, no-code tool, where you need no developer skills. All you need is to know how to upload documents, and you can create a digital assistant and connect it to your calendars, connect it to your apps, connect it to … Wow. And this came out on Monday at dev day. So Pat, with 100 million weekly users, a really, really big audience, the support of Microsoft and a whole big ecosystem around it, I’m totally blown away at how quickly this company’s moving. I don’t want to take all the thunder out of this one, but they’re moving quickly, they’re making a lot of updates, and now apparently you and I, with no coding skills, can build our own digital assistant with all the ChatGPT capabilities.

Patrick Moorhead: Yeah, I am very excited about these new, what they’re calling GPTs, right? And these are custom versions of ChatGPT. It’s literally drag-and-drop capability. Dan, you and I had a conversation when this whole generative AI thing kicked off, and we were thinking about different deployment methods, and you were doing a lot of work on this upfront, which had a lot of value. And I was like, you know what? I’m going to wait until somebody comes out with the drag and drop and then lean into this thing, maybe slow roll it. But for a small and medium business, I think this is excellent. Essentially, you create the type of GPT that you want. It’s funny, one of the examples in there was called Tech Advisor, plus a Creative Writing Coach, a Laundry Buddy, a Sticker Whiz. I think they gave a really good flavor of all the things you could do.

And it’s push-button. There is no code. This is an application. And then what you do to customize it, or ground the model, is you throw in all the content you want it to learn on and it sucks it in, and you can even correct it when it makes a mistake: no, no, that’s not accurate, this is the right answer. So you’re creating your own. I think this is going to be seriously sticky. The one open question here is that I need to do some more research on how they’re protecting data. It’s one thing, for example, for me to throw all my white papers, all my blogs, all the transcripts of all my videos into this thing. But what about other types of data? I think right now it’s probably really good for public data. And oh, by the way, who owns the output of this, and how can you monetize that?

But very impressed. I’ll be honest, I did not expect this, because quite frankly this is consumery, this is small-business type of stuff, and I didn’t expect them to bring it out this quickly. But yeah, I remember you were trying out, and I was trying out, a few of these startups and what they could do, and they were using OpenAI as a backend. I’m wondering if this just completely wipes them out. I mean, we’ll see. It would not be fun to be a VC who gave a startup $50 million and now has to figure out if that was the right move.

Daniel Newman: It’s kind of like the people spending billions standing up an AI data center thinking that AWS isn’t going to build its own. I’m not saying AI data centers won’t be a thing. I’m just saying there are these massive-scale investments where it’s like, oh, we’re going to need this, and then it’s like, oh, maybe we’re not. And so, hey man, listen, I can admit when I get something wrong: I was absolutely right about the opportunity, but I was absolutely wrong about the ability to differentiate and try to compete against big tech. Big tech is democratizing this thing. And by the way, Pat, this is great for the world. It’s really exciting. I’m very positive on it. It’s going to continue to push us to find ways to differentiate when all this is available to everybody. This isn’t different anymore. It’s just a thing. So it’s interesting.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, most recently of “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and Former Graduate Adjunct Faculty, Daniel is an Austin Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
