OpenAI Announcements

The Six Five team discusses OpenAI Announcements.

If you are interested in watching the full episode you can check it out here.

Disclaimer: The Six Five Webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript:

Daniel Newman: I don’t know if you’ve followed what happens in the average week of OpenAI, but even when you just say OpenAI announcements, there were like five: there was a developer day, there was a GPT-4 Turbo day, there was some new build-your-own-LLM adventure day. So I’m going to talk about a couple of things that went on this week and maybe you can fill some gaps.

Patrick Moorhead: Good.

Daniel Newman: Two things I want to really talk about. One is GPT-4 Turbo. As we all know, there are some big updates that need to be made, and the company continues to make them. Now look, the growth of LLMs and FMs, foundation models, that companies can use continues to scale. And the idea of any one model sort of dominating all models, I think we’re actually seeing a fairly steep decline in that curve.

Having said that, OpenAI, is it the grandfather or grandmother of all LLMs? It’s the one that really put this whole generative AI trend on the map. And so the company’s making changes to make it better. First change: it’s moving the knowledge cutoff. One of the things that I think a lot of people complained about with OpenAI was its older cutoff. As you know, when you asked it questions, it was working from 2021 data. A lot of information was created in the last two years, so the new model now takes us all the way through April 2023. Now between you, me and the fence post, that’s still six months ago. Think about all the things that have taken place in the last six months. If you asked a question about, say, the conflict in the Middle East right now, six-month-old information would be very problematic.

So you can see why real time’s important, but it’s good to see them getting up to date. They’re also increasing the context window, meaning you can go with longer prompts. As you want to be more interactive and have more of a human-machine empathic relationship, which isn’t real but possible, you need to be able to talk to it much like you talk to another human. You need to be able to give it more context. So it’s adding that, and it also improved instruction following. They also, by the way, lowered their developer pricing. This isn’t as much of a thing that you and I are going to be thinking about every day, but they’re down to a penny per 1,000 prompt tokens. So for developers, it basically costs less to be sending inputs and receiving answers.
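To make the pricing point concrete, here is a minimal back-of-the-envelope sketch of what a single API call costs at the rates announced at Dev Day: $0.01 per 1,000 prompt (input) tokens, and $0.03 per 1,000 completion (output) tokens for GPT-4 Turbo. The token counts in the example are purely illustrative.

```python
# Rough cost sketch for the GPT-4 Turbo developer pricing discussed above.
# Rates: $0.01 per 1K prompt tokens, $0.03 per 1K completion tokens.

INPUT_RATE_PER_1K = 0.01   # USD per 1,000 prompt (input) tokens
OUTPUT_RATE_PER_1K = 0.03  # USD per 1,000 completion (output) tokens

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate the cost in USD of a single API call."""
    return (prompt_tokens / 1000) * INPUT_RATE_PER_1K + \
           (completion_tokens / 1000) * OUTPUT_RATE_PER_1K

# Illustrative example: an 8,000-token grounding prompt with a
# 1,000-token answer costs about eleven cents.
cost = request_cost(8000, 1000)
print(f"${cost:.2f}")
```

At that rate, even prompt-heavy workloads, such as stuffing long documents into the newly enlarged context window, stay in the cents-per-call range, which is why the price cut mattered so much to developers.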

There were some other things that came out too, but the one other thing I just want to mention, and I’m going to use this as a handoff moment because you’ve also been talking about this pretty passionately, Pat, is that this week they came out with anyone-can-create-your-own-version-of-ChatGPT. And so after spending a little bit of money earlier this year trying to start building the future-of-AI product, regretfully, in less than six months they basically enabled every company on the planet to build their own LLM, and do it in a way that supposedly gives you developer capabilities with no code: upload your own data, keep it fully grounded, and keep that data separate from the public domain data. That’s now out.

So within one year it went from, hey, here’s one LLM model that works for everybody, where everybody gets the same and consistent experience, to everybody who subscribes for a handful of dollars a month can now build their own personal digital assistant. And Pat, every day it feels like OpenAI and these other companies come out with a new feature that basically ends an entire venture cycle. How many companies do you think were doing a build-your-own-digital-assistant product that just woke up one morning and they’re like, “Well, crap, our company is no longer relevant.” Now again, I realize that’s not always true. Competition’s good; we just said that in the Arm category. But if you’re a small startup, we all know, I think Rob Thomas at IBM Day talked about something like $500 million being the least you could invest if you were serious about building your own foundation models.

To know that overnight they’re shipping basically a build-your-own, no-code product: you need no developer skills. All you need is to know how to upload documents and you can create a digital assistant and connect it to your calendars and connect it to your apps and connect it… Wow. And this came out this Monday on Dev Day. So Pat, with 100 million weekly users, a really, really big audience, the support of Microsoft, and a whole big ecosystem around it, I’m totally blown away at how quickly this company’s moving. I don’t want to take all the thunder out of this one, but they’re moving quickly. They’re making a lot of updates, and now apparently you and I, with no coding skills, can build our own digital assistant with all the ChatGPT capabilities.

Patrick Moorhead: Yeah, I am very excited about these new, what they’re calling GPTs, right? And these are custom versions of ChatGPT. It’s literally drag-and-drop capability. Dan, you and I had a conversation when this whole generative AI thing kicked off and we were thinking about different deployment methods, and you were doing a lot of work on this upfront, which had a lot of value. And I was like, you know what? I’m going to wait until somebody comes out with the drag and drop and then lean into this thing, and maybe slow-roll it. But for a small and medium business, I think this is excellent. Essentially, you create the type of GPT that you want. It’s funny, one of the examples in there was called Tech Advisor, along with a Creative Writing Coach, Laundry Buddy, Sticker Whiz, and so on. I think they gave a really good flavor of all the things you could do.

And it’s push-button. There is no code. This is an application. And then what you do to customize it, or ground the model, is you throw in all the content that you want it to learn on and it sucks it in, and you can even train it when it makes a mistake. Like, no, no, that’s not accurate; this is the right answer. So you’re creating your own. I think this is going to be seriously sticky. The one open question here is I need to do some more research on how they’re protecting data. It’s one thing, for example, for me to throw all my white papers, all my blogs, all the transcripts of all my videos into this thing. But what about other types of data? I think right now it’s probably really good for public data. And oh, by the way, who owns the output of this, and how can you monetize that?

But very impressed. I’ll be honest, I did not expect this because, quite frankly, this is consumery, small-business type of stuff, and I didn’t expect them to bring it out this quickly. But yeah, I remember you were trying out, and I was trying out, a few of these startups and what they could do, and they were using OpenAI as a backend. I’m wondering if this just completely wipes them out. I mean, we’ll see. It would not be fun to be the VC who gave somebody $50 million to figure out if this was the right move.

Daniel Newman: It’s kind of like the people that are spending billions standing up an AI data center thinking that AWS isn’t going to build their own. I’m not saying AI data centers won’t be a thing. I’m just saying that there are these kinds of massive-scale investments, and it’s like, oh, we’re going to need this. And it’s like, oh, maybe we’re not. And so, hey man, listen, I can admit when I get something wrong. I was absolutely right about the opportunity, but I was absolutely wrong about the ability to differentiate and try to compete against big tech. Big tech is democratizing this thing. And by the way, Pat, this is great for the world. It’s really exciting. I’m very positive on it. It’s going to continue to push us to find ways to differentiate when all this is available to everybody. This isn’t a differentiator anymore. It’s just a thing. So it’s interesting.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, his most recent book is “Human/Machine.” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
