Enterprise-grade generative AI powered by Google Cloud – Six Five On the Road

On this episode of The Six Five – On The Road, hosts Daniel Newman and Patrick Moorhead welcome Warren Barkley, Director of Product Management, Google Cloud AI, for a conversation on Google’s decades of investment in research and AI, what innovating with Generative AI while maintaining standards of ethics and compliance looks like, and a glimpse into the future of AI from Google’s perspective.

Their discussion covers:

  • What uniquely positions Google Cloud to rise to the challenge and opportunity in front of Google today
  • The latest Generative AI innovations Google Cloud is bringing to enterprises today
  • What makes Google Cloud the partner of choice for enterprises embarking on their Generative AI journey
  • How Google is balancing the need to innovate and move fast, with the need to be responsible, secure, ethical, and compliant
  • A look ahead at the future of AI from Google’s perspective

Be sure to subscribe to The Six Five Webcast, so you never miss an episode.

Disclaimer: The Six Five webcast is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript:

Patrick Moorhead: Hi, this is Pat Moorhead and The Six Five is on the virtual road at Google Cloud Next 2023. It is post-event. Dan, you and I went to the event, we covered the pre-event and now we’re doing the follow-up. It’s kind of like wrapping a red ribbon around the tree.

Daniel Newman: Well, part of the fun of the analyst track, Pat, was we got to build some generative AI tools. They actually let us play with it. And so I just want to alert everybody that there is a chance that you’re actually talking to Dan Bot 1.0, and this is purely a generative version of my existence. And when we’re done, you can tell me how smart I sounded, ’cause if it was really smart, then it may really just be me. But no, seriously, Pat, great event. It was a lot of fun, a lot of action, and a lot of AI.

Patrick Moorhead: Yeah. Dan, you might want to work on the grounding of the Dan bot a little bit here. But no, in all seriousness, what have the last, gosh, 10 months of our analyst lives been about? It’s been about generative AI, and at Google Cloud Next 2023, that was the primary topic of conversation. It’s what enterprises and partners came for, and Google definitely delivered. And we have with us essentially Mr. Google Cloud AI. Warren, how are you? Thanks for coming on the show.

Warren Barkley: Yeah, of course. Thanks for having me. I’m super great. My voice made it through the week so I can-

Patrick Moorhead: Amazing.

Warren Barkley: Yeah, so I’m very happy to be here and survive, so it was great.

Patrick Moorhead: Thanks.

Daniel Newman: All right, so I see the keyboard Clavinova thing in the background.

Warren Barkley: Yes.

Daniel Newman: Do you play?

Warren Barkley: I have a music degree, but I don’t play piano very well.

Daniel Newman: Sure.

Warren Barkley: I more play other things, guitar and things like that.

Daniel Newman: Right.

Warren Barkley: But yeah, long story short, it was music school or Caltech and I picked music school ’cause it looked more fun, and now I’m in technology.

Daniel Newman: And look at you now. I was going to say the irony is not lost. And all those people that say don’t get a music degree, don’t get a psychology degree, and now you are leading product for one of the largest technology companies on the planet. So I’d say there’s a lot of paths, a lot of different ways to get where you’re going, so I love that.

Warren Barkley: Exactly.

Daniel Newman: I actually was classically trained in the piano too, so I saw that, had to ask. Don’t know how I didn’t see it in the green room. Kind of want to go back there and jam out with you, but we’ll do it another time.

Warren Barkley: Yeah, you could whip out a little bit of Liszt or something like that and see.

Daniel Newman: Awesome. We will have to talk about that at some other point ’cause right now we’re going to talk about Google Cloud Next. So give us the background. It was a big, exciting week. You could see a lot of progress. Of course Google has had a big year in AI, but Google has been doing this a long time. I mean, you’re talking decades of investment and research around AI. Talk about how this ties to making Google Cloud so uniquely positioned to rise right now, to grow right now, and to meet the opportunity of the day.

Warren Barkley: Yeah, we really have an abundance of riches when it comes to innovation in AI. We’ve been doing this for 15-plus years and had these huge breakthroughs in things like the Transformer and other things like that. I think it’s not just the fact that we’ve done that, but we also use it internally. We’re one of the bigger users of artificial intelligence in the world, and that’s taught us a whole bunch of things that allows us to bring that to cloud.

The other piece is that it’s not just a 3,000-researchers type of thing, it’s more the fact that we have a culture of innovation within research that allows people to have these breakthroughs. And when we in cloud reach out to research and say, hey, we’ve got this problem, they’re right there, ready to help us. It’s an amazing partnership to work with them on bringing this technology to market.

Patrick Moorhead: I like to use the term planet scale when it comes to Google. In fact, I was at one of your biggest suppliers back in the mid-2000s, and your scale, even before the cloud was real and relevant, was enormous. You had planet-scale infrastructure, and leveraging that know-how and that technology on the enterprise side took a few years, but here you are. I think TK said it when I talked to him: if you were a software company on your own, you’d be the number four or five biggest provider out there. And I think that’s pretty cool.

So let’s talk about generative AI and what was announced. One of my biggest takeaways, one that I was really impressed with: even though I know generative AI is a marathon versus a sprint, part of the market is looking at it as a sprint. Hey, this company had a 60-day lead on the press release or the announcement or the demo, or private preview, public preview, general availability. But I was very impressed with the amount of generative AI capability that went GA while we were sitting there. Can you talk about what some of these are that you’re bringing to customers today?

Warren Barkley: Yeah, absolutely. I mean, we’ve been moving at just an amazing pace over the last year or so to bring this stuff to market. Someone asked me what I’m most proud of that we brought, ’cause there are 30-something services in the portfolio and literally dozens of releases. One of the things that I think really brings interesting value to customers is fine-tuning: the ability to fine-tune on Llama 2, and you can do that on our models as well.

Patrick Moorhead: Yes.

Warren Barkley: And then the fact that you can do this adapter model technology that we innovated, which allows you to keep the tuning and keep the data within your tenant. I think that’s something that people really were looking for. There’s a lot of fear about, hey, is this going to escape? So innovating in some of those areas has been a really big deal for us.

Duet is pretty cool. It really does some cool things, whether that’s helping you just be an administrator in Google Cloud or actually helping you write code, those types of pieces. And so we brought a whole bunch of things, a whole bunch of modalities, image, speech, etcetera, and really took those from experiment, to public preview, to GA super quickly. And there’s obviously more to come.
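
For readers who want to see what that kind of adapter-style tuning looks like in practice, here is a minimal sketch, assuming the Vertex AI Python SDK (google-cloud-aiplatform); the project, bucket path, and step count are illustrative placeholders rather than anything discussed above.

```python
# A minimal sketch of adapter-style (parameter-efficient) tuning, assuming the
# Vertex AI Python SDK (google-cloud-aiplatform). Project, bucket path, and
# step count are illustrative placeholders, not values from the conversation.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-project", location="us-central1")

# Start from the pretrained foundation model; tuning trains a small adapter on
# top of it instead of rewriting the base weights, and the tuned adapter plus
# the tuning data stay inside your own project.
model = TextGenerationModel.from_pretrained("text-bison@001")

model.tune_model(
    training_data="gs://my-bucket/tuning-data.jsonl",  # JSONL of prompt/response pairs
    train_steps=100,
    tuning_job_location="europe-west4",
    tuned_model_location="us-central1",
)

# Once the tuning job completes, the tuned model is called like any other text
# model (exact retrieval of the tuned handle can vary by SDK version).
print(model.predict("Summarize this support ticket in two sentences: ...").text)
```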

Daniel Newman: So one of the things that I think is important right now, Warren, is going to be winning in the enterprise. Of course you’ve got a lot of partners building on Google, and we noticed that the ecosystem at Google Cloud Next was extremely rich. But I think right now there’s a bit of a gold rush to be able to say, we are the company, we are the hyperscale AI cloud partner to the enterprise, and we’re going to be the enabler of their most complex AI projects and of course the everyday ones. And of course there’s some choice in the marketplace.

When you’re saying, in short, this is the reason to go with us, and of course you’re in the room with probably one, two, three others that you’re often up against in these decision moments, how are you driving this? And specifically with generative, how are you making sure that Google wins those kinds of duels?

Warren Barkley: Yeah, I think part of it is choice, right? We’re giving people a choice. You saw the announcements around Llama 2, around Code Llama. Code Llama came out three, four days before Next and it was shipped by the time Next opened.

Daniel Newman: Right.

Warren Barkley: And so our ability to kind of pick up these models, we talk about Anthropic, we have a hundred OSS models in the Model Garden, as well as our first-party ones. That’s a big part of it. That Model Garden piece is really important. And I think giving people choice, because the reality is that there’s not one model to rule them all, and you want to have choice. I think that’s the first piece. I think it’s super interesting for customers and it lands really well. They want to do those types of things.

The second piece is really around the tooling and where the layers of the tooling happen. So if you want to go build conversational bots, do you really want to use the model directly? Some people do, ’cause they have very specific scenarios. Other people want to use conversational AI and the bot-building features and functionality of that. And so it’s allowing people, whether they’re partners or customers, to go as deep as they possibly want, right down into the model itself, or to say, hey, I just need an API, or I just need an iframe. Just give me something really simple that I can go build.

And so things like info bot, being able to point that at your website and create a bot in an hour or two hours, something like that. These are the types of things that no matter where you come from, you have the tool set that you can use to really deploy it.

Patrick Moorhead: So there was, I think, a sigh of relief with enterprises when Google showed up with open models, and it should have been a… I think people were looking at the consumer side and they’re like, oh my gosh, is this how this is going to be in the future? And for the most part, and the industry doesn’t agree on everything all the time, but it’s going to be a combination of proprietary models, specialty models, open models, different ways of grounding, APIs like you talked about. I do think it’s interesting, some of the innovations you brought to the table for people who wanted pretty much that guarantee that their proprietary data is going to be protected. And I think this is part and parcel of getting wide distribution of that.

There’s a lot of talk about proprietary data, but there’s also a lot of talk about responsible, ethical compliance as well. I mean, I spent… The other analysts hate it when I say this, I spent half my career, I had a real job. For 20 years I was in product management and product marketing and I know product management might say, eh, product marketing. But I was in products and my teams and I were always in front of, hey, we can double the pace, but this is the risk profile. Do we want to do this or not? I’m curious, how do you balance moving fast or faster with the very important elements of responsibility, security, ethics, data compliance?

Warren Barkley: Yeah, I think that it can be really tough. I think at the beginning of this whole GenAI wave, folks were like, Hey, where are you?

Patrick Moorhead: Right.

Warren Barkley: And part of it is that there’s this natural conservatism, because we’re worried about the safety, really worried about it. One of the stories that happened a couple of months ago, before we actually went GA with one of our model sets: I got a call just before midnight saying, hey, by the way, the last test run found that in certain languages, our safety filters aren’t working really well.

Patrick Moorhead: Yeah.

Warren Barkley: And I was like, okay, don’t ship it. That’s how seriously we take it. And I think part of that is also exposing some of that stuff to our customers, so that when you call an API with our models, it actually returns all the scores that we look at, hate and other things, all the categories, and gives you the ability to then block. If you get a 0.6 in one category and you’re sensitive about that type of thing, you have the ability to block the actual response. It’s about giving people flexibility but understanding that safety is first within that, and I think that’s a big piece of it.

The other thing is we’re really pushing hard in some of the other areas, like in images and things, where we announced watermarking, partnering with DeepMind on SynthID and those things. And I think the industry has to get its arms around that stuff, and we want to help lead folks and make that happen. But ultimately, we have a very big team. We’re very concerned about safety, about making sure that what we’re doing has a net positive effect on the market, and about giving our enterprise customers tools to understand it, because it’s not good enough just to say we block it. We want to make sure that they understand what’s going on.
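
To make the per-category scoring concrete, here is a hedged sketch of a caller-side blocking threshold, assuming the Vertex AI Python SDK exposes safety attributes on the text response; the category handling, the 0.6 threshold, and the placeholder project are illustrative and may differ by SDK version.

```python
# A hedged sketch of applying a caller-side blocking threshold to the
# per-category safety scores returned with a Vertex AI text response. The
# prompt, categories, and 0.6 threshold are illustrative assumptions.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-project", location="us-central1")  # placeholder project
model = TextGenerationModel.from_pretrained("text-bison@001")

response = model.predict("Write a short, friendly description of a kids' toy.")

BLOCK_THRESHOLD = 0.6  # example sensitivity; tune per policy, market, and category

# safety_attributes maps category names to scores; keep anything at or above
# the threshold and withhold the text.
flagged = {
    category: score
    for category, score in response.safety_attributes.items()
    if score >= BLOCK_THRESHOLD
}

if response.is_blocked or flagged:
    print("Response withheld by policy. Flagged categories:", flagged)
else:
    print(response.text)
```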

Patrick Moorhead: Well, and it’s very different across countries, and across industries inside of countries as well, and belief systems in a certain area of the United States aren’t equivalent in all areas. This has to be one of the toughest things. I try to put myself in your shoes and this would be really tough. It’s not something that you haven’t been doing for a long time, it’s just that generative AI kind of exacerbates this conversation, and not only with what it can do, but the multivariate types of data sets, the melding of private and public, but mostly what the technology can do.

Warren Barkley: Yeah, I mean, we’re lucky, and unlucky, in some respects here, because we have the consumer side, and so we have innovative things around disinformation, like Jigsaw, if you’ve ever looked at that project, and those types of things. And so we’ve been able to take a lot of that tech and apply it in an enterprise way, but also give that tooling, exactly to your point. In some parts of the world, certain things are more sensitive than others. Let people have the ability to block it if they want to, and even know that that’s where it is, because it’s not always fundamentally obvious.

And that’s part of it. The security thing is really interesting too. We released this Secure AI Framework initiative: how do you stop people from really pushing the models in bad ways? That’s another piece that we have a really big focus on, giving people tooling to be able to protect their businesses and their interests, and that’s a place we focus on a lot as well.

Daniel Newman: Yeah, it’s actually very interesting, just going back to something you said before about the advent of GenAI. One of the things, and I’m just curious on your take on this, but people believed it was new in November, when, if I’m not mistaken, generative tools had been in use for a few years. I’m pretty sure Workspace had been finishing my sentences somewhat accurately for two or three years. I just think it’s kind of funny how there’s an inflection where suddenly something becomes real to the world, but actually AI has been around for a long time.

Pat, we interviewed someone on this show, by the way, that had a PhD in AI from the ’80s. I’m just saying. So Google’s been at this a long time, and that provenance and that pedigree should be weighed by enterprises and customers, whether it’s the vast datasets or the filtering and recommendation engines Google has been able to build over the years. It has a lot of experience, and that experience of building something for yourself is generally a very good catalyst for building something that will be good for others. Not always, but usually, and especially when it’s being used for something so expansive.

So let’s do the look ahead. We’re future makers, Dan Bot here. Hopefully it did a good job hosting this show for me. But what are you excited about? I know you can’t give us all of the roadmap, but within what was shared and what is now public information, what are you fired up about? What are you really excited about? What are going to be the big drivers for you and your product in the next six to 12 months?

Warren Barkley: Yeah, we used to say a couple of years ago that there’s a revolution every nine months in AI, and we were pretty impressed with ourselves that we kept up with it, and now it’s every nine weeks. If you look at the stream of innovation that’s come out, it’s enormous. So it’s really hard, when you look at the future, to figure out what’s next. The things that I can see right now: multimodal is definitely going to be more of a thing. We have multimodal models today, like Chirp, our speech model, and multimodal embedding models exist, but you’ll see it in a bigger way.

When you look at a document, most documents aren’t just text. They have text and they have charts and they have pictures and things like that. So having models that understand where the actual picture is, and what the chart means in context with the language and things like that. I think you’re going to start seeing that type of stuff come next, which is beyond just text, beyond just images, each being dealt with in an independent way. So I think that’s going to be a big thing.

The other thing is faster and cheaper. You’re going to see that come out. We definitely have an awesome position because we have our own TPUs, and you saw some of the announcements that came out around how efficient they are, but we also have a great partnership with Nvidia on GPUs, and so we’re driving efficiencies in there like crazy. And I think what you’re going to see over time is that things are going to get faster and they’re going to get cheaper, and you can see those just over the horizon, most definitely.

Patrick Moorhead: It’s interesting. I understand why you’re excited. I’m excited about those things too. As an industry analyst trying to represent the needs of big enterprises, I’d love to see the continued focus on… And this is literally the second question I get after, what can I do with this: how do I keep my data secure? Should I take this leap to the public cloud? I mean, we’re 14 years into the public cloud and between 75 and 90% of enterprise data is still on-prem, right? So how do I get this data, either by moving it to the public cloud, or do I do some stuff on-prem with a…

By the way, TK kind of blew my mind when he said something to me, and he wasn’t challenging me on something. But he’s like, “Well, don’t forget too, that public data sets would have to come on-prem to do this as well.” I was like, oh wow, interesting. I’m usually talking about the cost of moving the data up, obviously the outcomes, but I think if you can develop the technologies and capabilities that put that at ease, this thing is going to be endless in terms of what enterprises could do, and I’m the most excited about that. I know you didn’t ask me, but I wanted to get that out there ’cause I’m really excited about this, ’cause I’m having to explain it to these folks on a daily, weekly basis.

Warren Barkley: Well, I have good news on that front, and part of the good news is just fundamentally GenAI needs less data. And so if you think about it, for me to train a model and fine tune a model or just fine tune a model, let’s just say…

Patrick Moorhead: Yeah.

Warren Barkley: A GenAI model, and get it to understand the style of the photos I wanted to create, or the style of the blog I wanted to use, or even be able to understand the taxonomy in a domain area.

Patrick Moorhead: Right.

Warren Barkley: The number of examples is just a few hundred. A year ago, when we were doing something like that, we’d need three or four million samples. And so even the amount of data you need to actually fine-tune these models is so much less, and the starting point is much easier than in the past.

Patrick Moorhead: Yeah.

Warren Barkley: And I think that that opens up a world of people getting going on this thing as we look over the next couple of quarters.
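
To put the “few hundred examples” in concrete terms, a tuning dataset is typically just a file of prompt/response pairs. The sketch below assembles a small JSONL file in Python; the input_text/output_text field names follow a commonly documented convention, and the legal-terminology records are invented for illustration.

```python
# A hedged sketch of assembling a small JSONL tuning dataset of prompt/response
# pairs. The input_text/output_text field names and the legal-terminology
# records are illustrative assumptions, not data from the conversation.
import json

examples = [
    {
        "input_text": "Explain 'force majeure' in one plain-English sentence.",
        "output_text": "A clause that excuses a party from its obligations when "
                       "extraordinary, unforeseeable events make performance impossible.",
    },
    {
        "input_text": "Explain 'indemnification' in one plain-English sentence.",
        "output_text": "A promise by one party to cover certain losses or damages "
                       "incurred by the other party.",
    },
    # ...a few hundred of these is often enough, versus millions a year or two ago.
]

with open("tuning-data.jsonl", "w", encoding="utf-8") as f:
    for record in examples:
        f.write(json.dumps(record) + "\n")

print(f"Wrote {len(examples)} tuning examples to tuning-data.jsonl")
```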

Daniel Newman: And just to be clear, and we will let you go, I promise, Warren, but when you say less data, you’re kind of talking about the waterfall effect of the large models, and then over time, as they get refined, they require less and less data in terms of the size of the smaller models that will be used for enterprise-specific use cases and such. Is that kind of what you meant? ’Cause obviously people keep hearing about the huge data and the huge requirements needed to develop a large language model, but that’s not the case as you get more narrow in terms of what you’re trying to use generative tools for.

Patrick Moorhead: I took it as training versus grounding.

Daniel Newman: I’m just wondering.

Warren Barkley: Yeah, it’s true, it’s a combination of both those things. And I think the first thing is when you have a pre-trained model like text-bison, any of those types of models, Llama 2… Not Llama 2, but some of the other models like that, what you end up having is that it’s already 98% there. And so when you want to get it to 100%, so it can understand legal terminology, let’s say, then the number of examples you have to give it is a few hundred, not a few million.

Daniel Newman: Wow.

Warren Barkley: And that just makes it so much easier than having to go, hey, I’ve got this blank model and I’ve got to train it from scratch. And then the grounding piece is really interesting, because, as we announced this week, you can ask the model a question, and it does great at understanding questions and language and things like that, but it may not have the data to answer it. And so you can actually redirect it to a corpus, your own data set, that is the authoritative answer for that question.

And so we have people who have benefits questions, right? Okay, it handles being able to understand the question, but where’s the answer? The answer is in your enterprise. You can make it go there and then have it answer from there, so you have an authoritative answer. So you get two really good things: less data to train and tune, and also the ability to have authority without having to do a ton of work.
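
The grounding flow Warren describes can be sketched generically as retrieve-then-generate: pull authoritative passages from your own corpus, then have the model answer only from them. The sketch below illustrates that pattern rather than Google’s specific grounding API; the tiny in-memory corpus, the keyword scoring, and the placeholder project are assumptions.

```python
# A hedged sketch of the retrieve-then-generate ("grounding") pattern: answer
# from an authoritative internal corpus rather than from the model's memory
# alone. The in-memory corpus, keyword scoring, and placeholder project are
# illustrative stand-ins for a real enterprise search index.
from typing import List

import vertexai
from vertexai.language_models import TextGenerationModel

CORPUS = [
    "Employees receive 18 weeks of paid parental leave.",            # illustrative
    "The dental plan covers two cleanings per calendar year.",       # illustrative
    "The company matches 401(k) contributions up to 4% of salary.",  # illustrative
]

def retrieve_passages(question: str, top_k: int = 2) -> List[str]:
    # Toy keyword-overlap ranking; a real deployment would query an enterprise
    # search service or vector index over your own documents.
    words = set(question.lower().split())
    ranked = sorted(CORPUS, key=lambda doc: len(words & set(doc.lower().split())), reverse=True)
    return ranked[:top_k]

def grounded_answer(question: str) -> str:
    context = "\n".join(retrieve_passages(question))
    prompt = (
        "Answer the question using ONLY the context below. "
        "If the context does not contain the answer, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    vertexai.init(project="my-project", location="us-central1")  # placeholder project
    model = TextGenerationModel.from_pretrained("text-bison@001")
    return model.predict(prompt).text

print(grounded_answer("How many weeks of parental leave do employees get?"))
```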

Daniel Newman: All right, this time I will let you go, but Warren, thank you so much for taking the time. There’s so much going on here. It’s been excellent to kind of get under the covers and talk about the event. Pat, I got to say I really do like the format of these kinds of post-event conversations.

Patrick Moorhead: Yeah.

Daniel Newman: It gives me a little bit more time, and, being that I’m pretty selfish… joke. I do like to have some time to think about what I hear, to talk to customers, to talk to other analysts.

Patrick Moorhead: Yeah.

Daniel Newman: Of course to you, and kind of process what we’ve heard. But Warren, you guys have a very exciting path forward.

Patrick Moorhead: Yeah.

Daniel Newman: Congratulations on all the success. I’m sure I could say for Pat and myself, we’re both excited to continue to track what Google Cloud is doing and of course communicate and synthesize these advancements as they continue to hit the market. So hope you’ll come back soon.

Warren Barkley: Great, thanks. Pleasure talking with you guys.

Daniel Newman: All right, everybody hit that subscribe button, join us for all of our episodes here on The Six Five. We’re on the road virtually here, but we appreciate it. Google Cloud Next was a lot of fun. Join us for all of our shows. For this one though, it’s time to say goodbye. We’ll see you later.

Author Information

Daniel is the CEO of The Futurum Group. Living his life at the intersection of people and technology, Daniel works with the world’s largest technology brands exploring Digital Transformation and how it is influencing the enterprise.

From the leading edge of AI to global technology policy, Daniel makes the connections between business, people and tech that are required for companies to benefit most from their technology investments. Daniel is a top 5 globally ranked industry analyst and his ideas are regularly cited or shared in television appearances by CNBC, Bloomberg, Wall Street Journal and hundreds of other sites around the world.

A 7x best-selling author, including his most recent book “Human/Machine,” Daniel is also a Forbes and MarketWatch (Dow Jones) contributor.

An MBA and former graduate adjunct faculty member, Daniel is an Austin, Texas transplant after 40 years in Chicago. His speaking takes him around the world each year as he shares his vision of the role technology will play in our future.
