Robby Stein, Google Search VP, was interviewed by Hassan Bhatti and Zain Khan last month. Just today someone DMed me and shared the interview (I wasn’t aware of it earlier). I watched it and found it interesting, so I’m sharing the transcript and key points here.
The interview is available on X and YouTube too!
In the text below I’ve added relevant links; the rest is just what was discussed in the interview.
Key Points from Robby Stein
- Motivation for AI Search: Robby Stein joined Google to leverage the profound opportunity presented by AI, enabling Google Search to answer complex, multi-sentence questions in a way that wasn’t previously possible by tapping into its vast, real-time information graphs.
- Changing User Behaviour: Users are already adapting to AI in search. They ask longer, more specific questions, and multimodal search (using voice and camera) is growing rapidly, with a 65% year-over-year increase in visual searching. Younger users are leading this trend.
- Key AI Features: Current flagship features include AI Overviews (used by 1.5 billion people for quick summaries), Multimodal Search (like Lens and Circle to Search), AI Mode for deep, conversational follow-ups, and the experimental Search Live for voice-based interaction.
- Advice for Builders: Focus on solving a valuable problem with clear metrics for success (like “helpfulness”). Use specific fine-tuning and prompting to align AI models with your product goals to ensure a consistent user experience.
- Product & Design Philosophy: When introducing transformative technology, start from the first principles of the user’s problem. However, lean on “well-worn paths” and existing user mental models. A new design pattern is only justified if it provides a massive leap in value that overcomes the cost of changing user habits.
- Google Search vs. Gemini: Use AI in Search for informational needs—research, shopping, local discovery, and connecting to the web. Use Gemini as a creative and productivity assistant for tasks like generating images, writing code, or creating business plans.
- Future of Content & SEO: The fundamental principle remains the same: create helpful, high-quality content. Google’s AI systems use the same signals that reward valuable content, creating an opportunity for discovery rather than just replacing web traffic.
- Personalization as the Future: AI will enable deeply personalised search results by understanding user context, history, and even information from other Google products (like Gmail) to tailor answers for subjective needs like trip planning or major purchase decisions.
Here is the full transcript of this interview
Hassan Bhatti: We have a very special guest here today with us, Robby Stein, welcome.
Robby Stein: Thanks so much.
Hassan Bhatti: So Robby has been a startup founder. He’s worked on some iconic products like Instagram, and today he’s leading the AI search team at Google. So Robby, welcome again. And this is an amazing opportunity, right? Like what you’re doing at Google, it’s changing the world, how search is reshaping the world we see today.
Would love to understand from you when you took this role, what was the core problem or opportunity that you felt you absolutely had to come in for and solve at Google?
Robby Stein: Yeah, I mean it was very clear to me that there’s a unique moment in time happening right now where it’s possible for services like Google to just do so much more for people, given the capabilities of AI. And it’s something that I’ve spent a lot of time on, thinking about it in the startup world in terms of AI products and ranking recommendations and AI experiences.
And it seemed like a profound opportunity in time to have it be really possible for Google search to really answer anything you might have on your mind now in a way that really wasn’t possible before AI. And so that really motivated me to rejoin Google and take on this really exciting opportunity.
Zain Khan: Oh, so one follow-up question there, right? I think it was also a very challenging opportunity. Google obviously is the dominating player in search, and over the last two years with the LLMs, a lot of people are asking, hey, has Google’s search dominance decreased? There are other players coming up.
So it’s a very challenging role as you were coming in. What were some thoughts going through your mind? Like, why are you the best person for this? What does Google have that you can transform in this space?
Robby Stein: Yeah, I mean, billions of people trust Google for so much. And I think, you know, it’s really just an unbelievable breadth of things people come to Google for every day. Whether it’s specific information about local businesses, shopping and product research, the list goes on and on and on. And so to be able to have people already coming to such a trusted product that’s delivered for so long, but then be able to do so much more for them.
So now instead of, you know, potentially asking the more basic version of a question, you could ask a multi-sentence question comparing multiple products on multiple dimensions. Put that right into Google search, and now with AI it can do a bunch of the research for you, but still tap into all of that incredible knowledge and context of the web, plus the information in Google’s real-time information systems: 50 billion products in the product graph, 250 million places in the places DB.
There’s just so much information built up over 25 years that, if encoded correctly into models and accessible via models, can really meaningfully expand what’s possible. And that, I think, was the opportunity that I felt, looking at this.
Hassan Bhatti: I think a lot of people still use Google in the classic search experience way: you go to the Google domain, you put in some keywords, and you get to a page with 10 relevant links. That’s kind of the standard for maybe the majority of people, and people are still getting used to the new AI features and the LLMs doing search. What do you think search will look like in the mainstream over the next 12 to 24 months?
Robby Stein: Yeah. Well it’s been actually a pretty profound shift in how people are using search even today. So 1.5 billion users now are using AI overviews, which is a quick and easy way to get information and context before obviously being able to go much deeper with your journey. And what’s neat about that is they’re actually asking different kinds of questions now.
They’re asking much longer questions, more specific questions. And you see this learning effect over time as it’s used more and more. So its effect is intensifying. The other area that we’ve seen a tremendous amount of growth in is around multimodal. So people are using their camera. They’re using on-screen context.
They take a screenshot on Android devices, you Circle to Search to ask a question about anything that you’re looking at. Whether you’re cooking and you need to take a picture and ask for a next step, or you take a picture of a flower you see, or a picture of your friend’s shoes, and ask, you know, where can I buy this? That’s now what people are doing.
And in both cases, they’re producing a lot of growth. I mean, in the multimodal case, there’s a 65% increase year-over-year in visual searching on Google. And for both AI Overviews and Lens, the people who are using them the most are actually some of our younger users, who naturally move between these modalities, between photo, voice, and text, and just ask what’s on their mind more naturally. And I think that’s going to become more and more the norm for everyday people.
It already is for most people. And then with the introduction of new experiences such as AI Mode, which we’ll talk about, I’m sure, a little bit, it’s easy to then follow up and go back and forth and really tap into the state-of-the-art models for your hardest questions. And so that’s really exciting when you see all of those things come together.
Hassan Bhatti: I’ve become a big fan of Google search again after Google I/O. It’s an amazing product with all of the things that are happening. So, we have a lot of builders who listen to our podcast and read our newsletter, and a lot of them would love to leverage Google’s search capabilities in the products they’re building, with the APIs and things like that. And Google has multiple of those products as well.
And it seems like, you know, search to some extent is a context problem. Curious what you’ve learned as you’ve worked on this search problem at Google. What are some lessons you can share with these builders who are trying to build products with search as an integral part? What can they do better to improve the experience for their users? Any lessons you have to share.
Robby Stein: Yeah, I mean, I think the big one, the first one, is to build something really valuable, particularly in the informational space, but just in general. You have to really have a clear sense of what problem you’re solving, and then a clear sense, metrics-wise, of whether you’ve achieved it or not. And you know, we spend a lot of time at Google studying helpfulness. What does it mean for information to be helpful for people?
Is it specific? Is it trustworthy? Is it something that’s backed up by another source? And so I think being able to really define success is critical, because with these AI models there are so many degrees of freedom and they can go in multiple directions. And so it ends up taking, via fine-tuning or prompting, pretty specific will to really define what you’re trying to do with them, so that they produce the response that’s most connected to your goal.
Otherwise, I think people kind of end up with unwieldy or inconsistent experiences. And it’s a little trickier. So there are a few things that we’ve thought about as we’ve built in this space, and that as a builder I would be thinking about as well, because otherwise it’s just hard to know if you’re doing well, and I think you need that North Star.
Zain Khan: Yeah. We’re seeing a ton of features coming out of both Gemini and Google search. I think AI Overviews is something that most listeners hopefully have seen by this point. If you’re using Google, you’ve probably seen one of those AI summaries at the top by now, I assume. But there’s a lot of other stuff that’s happening.
You talked about Circle to Search; there’s other stuff that I’ve seen inside the Google app. For folks who are listening and maybe haven’t gone that deep on the AI features, could you just tell them what the big stuff is that you’ve shipped that they should definitely try out?
Robby Stein: I mean, the first one’s obviously AI Overviews, which is now, you know, able to show up for an increasingly complex set of needs. So if you just ask a long question, just go right to Google and put in a question right now. You can just think, you know, “how do I”, and then ask a normal natural-language question.
It could be instructions for how to fix something, how to clean something. You will very likely see that AI Overview show up with helpful information and links to click in and dive in and go deeper. And that’s a big one that obviously has been exciting to see how much adoption people have had of it. The second one is in this Lens and multimodal context.
So if you open up the Google app on your phone, you can just use the camera and take a picture of something, and a lot of the time, if we think that an AI response is needed, you’ll actually just get an AI response right there. Sometimes it may bring up a shopping grid, for example. And it’s incredibly fun and useful.
You can even, you know, use some of our newest features that were out in Labs and are just starting to roll out. So these are really the foundational experiences that were out over the last year. And then this I/O, we announced AI Mode, which is a way to have a fully end-to-end generative experience that really lets you go deep with search and have a back and forth with follow-up questions to get at whatever you have on your mind. And you can use that via the little mode switcher.
So you could still type your question into Google and then click AI Mode, and it will trigger our most sophisticated frontier models. It will use Google search as a tool. So it’ll think of a bunch of queries and do this fan-out process. It’ll bring all of this information into the response, and you’ll be able to follow up with it and have a back and forth.
And you can also go directly to it from the apps as well. You’ll see a little icon for AI Mode, and that’s been really exciting. And then the last one, which is actually in Labs now, is the Search Live experiment we just rolled out for voice, where you can have a live way to talk to search.
So the insight is that once you have a model that is able to, you know, really understand the world’s knowledge and the context of the web, you can really ask it anything that’s on your mind. And what’s neat is you can do it in a live back and forth. So, if you’re in the car, you can just turn it on, or if you’re going for a walk, you can literally just start talking to Google, which is really neat and something you couldn’t do before. So these are all available today for people.
The Labs one, I believe, is in the US, so you’d have to turn that one on. And AI Mode is also in the US, but we just announced that in Labs it’s available in India as well. And we’re looking to bring many of these frontier AI experiences out globally over time.
Zain Khan: I think for folks who are listening, some of this definitely exists inside the Google app. If I’m going to be completely honest, I had just never used the Google app per se, because I always just used to go to google.com. But now, obviously, as part of covering the industry, I have to keep track of all these features. I actually do have the Google app and I use it, and a lot of the AI stuff is, yeah, I think very interesting to go check out.
Robby Stein: I have one recommendation for viewers, by the way. If there’s one tool to upgrade your utility life and your efficiency, add the Google app widget to your home screen. Obviously there’s a search box available on many Android phones, but on iOS you can add a widget that has four icons on it: search, AI Mode, voice, and camera. And so whenever you’re out in the world, you can just quickly get to the camera and see this experience I’m mentioning, take a picture of something, and you see AI. Or you can use voice and talk to it. It’s really cool.
Hassan Bhatti: I know you mentioned earlier that people are asking much longer questions and have much longer queries now. I remember when I was at the I/O event, someone from the Google team told me that there are a lot more questions related to work now as well. Just want to ask you, on your end, what were some other interesting or new user behaviors that you’re seeing when it comes to search and AI?
Robby Stein: Yeah. The questions are definitely getting longer and more specific. So, I think an example we’ve used before is something you may say like things to do in Nashville. But in the new world, it’s easy to say things to do in Nashville that allow for outdoor seating and I have a dog and an allergy in a specific way.
You could put that into Google now and that’s really fascinating and allows you to get that level of specificity to perfectly match your needs and we’re seeing that. I think we’re also seeing in a bunch of the areas where it’s particularly helpful to have more of a back and forth and you’re trying to understand something new.
They seem to be incredibly useful for learning just in general. So you see people trying to do everything from homework to learning about new technical topics that they read about, maybe in their work. It could be just topics or concepts in general. They ask follow-up questions. They want to understand more. They want it to give examples. And so you see that as well. And I think you also see it in just generally complex questions that people have.
So health is one. People put in lots of information, and then it’s like, what does this all mean? And then they want to understand better. Recommendations. So, I have this issue, I’m looking at a car, I’ve got two kids, I have this price point, this is the gas mileage or the charge mileage I expect for an EV. Whatever it is goes into Google, particularly with our newer experiences like AI Mode where you can have a back and forth.
That’s been, you know, I think pretty popular as well. So those are the things that I think are increasing, and obviously everything that’s included in the multimodal universe. The two biggest ones are around shopping and education. So people are taking pictures and screenshots of clothes that maybe celebrities or influencers are wearing; they’re trying to shop the look and understand more there. Obviously with homework, when you take a picture of a geometry question, you can get help with it because it can see what you’re looking at. And those kinds of needs are now growing really quickly.
And that’s very exciting because we can do hopefully a lot more for people in their day now with these products.
Zain Khan: Yeah, absolutely. I think one question that a lot of folks would have especially if they’re new to AI right now, Google has two very strong AI products. There is Google search with AI, but then there’s also Gemini. How should users think about this? Like when should someone be using Gemini and when should someone be looking to go for AI in search?
Robby Stein: AI Mode and the AI products of Google Search, they’re search. It’s really about getting you information as effortlessly as possible and connecting you to information and the web in a rich way. And I think: why do you come to Google search today? What kind of question would you put in there?
And now that you can kind of unlock your imagination of how specific your question could be, you can put it into search. So those are the kinds of domains we talked about, right? Either informational questions, maybe you’re shopping, maybe you’re trying to compare restaurants to go to, or you have a health question.
These are the kinds of things that we see people coming to search for a lot. Gemini, obviously, is an incredibly powerful product for lots of things, but one of the things it does as an assistant is create a lot of things for people. So you can generate images, you can code with it, it can make a business plan for you.
It can generate, you know, like a whole outline for your offsite. And so I think it’s really this assistant that’s with you, co-working with you and building with you. Obviously you can ask it informational questions as well, and I think that’s great. And so it really just depends, I think, on what you’re looking to do in that way. But those are ways that I’ve thought about how to use the products.
Hassan Bhatti: So your team is distinct from the core Gemini model team and the Google DeepMind research team. What is some of the collaboration between these three organizations?
Robby Stein: So we have a deep and embedded partnership with Google DeepMind. And so I spend a lot of time with Google DeepMind leadership. Our teams are co-working on many projects together. And so one way to think about it is obviously foundational model capabilities that are invented and that are key capabilities.
It could be things like thinking, it could be things like agentic capabilities with Project Mariner that was announced, with browser and tool use, for example. And so then what happens is we think about how we can customize and build on top of these capabilities for the informational needs that I mentioned. So for example, the fact that the models can do query fan-out and do multiple queries and research with Google systems, understanding Google’s knowledge bases.
That’s a big thing that we have to spend a lot of time thinking about. You have areas that we talked about as things that are hopefully coming soon over the summer, around agentic capabilities, where you can use search built on top of something like Project Mariner to tap into, let’s say, booking tickets, which is a very cumbersome process.
You can do that all within the Google experience and with Google’s informational services there. So we work very closely together in making these experiences possible. It’s a deeply integrated partnership, and that’s how we are able to make so much progress.
Hassan Bhatti: So in essence, it’s the interlinking of all the different facets of Google, which are very powerful on their own. But for the end user, you’re trying to give them: hey, you have a question in mind, you need an end outcome, which could be a search answer, a transaction, a shopping experience, whatever, and you can now get that because of this collaboration within Google.
Robby Stein: Exactly. That’s exactly right.
Zain Khan: So, Robby, you have also worked on some amazing consumer products that millions used, and still use, at Instagram, and now you’re driving a major evolution in search. Two pretty iconic things to work on. What are some core product principles that guide you when you are introducing something as significant as AI into Google search, or when you were working on Instagram? Any things for new PMs or just product people in general?
Robby Stein: Yeah, I mean, I definitely start as much as I can from first principles of what people are trying to do with the products they’re using today. I think many people try to approach the problem as starting with a prior of what exists today and then building on top of that prior.
But I think certainly during, you know, transformational moments, it’s obvious that there are new ways of doing things that can just unlock even more value for people if you were to change things out. So you have to really think: how is the product you’re building going to be exceptionally good at solving the problem that people are trying to solve with it, and not be too wed to the existing world. And so that’s one way you do it.
But then the second thing you have to think about is that you need to use well-worn paths. So you can’t just change something that people are using all the time. You have to honor these paths and you have to lean into people’s natural expectations around design and around products.
And so it’s easier to add something new and say there’s a new thing you can do here than to change something radically, where you wake up one day and all of a sudden your product experience is completely different than it was, after using the same product for like 20 years. Like, that’s not going to work well, you know.
And it’s also not going to change people’s expectations of what you can do differently with that product, because it’s so ingrained. And so I think that when you really think about it through those two ways, that’s how you’re able to build, you know, more transformative things. And those are the things that have guided me a lot, you know, over the years.
Hassan Bhatti: There are some interesting things here. Obviously, Google’s product is used by billions, and search is the fundamental entry point to the internet for a lot of folks as well. As you were thinking about adding AI Overviews and AI Mode, were there some decisions you had to make? Are you able to share some examples around them?
Robby Stein: Yeah. I mean, it’s actually very similar to this. I think there were questions as to whether, you know, AI Mode needed the space to breathe in this way, or whether we could build it directly into AI Overviews. We actually had an experiment in AI Overviews a while ago that let you ask follow-up questions in there, and it didn’t work for a bunch of different reasons.
But I think it’s in the similar vein of wanting new behaviors and really wanting to best solve the problem. Like, you’re never going to make it natural and easy for people to have a back and forth unless there’s a natural way to go do that. And so we decided to have this separate space with this linkage, where you can follow up with AI Overviews and it takes you into AI Mode, but it brings you to this new space that gives the full screen real estate, or app real estate on the phone, to have a generative AI back and forth. And you have the bar at the bottom, where you expect there to be a conversation.
And it’s just this natural design paradigm that people expect out of a generative AI product now. Versus if it’s at the top: it’s a small decision, but if the search bar is at the top, the top isn’t a place people expect to have a conversation and go back and forth. So it’s like a weird mental model.
So going back to, you know, really understanding people’s mental models for products, that they think very spatially, and not changing the way things work: if you expect something to work a certain way, there’s an expectation that that continues to be the case going forward. You have to really honor that. And so that was one kind of decision that we made along those lines, and there were many others.
Zain Khan: We obviously have like a media company we do podcast, but we build products as well. Hassan, you know, has built and exited two companies. We’re continuing to build more products now and running experiments.
So one thing we always go back and forth on whenever we’re designing an MVP is: should it start out with the sort of chatbot experience that’s dominating so many products right now? Like, you’ve probably seen all these memes on Twitter about how everyone’s trying to build the exact same thing, and so on.
I always lean on the side of: this is what people expect when they tap into an AI product now, so we should just lean into the learned behavior. But then Hassan always challenges me, validly a lot of times, and says, you know, maybe we need to disrupt and come up with a new design pattern.
But to me, sometimes when you see a completely new design pattern, you’re not sure how to behave around it. Like, you said something interesting: when the search bar or the conversation bar is at the top, you don’t use it for conversation. A light bulb kind of went off in my head, and I was like, oh, interesting, because we had done that in one of our MVPs, and it totally makes sense that it didn’t work out. So sorry, just to rephrase the question: do you think we should lean into the dominant search patterns or user experience patterns, or do you think we should really try to do something new here?
Robby Stein: You know, when someone asks a question like this, I think a lot about the book The Design of Everyday Things. Great book, an amazing classic design book, and there’s a whole chapter in there about the design of doors, and how when you see a handle, you kind of know how to use a handle, right?
But people still design doors where you don’t know whether you should push or pull, because there are fancy ones that have symmetry in a way where you don’t quite remember which way it’s supposed to go. And so it might look cooler and be distinct, but you bump into it and it doesn’t work particularly well.
And so I think about that all the time, which is: if there’s a door handle that you can give a user that they can just press, you should probably give it to them, if that’s the best way for them to achieve the thing you’re trying to do. And then, from the first-principles point I made originally, I think you should be challenging it, but you should assume that any change to a prior has a very large cost, and so the rate of unlock of what new things you’re able to do should be sufficient to overcome that cost.
And so if you want to go with fancy door handles, you can do it, but is it really worth messing with someone’s door handle just so that it looks slightly cooler if they can’t use the door, right? But sometimes when you challenge these paradigms, you’re like, wow, it feels so refreshing to change this.
I can do 5X the amount of queries or something with this move. And so sure it’s annoying, but once you’re in it, are you getting this huge leap in value or are you just kind of like doing something different to do something different?
Like, all the time I’ve had funny conversations with designers where we’ve tried to design new little icons for things. And there’s a thing in the PM world where labels always beat icons, because everyone wants to design a really cool, funny, cute icon that looks fun, like a little person eating ramen that’s supposed to mean quick takeout or whatever. And it’s like, no one knows what these things mean.
And so you should just use icons that people actually use, like a gear for settings. Everyone knows it’s settings. And if you use an atom with spinning electrons, it’s like, I guess it could be settings, but why, right? There’s no real marginal value to a user, and so you’re probably just going to have fewer people understand what it is.
Zain Khan: That’s a great way to think about it: what’s the outcome? Is the outcome super massive value creation? If not, then, you know, it might just be a cool thing, but not really the impact you’re hoping to have. I think that’s a great point.
Hassan Bhatti: So Robby, one question I’ve always loved asking our guests is: who are the two or three people who have had a great impact on your career and really helped you build this amazing arc? I’d be very interested to hear a bit about that.
Robby Stein: I think obviously a big one is Kevin Systrom who’s the founder of Instagram and a very close partner and friend and we spent a lot of time building products together for a really long time. We also built the startup together over the last few years before I was at Google and I think it’s had probably the most profound impact on how I think about product.
And I think a lot of it comes back to these first principles: what problem are you actually solving? Really understanding the system analytically, focusing on the problems where, if we solve this, it’s going to matter, and asking how we know this is the most important thing to solve.
So many times we just build things that are never going to matter, either because they don’t matter to users or because they’re implemented so far down the stack that they’ll never be used by a large percentage of people. It’s just like, why? Why are we building it? And then obviously on design, it’s this simplicity-first kind of mindset of doing the thing that’s most helpful to the user, and lots of good things will follow. And I think those have become kind of foundational for me in how I think about products.
Zain Khan: What are some changes you’re seeing in either design philosophy or product development philosophy now that LLMs are coming into the process? Some startups I talk to now say they’ll whip something up with something like Vercel v0, take it straight to a web coding tool, and ship it that way. Obviously Google probably has more robust systems in place for these things, but are you internally seeing LLMs impact the way you’re approaching products, or how your teams are approaching product now?
Robby Stein: Yeah, absolutely. It’s on every facet. It’s on being an idea thought partner, and really thinking of it as a co-worker. There’s another book you can read about this, I think it’s Ethan Mollick’s book, but basically the more you think of AI as you would a person, like what kind of thing would you ask a co-worker, the more value you can get out of an AI, even just for ideas.
Then there’s testing. With testing, you can basically prototype a product by yourself now, which is kind of unbelievable, because as a product person the biggest barrier I’ve always felt is the speed at which you could produce an MVP and get feedback on it. Now you can do it in like 10 minutes. You could go to a mall and ask 40 people to look at the thing and tell you: what would you use this for? How would you explain this to a friend? And if the answer is “I have no idea,” you’re like, all right, put that in the garbage.
Whereas that whole process would have taken a month, or months, to build a prototype that worked on a phone that you could hand to someone. Especially if you’re writing client code, it’s a real pain. So that’s pretty unbelievable. And then in terms of understanding your results, iterating, and finding market fit, it’s also pretty incredible.
At our startup, I’ve uploaded millions of rows of log data to LLMs and asked: can you plot J curves, look at retention analysis, and understand these different cohorts if I give you these attributes? Each column does this; this is what the data represents. Now analyze it. It’s almost like using it as a data scientist to help improve the product.
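The kind of cohort retention analysis he describes delegating to an LLM can be sketched directly in a few lines of pandas. This is a minimal illustration on a synthetic event log, not his actual workflow; the column names and data are invented for the example.

```python
# Hypothetical sketch of a weekly cohort retention table from a raw event log.
import pandas as pd

def cohort_retention(events: pd.DataFrame) -> pd.DataFrame:
    """Return a cohort x week-offset table: fraction of each cohort still active."""
    events = events.copy()
    # Week index relative to the start of the log.
    events["week"] = (events["event_date"] - events["event_date"].min()).dt.days // 7
    # Each user's cohort is the week of their first event.
    events["cohort"] = events.groupby("user_id")["week"].transform("min")
    events["offset"] = events["week"] - events["cohort"]
    # Count distinct active users per (cohort, weeks-since-start).
    active = (events.groupby(["cohort", "offset"])["user_id"]
                    .nunique().unstack(fill_value=0))
    return active.div(active[0], axis=0)  # normalize by cohort size

# Toy log: users "a" and "b" start in week 0; only "a" returns in week 1.
log = pd.DataFrame({
    "user_id": ["a", "a", "b"],
    "event_date": pd.to_datetime(["2024-01-01", "2024-01-08", "2024-01-02"]),
})
table = cohort_retention(log)
print(table)  # cohort 0: 100% active in week 0, 50% retained in week 1
```

Handing the LLM a schema description (“each column does this”) plus the raw rows, as he describes, amounts to asking it to derive exactly this sort of transformation.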
So it’s fairly transformative the whole way through, I think. But you have to know how to apply it. I don’t think LLMs naturally do the right thing every time, although they’re starting to do much more of that.
Hassan Bhatti: Let’s move a little bit towards advice for people getting into product, whether they’re founders, designers, or PMs. Do you feel like the fundamentals of building strong product are still the same and that’s where they should focus, or do you feel like with AI and LLMs and all these new things coming into the mix, people entering the market need to change their mindset and focus on new things?
Robby Stein: I think the fundamentals of product are exactly the same. It’s about really understanding people, being curious, and being a student of why things work and don’t work. You’re really studying causation and trying to understand: why is it that someone didn’t use this thing? Okay, it was probably more this.
And how do I know if I’m right? Well, I asked 25 people and every single one said this was the first thing, or I surveyed people and it was always the first thing they mentioned. That’s probably a good thing to go work on, because it’s the reason they didn’t use your product, versus a random idea that just seems cool. LLMs can help you think through those kinds of things, but they’re not just going to do it for you. I think that’s really the value-add of product leadership.
And then in terms of building and designing, there are similarities there too. Overall, I think we’ll start to get really great products produced by AI, for sure. They’ll be really well designed and I think they will feel really good to people. So that could potentially be an area where people get more leverage over time. But the real deep root is: what do we build?
Why do we build it? The decision-making element is really what product people and product leaders need to spend most of their time understanding. You have new tools to amplify your decision-making ability, but I don’t think that’s the kind of thing you can outsource, or will be able to for quite a long time.
Zain Khan: That’s a great answer. I want to ask a few questions about what’s happening on the internet more broadly and how Google and AI fit into it. SEO is obviously a big part of search. I’m not on the SEO side of things, so I talked to a few people who work in SEO, whether on the publisher side or the agency side, just to see what their take on this was.
There’s obviously huge uncertainty, but there’s also huge excitement. A lot of people say they’re afraid traffic is going to go down; a lot of people say, no, LLM optimization is going to be the huge next thing. So it’s sort of a pros-and-cons situation.
One of the most frequent questions, or points of concern, I heard as I talked to people is this: now that AI Overviews can just surface an answer for you, or you can talk to something like Gemini and it surfaces the answer for you, a lot of these websites might get less traffic or fewer clicks. How do you think about the incentives of the publishers who provide the content that makes a search engine valuable, versus actually being useful to users and giving them more control and power with these AI tools?
Robby Stein: Yeah. So first of all, for Google Search, connecting to the web is absolutely critical and a core design principle of absolutely everything we do. That’s essential. And we think that’s because people actually want this. We see that people want to go and see where information came from.
They don’t want to take any one source’s word for it. They want to compare and contrast different perspectives and understand things at that level of depth. The thing that’s changing is the expectation among users that they get some more context first, that they understand a little bit more before they dive in.
You’re seeing that as a change in user behavior and user expectation. But we still see, in AI Overviews and in all of our AI experiences, that people have this desire to go deeper, to click in, to engage. And when they do, they’re actually even better customers of those web pages and publishers, because they arrive with a little bit of context. They’re not just bouncing around trying to understand.
Then obviously, we hope this is a very expansionary moment. Hopefully AI is this incredible discovery engine as it unlocks all these needs for people over time. Google will continue to send billions of clicks a day, and that’s not changing. But this pronounced sense of growth is the opportunity here, and how we think about it. Hopefully a lot of the work that’s been done to build valuable content continues to be useful in this context.
These AI systems use the same signals. So if you ask a question, say I take a picture of my bookshelf, Lens will segment each book, do visual understanding, and fan out queries. It will then find pages as if a person understood all of that and then did a bunch of Google searching. So web pages are going to start showing up for those downstream questions in the thinking and reasoning process.
And the websites that rank the best, because people have done great work to build great content that people have valued, are likely going to show up in this AI experience too, as further reading: great lists of sci-fi books from people who have taste, and I’m going to want to see what that is, you know? That’s the vision of how this comes together.
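The “fan out queries” pattern he describes can be sketched as a simple pipeline: segment an input, turn each segment into a text query, run one search per query, and collect the results. Everything below (function names, stand-in data) is a hypothetical illustration of the pattern, not Google’s implementation.

```python
# Conceptual sketch of query fan-out: each visual segment becomes its own
# downstream search, as if a person had looked at the image and then searched.
from typing import Callable

def fan_out(segments: list[str],
            describe: Callable[[str], str],
            search: Callable[[str], list[str]]) -> dict[str, list[str]]:
    """Issue one search per described segment; collect results keyed by query."""
    results = {}
    for seg in segments:
        query = describe(seg)            # visual understanding -> text query
        results[query] = search(query)   # ordinary ranked search per query
    return results

# Toy stand-ins: a bookshelf photo segmented into two book spines.
segments = ["spine_1", "spine_2"]
describe = {"spine_1": "Dune paperback", "spine_2": "Neuromancer"}.__getitem__
search = lambda q: [f"top result for {q}"]
out = fan_out(segments, describe, search)
```

The key point from the interview survives even in this toy form: the per-query `search` step is the same ranking machinery as ordinary search, so well-ranked pages surface inside the AI experience too.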
Hassan Bhatti: Yeah, that makes a lot of sense. A follow-up question on that as well: a lot of folks are now talking about LLM search optimization as the next step from SEO. Over the years, Google has put out some awesome resources on search optimization for publishers, agencies, and other folks as well.
People are asking new kinds of questions, and more in-depth questions. Is there any particular advice you’d have for people who are creating content on the internet, whether that’s a blog, a website, or anything else, on how to create content that surfaces in LLMs? Or do you feel it’s too early to have a strong opinion on that?
Robby Stein: Well, the good news, similar to what I said before, is that Google’s systems are being used very directly in all of our AI products. So for instance, spam and trust issues that we detect: we don’t want to feed extremely spammy or manipulative websites, anything trying to take advantage of people, into AI responses.
Those systems are used. On the other side of the equation, the most helpful websites are going to be used, hopefully a lot, and those links and opportunities to click and provide context are going to be very strong there. So what’s neat is, I think, the focus on building really helpful content.
We talked about being students of helpfulness. That’s what Google’s systems have been optimized for, for a really long time. So if you’re a blogger and you write about finance, and someone’s interested in stock performance, you’d think about who are the kinds of people who would want to read my article. Is it clear? Is it grounded and cited? Is it specific? Does it answer the question of the user you think would be coming in asking this?
These are the kinds of things the systems are designed to do, and hopefully that creates lots of goodness in this AI discovery world too. But obviously it’s still really early, and I think everyone’s learning how to best contribute to AI discovery. Over time we hope there’s more and more feedback and information given to make that helpful for everyone building.
Zain Khan: From the user experience side of things on search: people use the word “AI” a lot now just to denote the kind of low-value or useless AI content that’s increasing on the internet. It looks like the amount of AI-generated content on the internet might exceed the amount of human-generated content in the near future. Have you given any thought to how Google, or any search engine, is going to handle and differentiate what to give priority to?
Robby Stein: We think about this a lot. There was actually a change recently to make it a lot harder for really low-value content, content probably designed more to siphon traffic and manipulate, to have that type of impact on the system. So there’s a lot of attention on understanding that really well and being careful there, obviously.
But on the positive side, it’s not clear to me that AI content can’t contribute a lot, actually. I think the principles are really similar: does this thing answer your question, or is it a bunch of general gibberish that just has a bunch of keywords in it so that it shows up when you search for something? Versus: a very powerful model can produce a fascinating document, and potentially soon web pages and whole rich experiences, that people may really enjoy and value a ton.
This is what I was talking about before with first-principles thinking. If there’s a format and a thing that’s incredibly useful to people, and it’s accurate and original in some respect and valued, that’s great, and that kind of content could be really helpful for someone. So I don’t think we think of it as AI versus not AI. We think about it more as helpful versus not helpful.
Hassan Bhatti: Yeah. How are you using AI products yourself on a day-to-day basis, whether for work or at home?
Robby Stein: I’ve got to say, like 97% of all my uses are on Google AI products right now. I’m living and breathing it. It’s what I’m here to do, it’s what I love doing, and it’s giving me the most value. So that’s where I’m at. But, like we talked about with this widget, what’s interesting is I’ve started to ask Google virtually any question on my mind when I have informational needs.
For example, I went to plan a daddy-daughter beach day with my daughter. Originally I was starting to think about it as just searching something like “beaches SF.” But instead I put a massive question into AI Mode, directly from that entry point. I talked to it, because the question was kind of long: hey, I want to take my daughter somewhere, I think I want it to be a beach, but she’s really young, like three.
So I really hope it’s not a lot of walking everywhere. And it recommended this really fun beach day in SF, with links and access to all the beach information: accessibility, closing hours, where parking would work. It was awesome. So I’ve started to do that. I also use it for shopping.
We’ve added a bunch of rich shopping experiences. So I took a picture of my shoes and said: can you find me more shoes like these, and then the ones that are good deals, and can you provide links to buy them? I really like these specific shoes, but I want these kinds of color combinations.
And then the last way, more in a pro-tools vein, is using it as a technical advisor: really putting code or data in and asking it to give me real-time feedback on something. But that’s more of a co-worker use case. So those are a few examples.
Hassan Bhatti: So, Robby, one feature I love on Search is the personalisation, and I love turning it on in my search and my Gemini app. How do you see that evolving over the next 6 to 12 months? I think it’s so powerful already, but I just want to get your thoughts on that.
Robby Stein: Yeah, there’s already a fair amount of personalisation happening today. You probably see recent searches, or if you typically go to Target, say, to buy certain things, and you search for something, you’ll see a “recent” kind of label there.
That’s so we can more efficiently get you to things you’re typically excited about on Google. But with AI, there’s this opportunity to really deeply understand you, to save you a ton of time and tailor things to be uniquely helpful for you specifically.
It’s usually in these classes of areas that feel like matters of taste: less about facts, more like local recommendations, for example, or if you’re trying to buy something, trying to purchase a car, or working through a big decision in your life.
Your personal context will really control for the answer, right? And what you want to do. So the way we think about it is, people have a tremendous amount of history with Google, particularly Google Search. And as we announced at I/O, we’re making really large investments in personalised experiences here, not just with Search but with other Google experiences like Gmail.
Imagine being able to have this really powerful context where, if I have access to recent receipts of things I’ve purchased or trips I’ve gone on, it’s going to be uniquely helpful when I go on my next trip. If you were asking someone to plan a trip for you, there’s so much context you’d want to give them, but a lot of it is actually already available relatively readily, and it can really feel like, oh my god, this feels like magic. I think that’s very exciting, one of the future themes for us.
Hassan Bhatti: What can users expect in the next six months? What are the things you’re able to share that people would see making their Google Search experience even better?
Robby Stein: Yeah, I think one of the things we’re really excited about is personalisation: having personal context be part of everything you search for, particularly where AI can be uniquely helpful for you.
The other one is around these live modalities where you can just talk to Google, making it really easy to get information on the go, or just naturally: you’re on your couch, you kind of want to have a conversation, you’re in a different experience. And obviously deep search and kind of powerful search is also very top of mind.
I was actually recently trying to buy a safe, because of some documents: I got a new birth certificate for my daughter and thought, I really should look into this. It’s actually really complicated. There are fire ratings and burglary ratings and implications for your insurance. It’s crazy.
So I used an internal version of it, which we’ve announced we’re rolling out over the summer. It’s incredibly powerful. It blew me away. It gave me the framework for everything, with links to read all the reviews of the various options. So that’s going to be really exciting.
And then I think inspiration, really expanding AI’s ability to help you with tasks that are less informational and more inspirational, is tremendously interesting. We talked about how shopping could work, for example, where you say: hey, I have two kids and I need a rug that’s pretty good, but I like gray, kind of monochromatic vibes.
Then you get an actual visual presentation and you can go back and forth with it. I think that’s a really interesting thing. And given Lens and Google’s visual understanding and image search, there are a lot of really cool, innovative things we can do with that.
Hassan Bhatti: That’s awesome. I’m super excited about the inspiration piece. Robby, I really appreciate you taking the time. This was an amazing chat and I’m sure our audience will enjoy it. Thank you so much.
Robby Stein: Likewise, thank you.