Google CEO: Our ultimate moonshot is still Search
For Google (GOOG, GOOGL) and Alphabet CEO Sundar Pichai, the "ultimate moonshot" is still Search, the company's iconic and enormously successful search engine.
"[For] me, our ultimate moonshot is still Search. I know people would be surprised to hear that 20 years in. Search works very well, but because I'm working on Search, I see all the limitations. Even today, when people type in a complex query, we're looking at keywords trying to match it. We still have a long way to go to actually understand what the user's intent is, the context, where they are coming from, and giving the best answer. So that is still the moonshot," Pichai told Yahoo Finance in a wide-ranging exclusive interview.
Pichai joined Google in 2004 and quickly became a rising star. Ten years later, he was appointed to lead product and engineering for Google's products and platforms, including Search, Maps, Play, Android, Chrome, Gmail, and G Suite. He became CEO of Google in 2015 and joined the board of directors of Alphabet in 2017. He took over as CEO of Google's parent company, Alphabet, in December 2019.
Even with all of the demands that come with leading a $1.5 trillion market-cap company, the 48-year-old CEO is still very much involved in the product development process. This week, the Google I/O developer conference returned after a yearlong hiatus caused by the pandemic with a slew of new product announcements and features. The announcements included a look at the upcoming Android 12 operating system, three-dimensional video conferencing dubbed "Project Starline," artificial intelligence to spot dermatological conditions, and LaMDA — a language model for dialogue applications — to name a few.
Pichai told Yahoo Finance that technology like LaMDA "is the work we are doing to make progress against the moonshot" in Search.
While the technology is still in research and development, it's being used internally at Google for novel interactions. It's open-domain and designed to converse on any topic. For example, at Google I/O, LaMDA was demonstrated through conversations in which the model spoke as the dwarf planet Pluto and as a paper airplane.
"I had a chance to play around with Pluto with my son. And I could have sat down and had a conversation with him about space. That could have worked. But doing it that way, I can see it captured his imagination, right? And that's the power of all of this, being able to do it in a more natural way," Pichai said.
At Google I/O, Pichai noted that via LaMDA, "actual conversations are generated, and they never take the same path twice. LaMDA is able to carry a conversation, no matter what we talked about."
To be sure, LaMDA "doesn't get everything right" and sometimes it gives "nonsensical responses," Pichai noted at I/O.
Still, Pichai sees LaMDA's conversational capabilities as having the "potential to make information and computing radically more accessible and easier to use," including by adding better conversational features to Google Assistant, Search, and Workspace.
While LaMDA is a "huge step forward" in more natural conversation, as of right now, it's only trained on text. Pichai told the I/O crowd the team needs to build multimodal models — such as Google's Multitask Unified Model (MUM) — to allow people to ask questions across different types of information.
"Even now, sitting in this conversation, we are taking in everything. It's not just what we are speaking — we're in a place, we are seeing each other. That's how humans interact with the world," Pichai said.
He added that multimodal models could help "computers understand images, text, audio, video the same way humans perceive information so that they can help them in more meaningful ways. And that's what excites me about the changes we are making."
Julia La Roche is a correspondent for Yahoo Finance. Follow her on Twitter.