
Google exec explains how Google Assistant just got smarter

David Pogue


At this week’s keynote for its big developers conference, Google (GOOG, GOOGL) unveiled features and products coming down the pike this year, both for developers and the masses.

A highlight: Google Assistant is growing up. (For the uninitiated, Assistant is Google’s version of Siri.)

Google’s vice president for Assistant, Scott Huffman, sat down with me to discuss the announcements.

First things first: What is Assistant? An app? A product? A feature?


“The Google Assistant is not an app or a device. What you really want from an Assistant is not just a thing that’s in one place. You want something that you can have a conversation with and get things done wherever you are, whatever context you’re in,” he says.

That could include your phone, your car, or your Google Home.

“I leave home,” Huffman says by way of example. “I say to my Google Home, ‘How late’s Home Depot open? Well, give me the directions.’ It should say, ‘Sure, they’re on your phone.’ As you walk out the door, the Assistant on your phone picks up the conversation.”

Assistant is built into every Android phone (long-press the Home button to bring it up)—but starting this week, it’s also available on the iPhone, as the Google Assistant app.


Either way, you can now type those questions and commands to Assistant instead of speaking them, if you prefer—something you can’t do with, say, Siri. Handy when it’d be inappropriate to talk aloud.


Then there’s Google Lens.

“Google’s been making deep investments in vision and machine perception,” Huffman says, “and so we’re building that into the Assistant. So now, I can just open the viewfinder inside the assistant and say, hey, what about this? And the assistant starts to give me options.”

For example, you can point the camera at a flower, a building, a painting, a book cover, a restaurant storefront. The Assistant recognizes what you’re looking at, and instantly gives you information: identification of the flower, ratings for the restaurant, and so on.


And not just details, but actions to choose. “One of the examples we showed is pointing the camera at a marquee of a show where it says, this band at this time. And then you get options like, you want to hear that band’s music? Do you want to buy tickets? Do you want to add it to your calendar? Do you want to share it with your friends?”

So just how smart can Assistant get? Huffman knows where he wants it to go.

“I can tell you how I say it to my team,” he says. “I say, ‘Hey guys, we’re just building this really simple thing. All it has to be is that anyone can have a conversation with it anywhere, anytime, with no friction. We should understand that conversation, whatever it’s about. And then just do whatever they ask us to do. Let’s just build that.’”

Sounds good. Get to it, team!


More from David Pogue:

Inside the World’s Greatest Scavenger Hunt: Part 1 • Part 2 • Part 3 • Part 4 • Part 5

The David Pogue Review: Windows 10 Creators Update

Now I get it: Bitcoin

David Pogue tested 47 pill-reminder apps to find the best one

David Pogue’s search for the world’s best air-travel app

The little-known iPhone feature that lets blind people see with their fingers

David Pogue, tech columnist for Yahoo Finance, welcomes nontoxic comments in the comments section below. On the web, he’s davidpogue.com. On Twitter, he’s @pogue. On email, he’s [email protected]. You can read all his articles here, or you can sign up to get his columns by email.
