If you were under any illusion that the Facebook Cambridge Analytica scandal had wounded the hubris of Silicon Valley, Google I/O 2018 proved otherwise.
On stage during the opening keynote, CEO Sundar Pichai showed off a new application of Google’s AI, called Google Duplex, which can make phone calls from a Google Home or phone on your behalf. No, it’s not going to send someone your pre-recorded message – it’s going to actually have a conversation with someone else for you.
Two different demos showed Assistant calling small businesses to book appointments on behalf of the user, the first for a hair salon appointment and the second to book a table at a restaurant. Google’s Assistant not only managed to follow basic conversational maxims, but even used “umms” and pauses to emulate human speech patterns. It was impressive and creepy. Amazing and questionable.
In fact, it raised more questions than it answered – and, to be fair, Google knows it. Google Home VP Rishi Chandra told The Ambient that Google still doesn't know how some of this will work in practice. There are not just ethical but legal considerations here, and Google will have to navigate them.
What’s actually happening here?
Duplex brings together various AI technologies that Google's been working on for years, culminating in its most human application yet. The aim of Duplex, says Google, is to automate calls to businesses and book appointments on your behalf, so you don't have to pick up the phone. In the first demo we saw the Google Assistant call a hair salon to book an appointment, the receptionist apparently unaware that she was speaking to an algorithm.
This isn't something you'll be using to call your friends and family, but for offloading the task of making bookings. Pichai said that 60% of small businesses in the US don't have an online booking system – and Google sees an opportunity.
Duplex is powered by Google’s WaveNet natural speech generator and has been trained on what Google calls "closed domains". Essentially, Duplex doesn't know how to talk to your parents, but it knows how to speak naturally when booking a haircut or ordering a plumber.
Duplex also has a self-monitoring capability, which means that should it get into a more complex exchange it can't handle, it can tag in a human operator to finish the job. Whether the Assistant will still be supervised by the time Duplex goes public remains to be seen, but Google is keeping a human eye on it during training.
Is this… legal? Is it ethical?
The legality depends on how Duplex actually shakes out. Certainly, in some US states it's illegal to record a phone conversation without the other party's consent. It's unclear what this could mean for Duplex: even if it isn't recording the entire conversation, would information taken and stored by Google count as a "recording"?
More concerning are the numerous potential ethical issues. Is it OK to call someone and not let them know they're speaking to a robot? Google told CNET that Duplex will have some sort of disclosure built in, so the person on the other end of the line will know at the start of the conversation that they're speaking to a robot. One thing's clear: a lot of this is still a work in progress.
There’s also the obvious question of whether businesses will be happy dealing with a robot. We can’t see any scenario where this technology will be perfect from day one; we can only imagine how frustrating a scrimmage with a confused Assistant could be for someone whose day is busy enough without having to pander to an AI. Furthermore, even if the person knows they’re talking to a robot, who’s to say they’ll want to, no matter how human-like it sounds?
But there are positives to remember too. For people who are elderly or disabled, Duplex is an entirely new assistive technology that could be incredibly helpful.
When will Duplex roll out?
Surprisingly, Google says beta testing will start this summer for some tasks, including making reservations and checking business opening hours. However, a full launch of this technology feels a long way off. There's a lot to work through here, and there's still so much we don't know. What if it reaches an automated responder and suddenly it's a robot face-off? What about the potential for abuses of this technology? There's a lot to think about.
"We're still developing this technology and we actually want to work hard to get this right, get the user experience and the expectation right for both businesses and users," said Pichai at I/O. "But done correctly, it will save time for people and generate a lot of value for businesses."