MOUNTAIN VIEW, California — Technology has given some people a voice, but now Google is ready to let its artificial intelligence do more of the talking.
Google Duplex, a new artificial-intelligence technology announced on Tuesday at Google I/O, the company’s annual developer conference, can now call on behalf of a human to schedule an appointment or restaurant reservation. And yes, it’s all done while conversing with a real person on the other end of the line.
Google CEO Sundar Pichai played real-life calls using the feature during his presentation. He acknowledged that the tech isn’t perfect, but it will still be rolled out in the coming weeks.
“The technology is directed towards completing specific tasks, such as scheduling certain types of appointments,” Yaniv Leviathan, principal engineer, and Yossi Matias, vice president for engineering, said in a blog post. “For such tasks, the system makes the conversational experience as natural as possible, allowing people to speak normally, like they would to another person, without having to adapt to a machine.”
The AI sounds natural, even adding human vocal tics like “um.” It can also ask follow-up questions as it works with the human on the other end of the line to schedule an appointment.
“We are working hard to get this right,” Pichai said.
In fact, the AI voice sounds so convincing that humans might not even realize they're speaking with a machine. The system relies on Google’s real-time supervised training to improve its conversational skills and processing.
While it’s intended to be useful, the technology could also raise privacy red flags.
“We want to be clear about the intent of the call so businesses understand the context,” the blog post said. “We’ll be experimenting with the right approach over the coming months.”
The human-involved training relies on “experienced operators” to monitor the system when it makes a call in a new domain. Those operators “can affect the behavior of the system in real time as needed.”
“This continues until the system performs at the desired quality level, at which point the supervision stops and the system can make calls autonomously,” Leviathan and Matias explained.
The announcement did not elaborate on how Google is considering baking more transparency into the process.
Following the recent redesign of Gmail, Google is also rolling out smart replies in Gmail that will predict what users want to say when drafting an email, all before they even type it.
It’s not just “Have a great weekend!” The artificially intelligent replies can also suggest complete sentences for emails as specific as planning a “Taco Tuesday” with friends.
The new feature is being released in the coming weeks to users who have the new Gmail, which can be enabled by clicking on the settings gear at the top of the inbox and selecting “Try the new Gmail.”
Aside from outsourcing mundane tasks to AI, Google also took a more methodical approach to helping people have healthier relationships with their devices. It’s a phenomenon Pichai called “the joy of missing out.”
It’s not enough to be “wide eyed” about technology, Pichai said.
“The path ahead needs to be navigated carefully and deliberately, and we feel a sense of responsibility to get it right,” he said.
Google is planning to roll out features to help foster a healthier relationship with tech, including a digest of notifications, “take a break” reminders and a tally of how much time a person spends on their phone.
The idea is to help people create “balance” so they can spend more time with family, Pichai said. Those who already treat their Google Home speaker as another member of the family may also want to take note of another announcement.
Google Assistant now has a “pretty please” mode, which requires people to use manners when speaking to it and offers positive reinforcement when someone says “please.”