Google's "terrifying" new artificial intelligence feature has prompted concerns about the takeover of the robots and the abuse of AI.
Many commentators have suggested that "Duplex" is not only strange but entirely unethical, and that it could signal an important moment in the acceptance and use of artificial intelligence.
This week, Google announced a whole range of new AI features. But it was just one of them that took almost all of the attention: Duplex.
That feature allows the assistant to call people on its owner's behalf, booking them appointments or checking on the availability of restaurants. And it does so while sounding like a real human and without identifying itself, meaning that anyone speaking to it will probably think it is a real human.
The outcry was immediate, and vociferous. Many said that such a feature was immoral if it did not make clear it was a robot, and that features that tricked people into thinking they are real could cause problems in the future.
Many commentators pointed out that, as such, the feature had been explicitly constructed to be deceptive, and to trick the people who speak to it. Google said that the phone calls it played during the demonstration were "real" and spoke positively about how the feature was able to work its way around any problems that arose "gracefully".
Now, Google has said that the bot will identify itself as such. In Google's initial demo during its I/O conference, the voice sounded entirely natural and obscured the fact it was powered by AI – referring to its owner only as its client.
"We understand and value the discussion around Google Duplex – as we've said from the beginning, transparency in the technology is important," Google said in a statement. "We are designing this feature with disclosure built-in, and we'll make sure the system is appropriately identified. What we showed at I/O was an early technology demo, and we look forward to incorporating feedback as we develop this into a product."
Google's statement came after, and apparently in response to, the outcry about the feature. It has received criticism from a range of ethicists and commentators, who argue that more thought needs to be given to the ethical questions around the tool.
As well as the deception, Google appeared to imply that it might be possible to synthesise any person's voice into a personal assistant. During the conference, for instance, it showed how it had been able to take choice snippets of John Legend speaking and turn them into an entire catalogue of his voice – something that could presumably be done with the Duplex voice, too.
The feature is not yet available in any consumer product. But Google said it had been working on it for years and suggested it could be enabled soon.