The technology giant admitted this week that "snippets" of recordings are analysed by language experts, claiming it helps improve its artificial intelligence voice recognition systems.
Google said in a blog post defending the practice that its language analysts review only around 0.2 per cent of audio recordings.
However, it also revealed some of these recordings had been leaked by a worker in the Netherlands.
This led many people across social media to question what information smart speakers like Google Home and Amazon Echo actually collect.
Which recordings does Google listen to?
Google explained in its blog that Google Assistant only sends audio to Google after a device detects the “wake” words that indicate a person is interacting with the Assistant – for example, by saying “Hey Google” or by physically triggering the Google Assistant.
It also admitted the voice assistant is sometimes triggered by accident and any recordings made as a result could also be sent for analysis.
This means private conversations and intimate moments could be captured by the smart speaker and then sent to people employed by Google without the user even realising.
This is also the same for accidental interactions with Amazon’s voice assistant Alexa, which shares snippets of conversations with language analysts to improve “customer experience”.
Could your private conversations be leaked?
Both Amazon and Google say the recordings are anonymised before being sent to the analysts, though this anonymisation applies only to identifying information such as the device's location data, not to the sound of the voice itself.
There is no evidence any Amazon Echo recordings have ever been leaked, and Google’s revelations were the first time such an incident had been publicly acknowledged.
Google said it was investigating the situation, though it is not yet clear what changes – if any – will be made.
“Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action,” Google wrote in the blog.
“We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”
Can you stop smart speakers from listening to private conversations?
Smart speakers need an always-on microphone in order to function, as they constantly listen out for their "wake" word.
This does not mean they are always recording, though an always-on microphone is a tempting target for hackers.
Both Amazon and Google say they employ the highest security standards to prevent third parties from listening in, though as Google's recent revelations prove, this does not prevent employees from listening to and sharing recordings.
“We hold ourselves to high standards of privacy and security in product development, and hold our partners to these same standards,” Google said.
“We also provide you with tools to manage and control the data stored in your account. You can turn off storing audio data to your Google account completely, or choose to auto-delete data after every 3 months or 18 months.”