
Google accidentally reveals names of rape victims despite law

The company says it will remove any similar examples in the future

Andrew Griffin
Wednesday 23 May 2018 09:55 BST
A man checks Google devices outside the tech giant's booth at the Mobile World Congress in Barcelona, Spain (Reuters)

Google has been inadvertently revealing the names of rape victims, which are supposed to remain secret by law.

Searching for details of certain cases can bring up autocomplete predictions that reveal confidential information about them. The identities of victims in rape cases are supposed to remain confidential for life, even if the accused is found not guilty.

A number of examples of searches that revealed the names of accusers were seen by The Times. Google has admitted that the illegal information was shown, but said it would work to stop such predictions appearing in the future.

Google's autocomplete feature pops up whenever a user starts typing and attempts to guess what the person is looking for, so they can get there more quickly. It does so algorithmically, drawing on common and trending searches as well as the user's location and previous searches.
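As an illustration of the general principle only, the sketch below shows how a simple prefix-based suggester might rank logged queries by popularity; the data, function names and ranking rule are hypothetical and do not represent Google's actual system.

```python
from collections import Counter

def suggest(prefix: str, query_log: Counter, limit: int = 5) -> list[str]:
    """Return the most frequently logged queries that start with the given prefix."""
    prefix = prefix.lower().strip()
    matches = [(query, count) for query, count in query_log.items()
               if query.startswith(prefix)]
    # Rank candidates by how often each query has been seen, most common first.
    matches.sort(key=lambda item: item[1], reverse=True)
    return [query for query, _ in matches[:limit]]

if __name__ == "__main__":
    # Hypothetical query log; counts stand in for search popularity.
    log = Counter({"weather today": 120, "weather tomorrow": 80, "west end shows": 40})
    print(suggest("we", log))  # -> ['weather today', 'weather tomorrow', 'west end shows']
```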

But because that information is pulled in from across the internet, the feature can show irrelevant and sometimes shocking results. In these cases, the names appear to have been pulled in from social media discussions in which people were illegally naming accusers.

Google has systems in place that are intended to catch inappropriate predictions and avoid showing them to users. But the huge number of searches going through the site – 15 per cent of which are new – makes catching all of them difficult.
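As a purely hypothetical illustration of that kind of safeguard, the sketch below filters candidate predictions against a denylist of blocked terms before they would be shown; the terms and policy here are invented for the example, not drawn from Google's rules.

```python
# Hypothetical denylist of terms that must never appear in a prediction.
BLOCKED_TERMS = {"blocked name", "another blocked phrase"}

def filter_predictions(predictions: list[str]) -> list[str]:
    """Drop any prediction containing a blocked term before it is shown to users."""
    safe = []
    for prediction in predictions:
        if any(term in prediction.lower() for term in BLOCKED_TERMS):
            continue  # suppress predictions that break the policy
        safe.append(prediction)
    return safe

print(filter_predictions(["weather today", "case involving blocked name"]))
# -> ['weather today']
```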

Google said that the autocomplete predictions highlighted by The Times were against its rules. It has removed the examples it has been alerted to, and said it would remove any similar ones in the future.

"We don't allow these kinds of autocomplete predictions or related searches that violate laws or our own policies and we have removed the examples we’ve been made aware of in this case," a Google spokesperson said. "We recently expanded our removals policy to cover predictions which disparage victims of violence and atrocities, and we encourage people to send us feedback about any sensitive or bad predictions."

It is far from the first time that Google's autocomplete feature has brought trouble to the company. Because it is based on searches and other data from the internet, it can occasionally show highly offensive predictions even for apparently innocent searches.

Google's rules for what shows on search results pages – as opposed to the autocomplete box that pops up to help users get to them – are enforced differently, and it is not thought that any of the problem searches were displayed when users actually clicked through to the results.
