Google said it would seek to block child sex abuse images after suggestions that it was failing in its ‘moral duty’ / Getty

Google and Bing thwart some computer users, but others seek images in countries that are more lax

There has been a “precipitous drop” in the number of online searches for images of child sex abuse after internet firms took steps to block them, according to new research.

Google and Microsoft announced in 2013 that they would seek to block such images, after David Cameron condemned service providers for failing in their “moral duty” to protect children.

The new study, by Chad Steel, a professor at George Mason University in the US, published in the journal Child Abuse & Neglect, found that the new controls had a “rapid and significant impact” on the number of searches for explicit pictures of children, which fell by nearly 70 per cent. Professor Steel warned, however, that people were still actively using unfiltered search engines hosted in countries with laxer regulatory environments, such as Russia. He said the internet companies’ actions demonstrated that “technical controls aimed at prevention can be effective”.

“The blocking efforts by Microsoft and Google… had a rapid and significant impact on child sexual exploitation material [CSEM] searches,” he added. “The results show a precipitous drop in such searches starting in July 2013. CSEM query volume fell by 67 per cent.”

The study, the first to look at the global impact of the moves and the use of mobile devices by paedophiles, said that despite legal defence arguments that individuals “stumble” across CSEM, there was little evidence of people accidentally finding such material.

In November 2013, Microsoft and Google announced that they were removing child abuse images from their indices, filtering search results and showing warnings when specific searches were used.

To quantify the impact, the researchers looked at search traffic levels on Google and Microsoft’s Bing between January 2011 and August 2014 for key words widely used by people looking for CSEM.

They compared the data to searches on another provider, Yandex, which did not implement similar controls and which showed no drop in CSEM searches.

“One possibility for use of the Yandex search engine by those seeking CSEM is the fact that it does not filter results,” the paper said. “Additionally, Yandex is a Russian company, and possession of CSEM is not illegal in Russia – only distribution and production are illegal, and only of children younger than 14.”

The report also found that more than 34 per cent of all web-based queries for CSEM on Bing were conducted using smartphones and other mobile devices.

Jon Brown, head of sexual abuse programmes at the children’s charity the National Society for the Prevention of Cruelty to Children (NSPCC), described the report’s findings as “encouraging”. He said someone now searching for CSEM would instead find a warning that they were potentially looking for something illegal and abusive. However, he questioned the suggestion that people did not stumble on such material.

“People can drift through legal adult porn sites, searching for teens. Some are then going through to other sites and looking at child abuse images. These websites are all hosted outside the European Union and are very difficult to control,” he added.