This article wouldn’t get past a public filter, Mr Cameron

In 20 years of internet use, I’ve never once stumbled on images or videos of child abuse

Rhodri Marsden
Tuesday 23 July 2013 03:07 BST

There’s something about the fearsome technological and social complexity of the internet that makes it easy to pick holes in regulatory measures proposed by politicians, even if there’s unanimous agreement about their aims. David Cameron wants child abuse images to be eradicated and for children to be shielded from all forms of pornography. Barely anyone would disagree with this. But the conflation of these two very separate issues in Monday’s announcement was unhelpful, and in some ways seemed to betray and perpetuate a fundamental misunderstanding of the way the internet works.

A month ago, Google announced – to much less fanfare than greeted Mr Cameron’s statement – that it was pumping cash into further developing technology that automatically flags potential images of abuse. This is a fight that’s ongoing, and has been for many years; police around the world already use image technology to track down producers and consumers of online child abuse images.

But the vast majority of this disturbing activity takes place behind heavily secured networks that you’d never accidentally stumble across when using a search engine. In 20 years of indescribably heavy internet use, I’ve never once been confronted with images or videos of child abuse – although admittedly, unlike some writers working for other newspapers, I’ve never actively set out to seek them in order to write a story about their supposed prevalence.

You can create a blacklist of certain Google search terms as long as your arm, or longer, but the real battle against child abuse images is an incredibly complex technological one. On Monday Paul Jones, father of murdered schoolgirl April Jones, asked: “Why can’t they take this stuff off the internet?” It’s a question to which we’d all love to give him an answer, but it’s hard to do so because of the difficulty of defining “they”, and the problems associated with locating the “stuff”.

And then there’s legal pornography. Many would agree with the assertion that all pornography is bad, but the vast majority of it isn’t illegal, and to say that there’s a lot of it is a colossal understatement. Thousands of hours of new material, both home- and studio-made, are uploaded to the internet every day, and while filters are becoming more savvy, they’ll always be imperfect.

It’s likely that this article would be blocked by a filtered public Wi-Fi point because of the language I’ve used; that’s no big deal, but just as easily as non-pornographic comment pieces can be blocked, pornographic images can get through.

Again, you can introduce as many opt-in schemes for pornography as you want, but an equally important measure is educating parents so that they, in turn, can educate their children about the realities and dangers of content online.
