Creating a porn filter is one thing, creating a transparent and functional one quite another

The Government has entrusted the safeguarding of children to a bunch of non-UK, unregulated, commercial interests

And so the truth slips out. Attempts to provide online protection for children on the cheap have delivered exactly as warned: less than protection, and a deeply illiberal attack on minorities. Nothing so crass as out-and-out blocks on LGBT sites or sites providing support for the vulnerable and abused – but software that offers parents that option.

It could all have been – could still be – so much better. The real question is whether government wants it to be. Or will it just make excuses, blame the messenger, and reveal this as nothing more than a cheap publicity stunt?

A key test of the Government's integrity is how it responds to initiatives for regulating the regulators. Three US and Chinese companies provide the bulk of domestic blocking: they are unelected, secretive, commercial enterprises. Ensuring they uphold British values is possible, and would have the merit of rescuing a scheme that looks to be struggling under the weight of its own contradictions – most pressingly, the assumption that the best way to impose acceptable standards is to keep them secret.

For censorship is nothing new, though the UK approach was always hands-off: government set direction; implementation was overseen – transparently – by the regulated industry.

In 1909, Government required the licensing of cinemas. The reasoning was primarily safety, though of the pragmatic, rather than moral, kind: cellulose-based film had a tendency to catch fire and explode. Still, this created risks for the nascent film industry – to wit, the danger that each and every local authority would go its own way, censoring capriciously and creating a nightmare of inconsistency.

So film-makers clubbed together and in 1912 set up the British Board of Film Censors as a means to rein in the oncoming juggernaut.

A similar problem arose in the 1950s, when a Conservative moral crusade persuaded police and local magistrates to take action against seaside postcards.  Inconsistently, of course: a postcard seized and burnt in one area might still be sold a few miles further down the coast.

On this occasion, the Director of Public Prosecutions attempted to bring order to this chaos of random censorship with his very own dirty postcard collection.  This was smut with purpose, designed to separate out cards considered obscene from the merely salacious: providing central guidance to stop the law from falling into disrepute.

Fast forward thirty years to the “video nasty”.  Different police, different magistrates, were seizing and prosecuting different videos: one force optimistically impounded light comedy The Best Little Whorehouse in Texas and provocatively named, but otherwise innocuous, war movie The Big Red One.

Once more, before legislation and the involvement of the BBFC, the DPP intervened to impose consistency and preserve the dignity of the law, creating his own list of “definitely nasty”.

As then, so now: a major issue with current filtering and blocking is consistency. Another: irrationality and discrimination. With companies vying to prove their software is best, marketing advantage is claimed by offering unique systems that cannot be directly compared. The only consistency seems to be the way they clamp down on any non-normative sexuality or identity, far exceeding their initial anti-porn brief.

Though what else to expect when government has entrusted the safeguarding of children to a bunch of non-UK, unregulated, competing, commercial interests? It does not take an Einstein to see how this could cause problems.

To save the approach from death by a thousand critics it has been suggested that blocking and filtering systems need to:

1. Use standard categories of blocking/filtering;

2. Subscribe to a central clearing station for complaints about blocks;

3. Agree minimum standards for over- and under-blocking;

4. Comply with existing equalities legislation.

This would need a licensing regime and a regulator with power to audit individual system performance. The internet service industry has questioned who would pay for this.

That, though, is detail. If government considers this objective so important, then perhaps it will come up with the funding; if the public are equally motivated, perhaps they would happily pay a small levy on their internet connections?

What cannot be is that government, industry and public continue as they have: demanding that “something be done”, yet ignoring the consequences, utterly unwilling to pay the price for turning that “something” into “something that works”.

The gauntlet is thrown: if UK.gov truly is serious about internet regulation, it's time it moderated the rhetoric and started to think seriously about the solution.

Jane Fae is working on a book on the regulation of online pornography, due to be published in early 2014
