Facebook's battle with the BBC: How it should have handled scandal over pictures of children

The network compounded a controversy with the way it reacted to the news organisation's investigation into image moderation that uncovered troubling evidence 


The latest Facebook scandal is starting to make the fake news furore look like a storm in a teacup. 

In case you’ve missed the story, it concerns a BBC investigation into the network’s moderation of images shared on the site. 

The news outlet’s contention was that this moderation is not up to scratch, and it uncovered some troubling evidence that that is indeed the case.

The investigation found posts that included images of child abuse. That in itself is deeply disturbing. 

What happened next, if the BBC’s account is reliable, only deepens the controversy. 

The Beeb says Facebook asked to see the pictures it had discovered. When it complied with the request, Facebook reported the corporation to the police, and cancelled its interviews. 

Facebook claimed its action was standard industry practice when it is supplied with images of that type. 

Now it is possible that this lumpen response simply represents the triumph of bureaucratic business process over common sense. Big businesses have a nasty habit of getting egg on their faces as a result of that. Alternatively it might be down to a cock-up, a risible one, but a cock-up all the same. In either case, the network should own up to it. 

But what if it is something else? What if this was an attempt to gag its critic? If that is even close to being true, then Facebook has some serious questions to answer. 

As Damian Collins, chair of the House of Commons media committee, noted, the BBC was actually doing the network a favour. Its investigation could help keep Facebook's network clean and safe. 

The company might not love the bad publicity, but a business like Facebook needs to learn how to deal with that. See it, own it, act on it, sort out the mess and move on. 

Unfortunately, we’ve been here before when it comes to content posted on the network. Pictures of breastfeeding mothers? Get thee gone! Beheading videos? It’s free speech, innit. People should be able to make their own minds up. 

And don’t even get me started on that fake news. 

Facebook’s policy director for the UK, Simon Milner, has told The Independent that the content is no longer on the platform, that the company takes the matter very seriously and that it will improve its reporting and takedown measures. 

That’s the very least it should do. If I were in charge, I would have said something more like this: “We want to thank the BBC for highlighting a serious issue that we were previously unaware of. 

“We’re launching an urgent investigation as to how this material was allowed to appear on our site, and will be upgrading our moderation of images as a matter of urgency.

"We are determined that our site should not be used for the sharing of abhorrent images such as those found by the BBC and we will make every effort to ensure that this is not repeated.”

See? It isn’t that hard. Unfortunately, despite all the clever people Facebook employs, this is not the first time it has tripped up. It needs to get a grip, and quickly.