Lee Rigby report: What’s the use of spotting terrorist activity if you don’t tell anyone?

Companies like Facebook don't seem to think it's their job to report any signs of terrorist activity


If you thought you could stop a terrorist attack, would you not do everything possible to try? Anyone who travels on public transport – particularly in London – is reminded all the time of the need to be vigilant. If you see something suspicious, like a rucksack left on a train seat, you should report it to the authorities. Because every train and bus cannot have a police officer on board, it is up to us, responsible citizens of our modern society, to stay alert in the interests of protecting ourselves and the wider public.

When the Intelligence and Security Committee (ISC) published its report into the Woolwich murder of Fusilier Lee Rigby, the attached summary began as many Establishment reports do: in careful, nothing-to-see-here language, it concluded there was nothing the intelligence agencies could have done to prevent the killing by Michael Adebolajo and Michael Adebowale.

But this report was different in that it then went on to castigate an unnamed American company, later revealed to be Facebook, for not reporting to the police, MI5 or MI6 that a number of Adebowale’s accounts on the network had been closed because of concerns about terror. Sir Malcolm Rifkind, the ISC chairman, said this company was the “one party which could have made a difference” in preventing the atrocity.

These conclusions have outraged civil liberties groups for putting “spin” on the Woolwich murder for the sake of deflecting attention from MI5 and MI6, as well as appearing to lay the groundwork for the Government’s new anti-terror legislation, which, by no coincidence, is being launched this week to close loopholes in internet law. I agree: the timing of the Woolwich Report and Theresa May’s announcement of a new anti-terror bill is very convenient for the Government.

But so what if it is? I am a passionate believer in liberty and free speech, but not to the extent that extremists and terrorists are allowed to use social networks as “safe havens”, as the ISC said. I want soldiers like Lee Rigby to walk the streets of Britain without fear of attack. I want the security agencies to protect citizens. That is not incompatible with a desire for free speech.

It was Facebook’s automated system, not human employees, that spotted Adebowale’s extremism. But what is the point of having an algorithm that spots terrible things if nothing more is done than closing a user’s account? True, an exchange between Adebowale and another extremist that went into graphic detail about his desire to kill a soldier was not spotted by Facebook (a failing of that automated system) until after the murder in May 2013, yet, as the ISC’s report makes clear, the firm could have done more in general to act against extremists on its books.

Not only that, Facebook and other communication service providers do not feel under any obligation to report this sort of activity. The ISC report says none of the US companies proactively monitors suspicious content, instead relying on fellow users to report it – which in the case of extremists trying to evade capture is presumably useless. Because they are based in the US, these firms don’t feel the need to comply with any UK warrants. The report says: “Therefore, even if MI5 had sought information - under a warrant - before the attack, the company might not have responded. They appear to accept no responsibility for the services they provide.”

Legal and intelligence experts say getting international agreement on anti-terror law, applied to the internet, will be extremely difficult. Meanwhile Facebook is only too happy to operate in its own world according to its own rules. It can conduct secret social psychology experiments on users by fiddling with our timelines without our knowledge. It can bamboozle us with 14,000 words of legal terms and conditions, knowing that the vast majority of us won’t read them, which it also knows means that it effectively owns the rights to our pictures. It can take down a joyous photograph of a mother breastfeeding her severely premature baby just because it shows a centimetre of nipple.

In response to the ISC report, the Silicon Valley giant emphasised that it does not allow terrorist content and “take steps to prevent” people using its site for that purpose. But clearly not enough. So maybe it is time for us to forget the law and just make a simple, human appeal to the company to act as a responsible member of society, like a passenger travelling on a rush-hour train. So I ask Facebook: if you could stop a terrorist attack, what would you do?