Facebook: Financial powerhouse needs to find answer to fake news question

Even a messy court case couldn’t knock the gloss off its latest results, but it did help to highlight the company’s big issue

James Moore
Thursday 02 February 2017 14:57 GMT
Security keys make the two-factor authentication process a little slicker (Reuters)

Apparently Facebook’s results were “overshadowed” by the loss of a court case that saw the company ordered to pay $500m (£394m) to a video game developer over the technology used to launch a virtual reality headset.

That had all the effect of a little fluffy cloud on the sun’s glare in Death Valley as far as Wall Street was concerned.

Small wonder, as Facebook again proved what a financial powerhouse it is, more than doubling net profit to $3.7bn in its fourth quarter alone on revenues of $8.6bn. It has close to 2bn users, and it has yet to tap into China (where the authorities are somewhat wary).

A court case, even one with a $500m price tag (an appeal is pending), looks like a minor issue compared to that.

But the case, which concerned allegations that Oculus, the firm Facebook purchased, used code obtained from rival gaming company ZeniMax to develop its headset, does lead into one of the company’s biggest issues if you take the finances out of the picture.

I don’t mean the claim and counterclaim over the technology and how it was used. I mean the man who founded Oculus, a tubby near-billionaire Trumpkin from Southern California. Palmer Luckey caused no end of embarrassment to Facebook when it emerged that he had been active in the dark side of the movement backing the President’s barnstorming run for office (is there a light side?) through funding pro-Trump trolls.

The controversy was over Luckey’s activities on Reddit, not Facebook, but memes, internet abuse and fake news are big issues for Facebook too, and it has proved sensitive to criticism on the subject. As it should be.

Last year it got rid of human editors, amid questionable accusations that the stories on its trending feature were biased against conservatives. Facebook denied the accusations.

Then an algorithm promoting stories based on what users were talking about was found, in short order, to have highlighted allegations that former Fox News icon Megyn Kelly was a pro-Clinton traitor who had been fired (she hadn’t been, although she has since left for pastures new).

Now, we are told, the company is looking at how artificial intelligence might help root out dodgy stories promulgated by dodgy people.

The company is very keen on the tech, and is a member of the Partnership on AI, along with some of Silicon Valley’s biggest firms (and also the American Civil Liberties Union).

The partnership’s stated goals are to advance understanding, support best practice and to create an open platform for discussion and engagement.

It all sounds wonderful, and very necessary because AI can have a dark side, and I’m not talking about Skynet and its Terminator robots here.

It was less than a year ago that Twitter users managed to corrupt Microsoft’s chatbot Tay in less than a day. Tay, a virtual teenage girl that used the sort of language popular with that age group, became a full-blown Nazi with an Oedipal complex in the wake of its interactions with them.

Of course, things have moved on since then, at a rapid rate, and we don’t know how good the next-generation Siris, Cortanas and Alexas now in development have become.

Still, is it so hard to imagine a Facebook fake news policebot either missing the fake stuff and deleting real news, or even disseminating fact-free stories?

Here’s my idea. It might take a little investment, but if Facebook has $2bn to blow on the likes of Oculus, surely it can stand to spend a little on getting some facts to where they are badly needed.

To do that it should go back to the future by rehiring a team of journalists and fact checkers.

Set up an editorial board with a clear mission – something like the BBC uses for its journalists would do nicely – and allow them to operate at arm’s length from corporate HQ, putting up trending topics, rooting out fake news, and making the platform a reliable source, which it needs to become given the number of people who use it as their primary source of news.

To oversee their work, set up an independent ombudsman service (funded by Facebook but not influenced by it) to deal with complaints, and tell people who moan to approach it. If they don’t, or don’t like its rulings, tell them (politely) to go to hell.

Facebook is going to incur complaints regardless of what it does. It’s too big and too powerful not to (the same is true of the BBC, by the way).

Founder Mark Zuckerberg keeps saying he wants to connect people and make a positive contribution to society by doing so. This would show that he’s serious. It’s an old-economy solution. But sometimes those work best.
