
Mark Zuckerberg: Facebook hiring 3,000 to stop 'heartbreaking' violent videos

'If we're going to build a safe community, we need to respond quickly'

Aatif Sulleyman
Wednesday 03 May 2017 15:43
Facebook's algorithms aren't always effective

Mark Zuckerberg has announced a range of new measures designed to help Facebook remove unacceptable content more quickly.

The social network has come under heavy fire over recent weeks, after videos of two murders were uploaded to the site and watched by hundreds of thousands of users before eventually being taken down.

One of those videos, which showed an 11-month-old child being killed, was on the site for around 24 hours before moderators acted.

The other, which showed the murder of Robert Godwin Snr., was not reported by a user until more than an hour and a half after it was uploaded.

Mr Zuckerberg has revealed that Facebook will attempt to tackle the issue by adding 3,000 people to its community operations team over the next 12 months.

The team is currently made up of 4,500 members of staff, who have the mammoth task of moderating the billions of posts that go up every day.

“Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later,” Mr Zuckerberg wrote in a Facebook update. “It's heartbreaking, and I've been reflecting on how we can do better for our community.

“If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down.”

Facebook’s moderators are assisted by algorithms, which are designed to cut through the noise by filtering out innocent updates, allowing staff to focus on a much more manageable sample of data.

Unfortunately, this system isn’t always effective.

“In addition to investing in more people, we're also building better tools to keep our community safe,” added Mr Zuckerberg. “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.

“This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate.

“No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.”
