Mark Zuckerberg has announced a range of new measures designed to help Facebook remove unacceptable content more quickly.
The social network has come under heavy fire in recent weeks, after videos of two murders were uploaded to the site and watched by hundreds of thousands of users before eventually being taken down.
One of those videos, which showed an 11-month-old child being killed, was on the site for around 24 hours before moderators acted.
The other, which showed the murder of Robert Godwin Snr, was not reported by a user until more than an hour and a half after it was uploaded.
Mr Zuckerberg has revealed that Facebook will attempt to tackle the issue by adding 3,000 people to its community operations team over the next 12 months.
The team is currently made up of 4,500 members of staff, who have the mammoth task of moderating the billions of posts that go up every day.
“Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later,” Mr Zuckerberg wrote in a Facebook update. “It's heartbreaking, and I've been reflecting on how we can do better for our community.
“If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down.”
Facebook’s moderators are assisted by algorithms designed to cut through the noise by filtering out innocent updates, allowing staff to focus on a much more manageable sample of content. Unfortunately, this approach isn’t always effective.
“In addition to investing in more people, we're also building better tools to keep our community safe,” added Mr Zuckerberg. “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.
“This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate.
“No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.”