Facebook is stepping up its efforts to keep inappropriate and violent material – including recent high-profile videos of murders and suicides, hate speech and extremist propaganda – off its site.
The world’s biggest social network said it plans to hire 3,000 more people to review videos and other posts after being criticised for not responding quickly enough to murders shown on its service.
The hiring drive over the next year will be on top of the 4,500 people Facebook already tasks with identifying criminal and other questionable material for removal.
Chief executive Mark Zuckerberg said the company was “working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down”.
Facebook, which had 18,770 employees at the end of March, would not say whether the new hires would be contractors or full-time workers. David Fischer, head of its advertising business, said the detection and removal of hate speech and content that promotes violence or terrorism is an “ongoing priority” for the company, and that the community operations teams are a “continued investment”.
Videos and posts that glorify violence are against Facebook’s rules, but it has drawn criticism for responding slowly to such items, including video of a slaying in Cleveland and the live-streamed killing of a baby in Thailand. The Thailand video was up for 24 hours before it was removed.
Facebook also reported stronger-than-expected quarterly results. The company earned $3.06 billion (£2.4bn) in the three months to March – up 76 per cent from the $1.74bn it reported a year earlier and ahead of analysts’ expectations. Revenues also beat Wall Street forecasts, rising 49 per cent to $8.03bn.
The social network had 1.94 billion monthly active users as of the end of March, up 17 per cent from a year earlier. Daily active users averaged 1.28 billion over the month.