Facebook to hire 3,000 more workers to monitor content amid surge of violent videos
The new hires will join the 4,500 people Facebook already has reviewing content.
Amid a spate of live broadcasts of grisly incidents, Facebook CEO Mark Zuckerberg announced today that he will hire 3,000 additional employees over the next year to monitor content and remove violent videos.
"Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," Zuckerberg said in a Facebook post. "If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down."
The new hires will join Facebook's community operations team, working alongside the 4,500 employees who already review posts. They will review the "millions of reports" Facebook receives each week about posts that may violate its terms of service. The move will hopefully "improve the process for doing it quickly," Zuckerberg said.
"These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation. And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it -- either because they're about to harm themselves, or because they're in danger from someone else," he said in the Facebook post.
In addition to hiring more reviewers, Zuckerberg said Facebook will also be "building better tools to keep our community safe."
"We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help," he said. "No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need."
Facebook has been criticized recently for not doing enough to prevent videos of violent incidents, including a murder in Cleveland and the killing of a baby in Thailand, from spreading on the social network. The company does not allow videos and posts that glorify violence, but such content is often reviewed, and potentially removed, only after users report it.
ABC News' Daniel Linden contributed to this report.