Facebook to add 3,000 video moderators following spate of killing videos

The announcement comes after two high-profile killings were broadcast on the social network.

CREDIT: AP Photo/Noah Berger
Facebook CEO Mark Zuckerberg announced Wednesday that he is adding 3,000 staff over the next year to moderate videos on the platform for violent and graphic content.

“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner,” Zuckerberg announced on his Facebook page. “Over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly.”

The announcement comes after two high-profile killings were broadcast on the social network. Steve Stephens, 37, of Cleveland, Ohio, uploaded a video of himself fatally shooting 74-year-old Robert Godwin Sr. on Easter Sunday. The video remained on Facebook for three hours. On April 25 in Thailand, Wuttisan Wongtalay, 20, used Facebook Live to broadcast himself killing his 11-month-old daughter before hanging himself. The video remained online for 20 hours.

Zuckerberg briefly touched on the Cleveland murder during the company’s F8 conference in April, saying, “We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening.”

Until Wednesday’s post, however, the number of moderators Facebook employed was unknown. The company refused to divulge those numbers during a UK Parliament investigation into Facebook’s and other social media companies’ slow response to taking down extremist and violent content.

Zuckerberg also said in his post that the new moderators would help Facebook better remove content involving hate speech and child exploitation.

“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” he wrote. “As these become available, they should help make our community safer.”