As videos and livestreams of people hurting themselves or others on Facebook become a regular part of the news cycle, the company has been facing mounting pressure to, well, do something about it. Today, Mark Zuckerberg announced that Facebook would “be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly.” Those reviewers will remove videos featuring hate speech, child exploitation, and, you know, murder and suicide. They’ll also coordinate with law enforcement if need be.
A few things jump out about this announcement. One is the language with which Zuckerberg describes his new gore squad: “[A]dding 3,000 people to our community operations team” — i.e., not hiring for Facebook itself — likely implies that the company is contracting a third party to review user content, rather than hiring in-house. This is not new or shocking — pretty much every major platform does this — but it does mean that Facebook is not responsible for administering or caring for thousands of people tasked with viewing truly heinous shit in often terrible working environments. Facebook declined to specify to Slate’s Will Oremus whether the workers would be employees, and thus entitled to benefits, or contractors. Read into that lack of specificity what you will.
The announcement also runs counter to Facebook’s overarching agenda this year. At its annual F8 conference less than a month ago, the company spent much of its keynote talking about artificial intelligence, machine learning, and deep neural networks — buzzwords that translate to “computer smart.” Now, Facebook is bringing in 6,000 eyeballs to answer tough questions that computers can’t yet address, like, “Is this footage of a person getting shot?” Of course, the F8 keynote kicked off with Zuckerberg expressing remorse over Facebook Live’s role in the murder of 74-year-old Robert Godwin Sr. in Cleveland, so the problem is clearly on the CEO’s mind.
One question left unaddressed by Zuckerberg: Why did it take so long? Just three months after Facebook began heavily pushing Live in April 2016 — Zuckerberg called the format “emotional and raw and visceral” in an interview with BuzzFeed — video of Philando Castile’s death at the hands of police was everywhere, thanks to the platform. Throughout 2016, Facebook continuously stumbled over itself in pushing live video, paying celebrities and media companies to manufacture spontaneity. The end goal for Live was to get users to share more, post more, and spend more time on the site, letting Facebook learn more about each user and serve them more ads. It never takes very long for a product sitting in front of a billion and a half people to be used to spread graphic violence, and Facebook, in seeking to make broadcasting easy and spontaneous, enabled this behavior.
Adding moderators — if we make the generous assumption that they’ll be properly compensated and protected — will help address the problem. But Facebook should have seen it coming from the start.