
As it turns out, watching stuff that’s too horrible to be allowed on social media is not good for your mental health. This is not new news, but it is back in the headlines again following a report published by the Verge this morning, detailing conditions at Cognizant, a company Facebook contracts to help perform moderation tasks. At the level of the individual moderator, this generally means sitting at a computer all day looking at terrible stuff — violence, gore, hate speech, conspiracy theories — and deciding whether or not it violates Facebook’s community guidelines.
This would be a punishing job even under ideal conditions, and Cognizant’s moderators work under conditions far from ideal. At its Phoenix facility, they are paid only $28,000 per year (Facebook’s median salary, for perspective, is $240,000), and every minute is rationed: moderators get nine minutes of “wellness time” per day and are heavily monitored. Some have developed PTSD-like symptoms, and others have come to believe the conspiracy theories (the Earth is flat, the Holocaust didn’t happen) they are supposed to be adjudicating. The environment at a firm like Cognizant resembles the working conditions of an Amazon warehouse, where workers are pushed to their physical limits in the name of efficiency.
The situation is a bit of a conundrum. Calls for Facebook to take more aggressive action against people posting potentially harmful or disturbing content on the site — whether that’s straight-up gore or intentionally deceptive election hoaxes — have grown louder in recent years. And yet more stringent filtering means employing more human moderators to watch terrible stuff and bear the mental strain.
“Checkmate!” says the Facebook Defense Force, which mobilized after the Verge’s report came out. “The same FB critics who call on the company to take on responsibility for moderating content (an operational job they don’t want, and had to be pressed to perform), will of course be shocked, shocked at the human cost in reviewing billions of pieces of random content,” former Facebook employee Antonio García Martínez wrote on Twitter. Alex Stamos, the former head of security at Facebook, echoed the sentiment: “AGM has a point, however, about journalists having it both ways. Nice to not be responsible.”
I mean, yeah. It’s easy for me to say that Facebook’s governance of its own platform sucks, and that the solution they’ve implemented also sucks. Like, Facebook let a bunch of people puke all over a floor the size of a football field, and there’s only one janitor mopping it up, and the janitor only gets nine minutes a day to not breathe in the puke fumes. Also, the puke is arriving faster than the janitor can mop.
Still, according to García Martínez and Stamos, the fault lies with journalists and critics for pointing out that Facebook has a substantial puke problem. The pair have styled themselves as “gotta hear both sides” pundits fighting straw men who supposedly don’t understand this deeply nuanced issue. In reality, many predicted this terrible outcome. (Here’s a good Wired report from 2014.)
Back in May 2017, Facebook announced it was going to add 3,000 more human moderators to its legion. Here’s what Intelligencer wrote at the time:
A few things jump out about this announcement. One is the language with which Zuckerberg describes his new gore squad: “[A]dding 3,000 people to our community operations team” — i.e., not hiring for Facebook itself — likely implies that the company is contracting a third party to review user content, rather than hiring in-house. This is not new or shocking — pretty much every major platform does this — but it does mean that Facebook is not responsible for administrating or caring for thousands of people tasked with viewing truly heinous shit in often terrible working environments.
At the time, Facebook declined to comment when asked if the 3,000 additions would be employees or contractors — usually not a great sign. Was Facebook pressured to be more proactive in cleaning up its own site? Sure. But nobody was asking Facebook — a company that is still smashing revenue estimates — to do this by outsourcing the work to poorly compensated, third-party employees forced to operate at a breakneck pace. (Or rather, nobody who wasn’t a stakeholder focused on the profit margins was asking for this.)
The failure of Facebook to adequately compensate its moderators and ensure their wellbeing is not a failure of pundits to understand the consequences of pressuring Facebook to take action. It is a failure of ingenuity on Facebook’s part, because the only solution the platform understands is throwing more bodies at the problem. Facebook does this frequently. Facebook announces it is hiring more moderators. Facebook says it’s collecting more data to better train its computer-automated moderation. Facebook decides it is bringing in more fact checkers to combat misinformation. (“The issue here is there aren’t enough of them,” Zuckerberg said last week. “There just aren’t a lot of fact checkers.” There’s only so much meat in the meatspace.)
If you can’t keep throwing bodies at the problem, what can you do? I am a bit of a broken record on this issue, but the simple fact is that Facebook’s current scale and structure are untenable. The News Feed software — meant to intuit what you want rather than fulfill a user-submitted request — cannot bear the strain of all the content flowing through Facebook’s network, because a News Feed used and cross-pollinated by two billion people cannot abide by a single set of rules. It is a flattened environment where a link to a flat-Earth proselytizer’s hour-long “documentary” is treated with the same consideration and weight as a link to a mainstream media outlet. Moderating the News Feed has scaled into an endless, thankless task, because humans contain multitudes. Gross, disgusting multitudes.
Let’s return to the metaphor of Puke Stadium. The News Feed is the field of the aforementioned Puke Stadium, and everyone in the stands is a News Feed user. They’re all looking at the big pile of puke from different angles, but they’re all at least going to see some puke. The Jumbotron is the News Feed algorithm that decides which posts you see. Sometimes the Jumbotron will focus on a truly gross part of the puke and everyone will get mad and say, “Look at the gross thing on the Jumbotron! Take it off!” Some spectators will even head down to the guardrail and add some puke to the pile.
Now consider this: What if we divided the field of Puke Stadium into a grid, and people volunteered to take over sections of the grid and accept specific kinds of puke — hot-dog puke, beer puke, puke that says the Holocaust was a hoax? This sort of function already exists, in the form of Facebook Groups — a more sectioned, potentially more private and isolating system that allows users to set individual guidelines for their own cohort. The problem is that it’s still happening on the field, in the stadium — the Jumbotron is still plucking things from each section of the grid, out of context, and showing specific pukes to people who might not want to see said puke. Everything flows back through the News Feed.
Maybe I’ve lost you with my extended metaphor. The point is: maybe Puke Stadium shouldn’t exist. Facebook is a highly centralized system that can’t possibly satisfy everyone, and it operates at a scale where even a tiny portion of dissatisfied users amounts to tens of millions of people. Facebook’s response to the Verge’s report today repeatedly emphasizes the sheer scale of the issue. The only solution for Facebook is to scale down, or to restructure itself in a more balkanized way. It’s working on it, but the News Feed is a cash cow, and so long as the News Feed exists, it will have a substantial moderation problem. Unfortunately, it’s the lowest-level workers, like the contractors in Phoenix, who bear most of the strain.