
Facebook’s Training Documents: ‘We Protect White Men’

Mark Zuckerberg. Photo: Justin Sullivan/Getty Images

How does Facebook decide whether a post reported for abuse gets taken down, or whether an account gets suspended? The moderation guidelines, which must cover every eventuality for the 2-billion-user social network, are among the company’s most controversial and secretive policies, and the complex flow chart of factors is rarely glimpsed by regular Facebook users. A new report from ProPublica details this convoluted process, which Facebook’s team of 7,500 content moderators must learn inside and out. One hugely ill-conceived slide used for training asks, “Which of the below subsets do we protect?” Female drivers, black children, or white men? The answer is … white men, illustrated, naturally, by a picture of the Backstreet Boys.

Previous reports have shown that Facebook approaches moderation a bit like an engineering problem: There are “protected classes,” like race, and “non-protected classes,” like age, and a formula to determine how the categorization scheme applies to a given post.

Of course, Facebook’s determination of protected and unprotected classes is fraught with political and ethical problems. The protected categories on Facebook are sex, race, religious affiliation, ethnicity, national origin, sexual orientation, gender identity, and serious disability or disease. The unprotected categories include social class, occupation, continental origin, political ideology, appearance, religion, age, and country. If a post targets all members of a protected class (say, women), it’s penalized. If it targets a subset, that is, an unprotected category layered on top of a protected one, like women drivers, it’s not.
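To make the subset rule concrete, here is a minimal sketch in Python of how that logic could be expressed. Facebook’s actual system is not public, so the names, data structures, and function below are illustrative assumptions based only on ProPublica’s description of the rule.

```python
# Hypothetical sketch of the subset rule described in ProPublica's report.
# Facebook's real moderation tooling is not public; everything here is illustrative.

PROTECTED_CATEGORIES = {
    "sex", "race", "religious affiliation", "ethnicity",
    "national origin", "sexual orientation", "gender identity",
    "serious disability or disease",
}

# Listed for reference; anything outside PROTECTED_CATEGORIES is treated as unprotected.
UNPROTECTED_CATEGORIES = {
    "social class", "occupation", "continental origin",
    "political ideology", "appearance", "religion", "age", "country",
}

def is_protected_group(attribute_categories):
    """Return True if the targeted group is protected under the subset rule.

    `attribute_categories` lists the category of each attribute used to
    describe the targeted group, e.g. ["race", "sex"] for "white men" or
    ["race", "age"] for "black children". The group counts as protected
    only if every attribute falls in a protected category; mixing in any
    unprotected attribute creates an unprotected subset.
    """
    return all(cat in PROTECTED_CATEGORIES for cat in attribute_categories)

print(is_protected_group(["race", "sex"]))        # "white men" -> True (protected)
print(is_protected_group(["race", "age"]))        # "black children" -> False (subset)
print(is_protected_group(["sex", "occupation"]))  # "women drivers" -> False (subset)
```

Under this reading, an attack aimed at “white men” combines two protected categories and is penalized, while the same attack aimed at “black children” or “women drivers” mixes in an unprotected category (age, occupation) and is not.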

Facebook’s slide makes perfect sense according to its rules: “Black children” is a specific subset, whereas the category of “white men” is not. But it’s a viscerally unpleasant reminder that arguments that make sense when pitched as algorithmic rules for decision-making can become ridiculous when put into practice. This mode of thinking also sheds light on why religious affiliation is protected but religion is not (consider the difference between criticizing Jews and criticizing Judaism); still, the scant difference between the two leaves moderators with plenty of gray areas. As Dave Willner, a former Facebook content moderator who helped build the rule book, told ProPublica, the system is “more utilitarian than we are used to in our justice system … It’s fundamentally not rights-oriented.”
