Facebook’s black box is slowly being opened. Last week, after speaking with a number of former Facebook news curators, Gizmodo revealed how the “Trending Topics” sidebar is generated: Its stories are selected and summarized by human editors choosing from a group of topics culled by an algorithm. Today, in a follow-up report, new details were revealed — most juicily, that “workers prevented stories about … conservative topics from appearing” in the sidebar.
Gizmodo’s reporting on the subject is enormously valuable to anyone interested in Facebook’s increasing power over news media. But it also raises the question: What do we want out of Facebook? And what does Facebook itself want to be?
Take, for example, the claim that Facebook workers “routinely suppressed conservative news stories.” “Depending on who was on shift, things would be blacklisted or trending,” Gizmodo’s source — a conservative former curator — tells reporter Michael Nuñez.
“I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”
The former curator was so troubled by the omissions that they kept a running log of them at the time, and provided those notes to Gizmodo. Among the deep-sixed or suppressed topics on the list: former IRS official Lois Lerner, who was accused by Republicans of inappropriately scrutinizing conservative groups; Wisconsin Gov. Scott Walker; popular conservative news aggregator the Drudge Report; Chris Kyle, the former Navy SEAL who was murdered in 2013; and former Fox News contributor Steven Crowder.
This is telling: The claim isn’t quite that conservative news was “suppressed” — as Nuñez acknowledges, “there is no evidence that Facebook management mandated or was even aware of any political bias at work” — but that editors in charge of the sidebar weren’t picking out the conservative trending topics. (And, given that list of overlooked topics, which range from IRS conspiracy theories, to an unreliable news aggregator, to a brutally unfunny conservative comedian, can you blame them?)
As an exercise, imagine reading the quotation above and learning it had been said about the New York Times or the Washington Post. Weird, right? Because it wouldn’t, really, be news: When you buy or read the Times or the Post, you expect to be consuming the end result of a series of editorial decisions about the topics to cover, and the angles and prominence of the coverage. (In fact, those editorial decisions are the reason to buy the paper!) If the Times declines to cover Steven Crowder, it’s not because it’s “suppressing” conservative news. It’s because its editors have made the judgment that Crowder, and whatever thing he did, are not newsworthy.
So Facebook, as Gizmodo writes, “operates like a traditional newsroom,” and ultimately how you feel about that rests less on your politics than on your understanding of what Facebook is, and what it’s for. If you understand Facebook as a publication, it’s hardly surprising to learn that decisions are being made about what gets coverage (and how).
But Facebook has implicitly and explicitly presented itself not as a publication but as a distribution mechanism — an unbiased and neutral platform whose sidebar lists nothing more complicated than “topics that have recently become popular on Facebook.” As with its Messenger service, it obfuscates the humans making decisions on the other side of the screen.
More than the politics at play, it’s the lack of transparency that’s troubling. Facebook has some 1.5 billion users around the globe; the algorithms that determine the sorting of its News Feed can create and destroy entire companies; and it wasn’t until Gizmodo’s story last week that anyone outside Facebook had a clear sense of how it determined its Trending Topics. (Last August, for example, Recode reported that “what Facebook adds to the trending section” is chosen “automatically by the algorithm.”)
None of which is to argue that Facebook shouldn’t have human curators sorting through its topics. There will never be a perfect algorithm to identify and summarize popular stories. That Facebook is using human editorial judgment to determine which topics are worth highlighting and which are not is almost cheering — an indication that the company has some sense of the enormous power it now holds. Facebook is the world’s biggest publication, after all. It’s just not willing to admit it.