4 Takeaways From Facebook’s Trending-Topics Controversy

This week, Facebook came under scrutiny for how it handles trending news, following a series of Gizmodo stories revealing that the “Trending Topics” the company had previously claimed were algorithmically determined were in fact selected by a combination of machine and human decision-making. Obviously (obviously) the ins and outs of how secretive technology platforms surface information, and the protocols put in place to ensure neutrality, are extremely interesting and you know all about it. But if you just want the “what did we actually learn this week,” read on.

Human editors play an important role in selecting what news stories are featured as “Trending Topics.”

The key news from Gizmodo’s reports concerns the previously unknown process for creating the Trending Topics sidebar: As it happens, contrary to Facebook’s official messaging, a bunch of contractors hired by Facebook have editorial control over what does or does not appear in the Trending section. Facebook eventually laid out the process fairly succinctly: First, automated programs (a.k.a. an algorithm) identify news stories by scraping both Facebook’s own data and a selection of RSS feeds from news outlets. Human editors pick stories, confirm them against reporting in other news outlets, and then write short summary descriptions. Once they’re published, another algorithm determines which Facebook users see which trending topics. Also important: Editors could blacklist trending topics that were untrue or irrelevant, or “inject” news topics that were important but not trending.
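To make the shape of that workflow concrete, here is a minimal sketch of the three stages as Facebook described them. Everything here is invented for illustration — the function names, thresholds, and toy data are assumptions; Facebook’s actual systems are not public.

```python
from dataclasses import dataclass

@dataclass
class Topic:
    name: str
    sources: set       # outlets reporting the story
    summary: str = ""  # short description written by an editor

def detect_trending(candidates, min_sources=1):
    """Stage 1 (hypothetical): an algorithm surfaces candidate topics
    from engagement data and RSS feeds — modeled here as a simple
    filter on how many outlets mention a story."""
    return [t for t in candidates if len(t.sources) >= min_sources]

def editorial_review(topics, blacklist, injected, confirm_threshold=2):
    """Stage 2 (hypothetical): human editors confirm stories against
    other outlets, drop blacklisted or unconfirmed topics, and
    'inject' important stories that aren't trending on their own."""
    approved = [
        t for t in topics
        if t.name not in blacklist and len(t.sources) >= confirm_threshold
    ]
    return approved + injected

def personalize(topics, user_interests):
    """Stage 3 (hypothetical): a second algorithm decides which users
    see which topics — modeled here as naive interest matching."""
    return [t for t in topics if t.name in user_interests]
```

The point of the sketch is the division of labor: the first and third stages are automated, but the middle stage — where the blacklist and injection live — is exactly where human judgment enters, which is what the Gizmodo reports turned on.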

Facebook is not, as an institution, suppressing conservative news.

That Facebook reportedly excluded certain conservative news topics from the Trending section — as an anonymous conservative former contractor told Gizmodo — is less a nefarious act of censorship and more a product of 1) adding human judgment into the equation, and 2) the fact that many prominent conservative news sites are … bad. In other words, if conservative comedian Steven Crowder, to name one example from Gizmodo’s post, did not appear in Trending Topics, it’s not because Mark Zuckerberg instructed his contractors to suppress Crowder’s work — it’s because an editor at some point in the process decided that “Steven Crowder” was, well, not news.

Trending Topics, the feature at the center of all this, is fairly unimportant in the context of Facebook.

“Trending Topics” itself — its editor-written summaries rendered in an off-putting alien tongue, hedged three or four times where one hedge would do — is not hugely important to Facebook or Facebook’s business plan. It’s not even hugely important to Facebook’s readers: It’s only really a prominent feature on the desktop version of the site, and the vast majority of time spent on Facebook is spent in its mobile app. When journalists and media critics talk about Facebook’s power, they’re talking more about the sorting algorithms that govern the central News Feed, which is only indirectly connected to Trending Topics.

But learning more about Facebook’s decision-making process can tell us about how Facebook sees itself.

Facebook, with its 1.6 billion monthly users around the world, is probably the single most important media company on the planet. But until now it’s been frustratingly opaque about how products like Trending Topics are created, preferring to hand-wave about algorithmic selection and sorting. Even a small glimpse into Facebook’s policies and procedures gives us a sense of how the company sees itself and its relationship to the news media. While it might not be a lightning-bolt revelation to learn that Facebook still relies on legacy media to confirm viral stories, or that the company thinks of itself as an open and neutral platform, it helps us better understand the ideologies that shape the presentation of news on one of the world’s largest websites. Most important, users should know that, despite Facebook’s past statements, deliberate decisions are being made about what information they receive. Those decisions aren’t insidious, but they’re not neutral, either. Getting your news from only one source, even if that source is Facebook, is never a good idea. Mix up your browsing habits!