It hasn’t been Facebook’s week. Senator John Thune is demanding answers after a series of Gizmodo stories revealed the practices behind the website’s Trending Topics section — including claims from an anonymous former employee that editorial contractors overlooked or avoided certain stories or outlets with a conservative bent. And all of this comes just weeks after a triumphant earnings report, in which Facebook unveiled its multi-year plan to dominate not just messaging and social networks, but new frontiers in bots, virtual reality, and internet connectivity.
Facebook’s response to Gizmodo’s scoops has been, so far, underwhelming. In a post on Facebook yesterday, Tom Stocky, the company’s VP of search, assured users that “[t]here are rigorous guidelines in place for the review team to ensure consistency and neutrality,” and, further, that “these guidelines do not permit the suppression of political perspectives.” And, don’t worry: “our reviewers’ actions are logged and reviewed, and violating our guidelines is a fireable offense.” Left unsaid was exactly what those “rigorous guidelines” were or how they were determined — or, for that matter, what “consistency and neutrality” would even mean in this context.
Let’s get this out of the way: Facebook won’t face any punishment over this, from the Senate or any other body — not that they even should, from a legal perspective. Facebook is a private technology company, after all. But the blowup over Trending Topics, severe enough to reach the halls of Congress, illustrates that a considerable number of the service’s billion and a half users have come around to the understanding that Facebook, private company or not, is enormously powerful. And the Gizmodo stories — and Facebook’s mealymouthed public response — reveal a company unsure how to wield that power, and seemingly unable to accept any of the responsibilities that come with it.
The lack of further explanation (or even acknowledgment of a problem) in Stocky’s post isn’t unprecedented. For the first decade of its life, Facebook did the smartest thing and kept quiet. The company, smooth and clean and functional in a way its rivals weren’t, never over-explained or pre-announced features. New technology is popular in part because it can seem like magic. A growing, learning directory of all friends new and old, giving you exactly the information that you crave before you even know that you want it!
New features, like the algorithmic News Feed sorting introduced in 2009, were never explained beyond the vague idea of a black-box “algorithm.” With News Feed, you could give Facebook thousands of signals about what you liked and disliked; they’d pass behind the digital curtain, and relevant content would come out the other side. This approach made Facebook’s ecosystem easy to enter, and easy to return to. To explain any more would be to ruin the magic of the experience. It was Apple’s mantra, “it just works,” applied to a social network.
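The black-box metaphor can be made concrete with a toy sketch. Everything below — the signal names, the weights, the scoring formula — is invented for illustration and reflects nothing about Facebook’s actual ranking code; the point is only that “signals go in, relevant content comes out” reduces, somewhere, to numbers a human chose.

```python
# Toy illustration of signal-weighted feed ranking.
# All signal names and weights here are hypothetical; they do not
# describe any real News Feed implementation.
from dataclasses import dataclass, field

@dataclass
class Story:
    title: str
    signals: dict = field(default_factory=dict)  # e.g. {"likes": 12}

# The "black box" is, at bottom, weights someone picked.
WEIGHTS = {"likes": 1.0, "comments": 2.0, "friend_posted": 5.0, "recency": 3.0}

def score(story: Story) -> float:
    # Weighted sum of whatever signals the story carries.
    return sum(WEIGHTS.get(name, 0.0) * value
               for name, value in story.signals.items())

def rank_feed(stories: list[Story]) -> list[Story]:
    # Highest-scoring stories come out "the other side" first.
    return sorted(stories, key=score, reverse=True)
```

Every weight in a sketch like this is a human decision; “it just works” simply means those decisions are hidden from the user.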
The black-box, it-just-works approach is great when it’s true of one independent website. And as a private company, Facebook has no particular obligation to reveal the inner workings of its products. But Facebook isn’t shy about its ambitions (or successes): It hasn’t just upended journalism and advertising; it plans on upending the retail, telecommunications, and entertainment sectors, too, inserting itself not as a competitor but as an entirely new layer in businesses around the world. It’s building drones that provide internet access through lasers. Its black-box sorting now governs not just individual experiences but entire industries. “It just works,” a selling point on your desktop, is less compelling at global scale. How does it work? How does the system serve up information? More important: What are we not seeing?
“There are rigorous guidelines in place” isn’t exactly a satisfactory answer. Unsurprisingly, Facebook has been unwilling to increase its transparency as it increases its power. It’s not obligated to, but it would be nice for a company with the reach and ubiquity of a public institution to have a clear sense of purpose beyond sheer growth, and an explanation of how its products serve that purpose. This, more than unreconstructed bias or shadowy suppression, is what Gizmodo’s stories reveal: a company unsure how to wield its own power. What would you do if, having annexed news media, you suddenly had the attention of their millions of former viewers? Facebook has directed this attention, in part, to plainly terrified, deeply hedged, anonymous summaries of junk-food stories, written by young journalists with no mandate or mission, hired on contract and treated poorly. When challenged on this, it shifts responsibility off its managers and leaders and onto its system — the rigorous-guidelines black box that made its News Feed so impenetrable, and consequently so rudderless. Facebook has conquered itself an empire and now seems uninterested in ruling it.
This isn’t necessarily a call for Facebook to start directly sorting each individual News Feed according to some new editorial guidelines, or to eliminate the Trending Topics sidebar in favor of a more heavily curated selection of important stories. The easiest and most important step would be for the company to stop falling back on (or hiding behind) algorithmic sorting and outdated ideas of journalistic neutrality. Every system created and controlled by Facebook is developed, administered, or altered by humans. Facebook has always seemed deeply uncomfortable with this fact, especially with respect to the News Feed — nervous, maybe, about exactly the accusations of bias that it’s facing now. But a company with an understanding of its power also needs to take responsibility for that power. As long as there are human links in the chain, there will be bias or, less insidiously, intention. (Would you, for instance, like an algorithm developed by someone who crosses out graffiti supporting Black Lives Matter, as someone on Facebook’s campus did?) We’ve reached the point where Facebook can no longer refuse to acknowledge that humans play a part in its ecosystem.