Donald Trump Won Because of Facebook


A close and — to pundits, journalists, and Democrats — unexpected victory like Republican presidential candidate Donald Trump’s is always overdetermined, and no one particular thing pushed Trump over the edge on Tuesday night. His chosen party’s increasing openness to explicit white nationalism, the still-recent global-scale failure of the liberal economic consensus, the apparently deep-seated misogyny and racism of the American electorate, Hillary Clinton’s multiple shortcomings as a candidate, and even the last-minute intervention of FBI director James Comey might each have been, on its own, sufficient to hand the election to a man who is, by any reckoning, a dangerous and unpredictable bigot.

Still, it can be clarifying to identify the conditions that allowed access to the highest levels of the political system to a man so far outside what was, until recently, the political mainstream that not a single former presidential candidate from his own party would endorse him. In this case, the condition was: Facebook.

To some extent I’m using “Facebook” here as a stand-in for the half-dozen large and influential message boards and social-media platforms where Americans now congregate to discuss politics, but Facebook’s size, reach, wealth, and power make it effectively the only one that matters. And, boy, does it matter. At the risk of being hyperbolic, I think there are few events over the last decade more significant than the social network’s wholesale acquisition of the traditional functions of news media (not to mention the political-party apparatus). Trump’s ascendancy is far from the first material consequence of Facebook’s conquest of our social, cultural, and political lives, but it’s still a bracing reminder of the extent to which the social network is able to upend existing structures and transform society — and often not for the better.

The most obvious way in which Facebook enabled a Trump victory has been its inability (or refusal) to address the problem of hoax or fake news. Fake news is not a problem unique to Facebook, but Facebook’s enormous audience, and the mechanisms of distribution on which the site relies — i.e., the emotionally charged activity of sharing, and the show-me-more-like-this feedback loop of the news feed algorithm — make it the only site to support a genuinely lucrative market in which shady publishers arbitrage traffic by enticing people off of Facebook and onto ad-festooned websites, using stories that are alternately made up, incorrect, exaggerated beyond all relationship to truth, or all three. (To really hammer home the cyberdystopia aspect of this: A significant number of the sites are run by Macedonian teenagers looking to make some scratch.)

All throughout the election, these fake stories, sometimes papered over with flimsy “parody site” disclosures somewhere in small type, circulated throughout Facebook: The Pope endorses Trump. Hillary Clinton bought $137 million in illegal arms. The Clintons bought a $200 million house in the Maldives. Many got hundreds of thousands, if not millions, of shares, likes, and comments; enough people clicked through to the posts to generate significant profits for their creators. The valiant efforts of Snopes and other debunking organizations were insufficient; Facebook’s labyrinthine sharing and privacy settings mean that fact-checks get lost in the shuffle. Often, no one would even need to click on and read the story for the headline itself to become a widely distributed talking point, repeated elsewhere online, or, sometimes, in real life. (Here’s an in-the-wild sighting of a man telling a woman that Clinton and her longtime aide Huma Abedin are lovers, based on “material that appeared to have been printed off the internet.”)

Profit motive, on the part of Macedonians or Americans, was not the only reason to share fake news, of course — there was an obvious ideological motivation to lie to or mislead potential voters — but the fake-news industry’s commitment to “engagement” above any particular political program has given it a terrifyingly nihilistic sheen that old-fashioned propagandists never displayed. (Say what you will about ratfucking, dude, at least it’s an ethos.) And at the heart of the problem, anyway, is not the motivations of the hoaxers but the structure of social media itself. Tens of millions of people, invigorated by insurgent outsider candidates and anger at perceived political enemies, were served up or shared emotionally charged news stories about the candidates, because Facebook’s sorting algorithm understood from experience that they were seeking such stories. Many of those stories were lies, or “parodies,” but their appearance and placement in a news feed were no different from those of any publisher with a commitment to, you know, not lying. As those people and their followers clicked on, shared, or otherwise engaged with those stories — which they did, because Trump drives engagement extremely bigly — they were served up even more of them. The engagement-driving feedback loop reached the heights of Facebook itself, which shared fake news to its front page on more than one occasion after firing the small team of editorial employees tasked with passing news judgment. Flush with Trump’s uniquely passionate supporter base, Facebook’s vast, personalized sewer system has become clogged with toxic fatbergs.

And it is, truly, vast: Something like 170 million people in North America use Facebook every day, a number that’s not only several orders of magnitude larger than even the most optimistic circulation reckonings of major news outlets but also about one-and-a-half times as many people as voted on Tuesday. Forty-four percent of all adults in the United States say they get news from Facebook, and access to an audience of that size would seem to demand some kind of civic responsibility — an obligation to ensure that a group of people more sizable than the American electorate is not being misled. But whether through a failure of resources, of ideology, or of imagination, Facebook has seemed both uninterested in and incapable of even acknowledging that it has become the most efficient distributor of misinformation in human history.

Worst of all, it’s not clear there’s any remedy. The truth is that Facebook seems less malevolent here than insecure about its power, unsure of its purpose, and unclear about what its responsibilities really are. Frankly, too, I’m not sure I feel comfortable allowing Facebook’s heavy hand, which infamously censored the iconic “napalm girl” photograph, to determine what is legitimate and what is illegitimate news; I feel even less comfortable ceding that determination to an algorithmic sorting mechanism as opaque as Facebook’s. Media columnist Jim Rutenberg’s suggestion in the New York Times that “[t]he cure for fake journalism is an overwhelming dose of good journalism” is inspiring, but it seems to me that the problem we face is not a lack of journalism, good or bad, but an overwhelming abundance of it. Fake-news attacks discourse in structurally similar ways to the DDoS attacks that recently crippled internet infrastructure for a day: Hoaxes overwhelm political conversation (facts, ideas, stories) with junk, exploiting the fact that the rules of the system (in this case, freedom of speech) prevent it from distinguishing between “legitimate” and “illegitimate,” and therefore from stopping the attack. An overwhelming dose of good journalism, rather than addressing or rebutting lies and hoaxes, would simply add to the cacophony; presented identically on Facebook alongside fake journalism, it would merely appear as another opinion in a swarm of them.

Of course, lies and exaggerations have always been central to real political campaigns; Facebook has simply made them easier to spread, and discovered that it suffers no particular market punishment for doing so — humans seem to have a strong bias toward news that confirms their beliefs, and environments where those beliefs are unlikely to be challenged.

Really, I’m not sure that the most significant effect of Facebook’s dominance is the way it abets the already extant spread of mis- or disinformation. Rather (and I’m cribbing here from sociologist Zeynep Tufekci and media pundit Clay Shirky) I think it’s the way it’s crowbarred open the window of acceptable political discourse, giving rise to communities and ideological alignments that would have been unable to survive in an era where information and political organization were tightly controlled by corporate publishers and Establishment political parties. Put another way, it’s not just that Facebook makes politics worse, it’s that it changes politics entirely.

For most of the 20th century, the flow of information was controlled by a relatively small number of media companies — large newspapers and, later, the major broadcast networks. These companies were large and generally corporate, Establishment-friendly and politically centrist: For the most part, they limited the acceptable range of political opinion, because that nice middle was where advertising and subscription business models were most profitable. This limitation on the supply of information and opinion both enabled, and was enabled by, two Establishment political parties, which had effective control over political mobilization, having cobbled together unstable coalitions of voters held within fairly tight ideological windows.

Facebook assaults both components of this power dynamic, providing the platform and audience that only news-media outlets could once command, and the organizing power that only parties once held. As then-Senator Obama understood well in 2008, the internet provides political candidates a previously unimaginable opportunity to identify, communicate with, and organize supporters — an opportunity that, significantly, exists outside the traditional party apparatus. Trump, like Obama before him, was able to connect with voters outside the more stifling confines of political-party organizing. Trump, a longtime Democrat with liberal social positions, rose to the nomination because he could express a political position — essentially, white welfare-state ethno-nationalism — that the party would once have choked off for threatening its delicate coalition of business interests and white workers.

Facebook connected those supporters to each other and to the candidate, gave them platforms far beyond what even the largest Establishment media organizations might have imagined, and allowed them to effectively self-organize outside the party structure. Who needs a GOTV database when you have millions of voters worked into a frenzy by nine months of sharing impassioned lies on Facebook, encouraging each other to participate?

Even better, Facebook allowed Trump to directly combat the hugely negative media coverage directed at him, simply by giving his campaign and its supporters another host of channels to distribute counterprogramming. This, precisely, is why more good journalism would have been unlikely to change anyone’s mind: The Post and the Times no longer have a monopoly on information about a candidate. Endless reports of corruption, venality, misogyny, and incompetence merely settle in a Facebook feed next to a hundred other articles from pro-Trump sources (if they settle into a Trump supporter’s feed at all) disputing or ignoring the deeply reported claims, or, as is often the case, just making up new and different stories.

None of this is particularly new; the structures of political power have been challenged frequently in the past century, mostly by the arrival of new media — radio, television, cable — that changed the scale of the audience and, consequently, the political and social culture of the country. Every time a new medium expands the possible audience of mass media and opens up new spaces for new voices to be heard, it upsets the delicate balances of power that rested upon the previous media structure. You know: If you thought radio changed politics, just wait till television. And if you thought television changed politics, just wait until Facebook really hits its stride. Or. Well. I guess it just did.