Just days after last year’s election, Facebook CEO Mark Zuckerberg told an audience that the suggestion that misinformation on his social network had any substantial effect on the outcome was “pretty crazy.” Now, imagine a disembodied Ron Howard narrator voice saying, “It wasn’t.” And then smash cut to … September 2017: Facebook is turning over evidence to federal investigators that Russian government–linked agencies bought Facebook advertisements with the intent of influencing the election. Today, it gets even weirder: The Daily Beast reports that a Russian-created, Trump-supporting Facebook group actually threw well-attended pro-Trump rallies in Florida.
As stories of Facebook being used by Russian agencies and trolls to influence the election trickle out, the general narrative — certainly Facebook’s narrative — has been that this is a misuse of Facebook, an abuse of its platform. In April, the company called the strategies used in last year’s election “insidious forms of misuse, including attempts to manipulate civic discourse and deceive people.”
Certainly, we can agree that it’s bad that hostile foreign powers are able to easily and cheaply sow discord and division among American voters. But it’s not at all clear to me that what Russia is doing is a “misuse” of Facebook. Isn’t this the company that explicitly markets its ability to influence and swing voters? Isn’t this the company whose decadelong mission has been to allow people on one side of the globe to communicate and influence people on the other side?
Didn’t Zuckerberg say last year, “We stand for connecting every person. For a global community. For bringing people together. For giving all people a voice. For free flow of ideas and culture across nations”? You don’t have to be particularly cynical to see how Russians sharing Trump memes falls under the “free flow of ideas across nations.” As Zuck put it: “We’ve gone from a world of isolated communities to one global community, and we are all better off for it.” Well, maybe we wouldn’t go that far.
The point is this: Facebook has always wanted, from a business and ideological perspective, to be a tool with which people can reach across the ocean and exert influence on one another. The problem is that 2016 is a case study in why mere connection is not enough to make something good. Cynically motivated Russian actors used Facebook to pose as grassroots Americans, and did so in support of an authoritarian reality-television star.
And Facebook should have seen this coming. There were warning signs much earlier than 2016: in 2012, in the wake of Obama’s reelection, Zeynep Tufekci wrote an editorial for the New York Times called “Beware the Smart Campaign.” She wrote:
[Online targeting takes] persuasion into a private, invisible realm. Misleading TV ads can be countered and fact-checked. A misleading message sent in just the kind of e-mail you will open or ad you will click on remains hidden from challenge by the other campaign or the media. Or someone who visits evangelical Web sites might be carefully shielded from messages about gay rights, and someone who has hostile views toward environmentalism may receive messages stroking that sentiment even if the broader campaign woos the green vote elsewhere.
Tufekci concluded, “You should be worried even if your candidate is — for the moment — better at these methods. Democracy should not just be about how to persuade people to vote for one candidate over another by any means necessary.”
As it rejiggered itself earlier this year, pushing its Groups product for smaller and slightly more isolated communities, Facebook seemed to have stopped pushing the vision of a hyperconnected global community. After a decade of mind-blowing growth, the company seemed to realize that there is no one definition for “making the world a better place,” and that building an incredibly powerful and ostensibly value-neutral platform will lead people to use it in ways that you think are wrong, and they believe are right.
And yet, somehow, Facebook is still learning that if you give people a blank slate, they’ll find a way to use it for causes that any half-decent person would find objectionable. Last week, ProPublica discovered that Facebook allows advertisers to target users who describe themselves as “Jew haters.” Facebook COO Sheryl Sandberg announced changes to the company’s policies today, acknowledging that “because these terms were used so infrequently, we did not discover this until ProPublica brought it to our attention. We never intended or anticipated this functionality being used this way — and that is on us. And we did not find it ourselves — and that is also on us.”
Being unable to anticipate offensive behavior on the internet is, in 2017, an embarrassment for a company that counts a quarter of the global population among its users and employs more than 20,000 people. It defies belief. Touting your platform’s global scale and political influence, then professing shock when it is used for those exact purposes, is similarly disconcerting. It suggests that Facebook’s executives and engineers are either incompetent at their jobs or unfamiliar with their product. Or, worse, that they don’t actually care.