How Do You Solve a Problem Like Pizzagate?

In the week since Edgar Welch, armed with an AR-15, walked into Washington, D.C., pizza joint Comet Ping Pong and fired off a round in the spirit of “shin[ing] some light” on disturbing allegations he’d read involving Hillary Clinton and a supposed child sex-trafficking ring based at the restaurant, two other pizza places in different American cities reported being targeted in similar ways. Roberta’s, in Brooklyn, received phone calls threatening torture and death after a video consisting of collaged “incriminating” pictures sourced from Instagram circulated on social media. East Side Pies of Austin, Texas, faced lesser harassment, with some redditors claiming that a co-owner’s alma mater, the Culinary Institute of America, pointed to a past with the CIA. “The dots they are trying to connect are so ludicrous,” the other co-owner told the Austin American-Statesman.

Clearly, there is not a nationwide conspiracy of linked pedophile-run metropolitan pizzerias, as some poorly edited videos and unhinged Reddit posts — themselves largely sourced to some of John Podesta’s emails about food — would have you believe. This revelation in and of itself has led many to label the incident as one involving media bugbear du jour “fake news”; as Welch himself put it to the Times, “The intel on this wasn’t 100 percent.” But it doesn’t look or feel much like the “fake news” that we talk about proliferating on Facebook, the mindless right-wing babble spat out of the Balkans by teenagers, or even the two decades of shrieking information warfare overseen by serial sexual harasser and paranoid monster Roger Ailes. It feels much more like coordinated mob harassment, exemplified by previous incidents like Gamergate.

It’s very hard to argue that “fake news” — often grossly misidentified by gullible legacy-news outlets and politicians alike — requires a legislative or judicial remedy, given that the idea of allowing judges, regulators, or elected representatives any further leverage to shape news coverage is fairly stomach-churning. But increasingly, the case for something needing to be done to prevent another Edgar Welch from shooting up another Comet Ping Pong makes itself.

Pizzagate, like so many other examples of conspiracy-driven online mob harassment that result in violent real-life encounters, was not the result of accidental misunderstandings but a willful effort to combine facts — Comet had hosted a Clinton fundraiser and a co-owner had dated David Brock — with rumors — the most salacious, bottom-of-the-barrel InfoWars stuff about the Clintons, involving child sex-trafficking rings — and misrepresented “evidence,” in the form of otherwise innocent Instagram pictures now suddenly laced with innuendo. The combination of these things, and the propagation of what, if taken at face value, are genuinely disturbing rumors, are purposeful acts.

A federal statute, Section 230 of the Communications Decency Act, which courts have consistently upheld, prevents websites from being held responsible for content that they host. That goes as much for now-banned ISIS-linked Twitter users as it does for actual child sex trafficking on Backpage.com, as Kate Knibbs recently reported at length for the Ringer. Should Comet Ping Pong want to pursue damages, or prosecutors want to press charges, they would need to take action against (often anonymous) individual posters, which obviously does nothing to combat the larger issue at hand: the mob harassment that follows the original conspiracy theory. (Never mind that both would have to prove that the posts themselves “incited” Welch’s behavior.) As such, “this is a problem that’s not going to be solved, probably, by the law,” said Danielle Citron, a law professor at the University of Maryland and author of Hate Crimes in Cyberspace.

In a 2009 paper, Citron outlined a “Cyber Civil Rights” strategy centered on protecting free speech online — specifically, the speech of the people whom targeting mobs would otherwise stifle. “Although online mobs express themselves and their autonomy through their assaults, their actions also implicate their victims’ autonomy and ability to participate in political and social discourse,” she wrote. “Self-expression should receive no protection if its sole purpose is to extinguish the self-expression of another.” Still, as she spells out later in the article, an incident of online abuse must constitute a “true threat” to warrant criminal charges or pass the threshold for a tort lawsuit. (To illustrate how high a bar that is to clear: criminal charges against a college student who posted rape, torture, and murder fantasies about a specific classmate of his on a forum were dismissed as not being a “true threat” — though there are exceptions.)

That’s not to say there haven’t been lawsuits (though there really are very few criminal cases; as Fordham University law professor Alice Marwick noted in a primer on online harassment, “During the course of our research, we were unable to find many published opinions in which perpetrators have faced criminal penalties”). In 2004, a Missouri appellate court affirmed a lower court’s ruling that forum postings “on Internet websites catering to homosexuals,” in which the lover of a high school principal’s wife spread rumors about the principal, were defamatory.

“We see online, and anyone who didn’t grow up in a bubble growing up can see that the answer to bad speech is not always more speech — in fact, more speech can often get you in a much worse situation,” said Nancy Kim, a professor of law and internet studies at California Western School of Law. Kim wrote, in a 2009 Utah Law Review article, “Courts should impose tort liability upon Web site sponsors for creating unreasonable business models and hold them accountable for irresponsible and harmful business practices” — that is, Section 230 shouldn’t protect websites like Reddit from liability if their system is created with inherent biases promoting mob mentality.

However, according to Kim, “the norms are so different now than they were even seven years ago” that it’s likely too late for legislators or courts to have a positive impact for the victims of online mob harassment. It will likely have to fall to the companies themselves.

Twitter’s former head of news, Vivian Schiller, told BuzzFeed News in August, “The people that run Twitter… are not stupid. They understand that this toxicity can kill them, but how do you draw the line? Where do you draw the line? I would actually challenge anyone to identify a perfect solution. But it feels to a certain extent that it’s led to paralysis,” referring to Twitter’s ten-year failure to address its harassment problem. Twitter’s user base hasn’t grown significantly in years, and the harassment problem reportedly helped scuttle a potential acquisition by the Walt Disney Company.

Reddit, whose CEO recently admitted to personally editing pro–Donald Trump posts on r/the_donald, now one of the largest subreddits, has long tolerated and even cooperated with the darkest corners of its user base. In his 2012 exposé of r/creepshot moderator Violentacrez, Adrian Chen reported that, during the height of his power, Violentacrez, who ran subreddits glorifying racism, misogyny, and “creepshots,” was considered for a job at Reddit. “Violentacrez was a troll, but he was a well-connected troll,” Chen wrote, though only because Reddit staff allowed him to connect with them. Users who populate r/the_donald have been attacking Reddit staff and other moderators for months now, but because the subreddit is one of the most-trafficked pages on the site, that activity makes money for Reddit.

An important thing to remember is that these online systems are created by people; they don’t happen in the wild. As such, a choice to not adequately deal with harassment and mob mentality is exactly that: a choice, and one that usually turns out to be damning. If these companies are going to be responsible for even a little bit of the work of preventing their platforms from being used for mob harassment, these past examples of their work in the area are not encouraging.