Facebook Haunted by Its Handling of 2016 Election Meddling

Alex Stamos, who’s reportedly leaving Facebook … eventually. Photo: Brendan Moran / SPORTSFILE / Web Summit/Corbis via Getty Images

Days after the 2016 election, Facebook CEO Mark Zuckerberg dismissed the idea that fake news shared on the social-media site influenced the election, calling it a “pretty crazy idea.” Facebook didn’t publicly acknowledge that its platform might have played a role in Russia’s election meddling until April 2017, when a company release noted that its “data does not contradict” a January 2017 report by the U.S. Director of National Intelligence. As indicated in a footnote, that’s a reference to the U.S. intelligence agencies’ conclusion that Russia carried out a vast cybercampaign in an effort to help elect Donald Trump.

Now Facebook is facing an intensifying backlash over its handling of various elements of the 2016 campaign. The latest crisis came on Monday night, when the New York Times and other outlets reported that Alex Stamos, Facebook’s chief information security officer, is set to leave the company by August. Stamos is widely respected within the cybersecurity community, but that’s not the sole reason his expected departure is such a blow. The news has also exposed a debate among Facebook executives over how to address attempts to misuse the platform, and how much to disclose after security breaches.

According to the Times, there’s been ongoing tension within Facebook’s upper ranks between the security team, which tended to push for more public disclosures on misuse by Russia and other nations, and the legal and policy teams, which prioritized protecting the company’s business interests. Stamos was said to be a key advocate for publicizing Russia’s interference and working to combat it, even before the election. Per the Times:

Mr. Stamos first put together a group of engineers to scour Facebook for Russian activity in June 2016, the month the Democratic National Committee announced it had been attacked by Russian hackers, the current and former employees said.

By November 2016, the team had uncovered evidence that Russian operatives had aggressively pushed DNC leaks and propaganda on Facebook …

In the ensuing months, Facebook’s security team found more Russian disinformation and propaganda on its site, according to the current and former employees. By the spring of 2017, deciding how much Russian interference to disclose publicly became a major source of contention within the company.

In early 2017, Stamos penned an internal report describing Russia’s activities on the site, but direct references to Russia were reportedly scrubbed before publication, resulting in the vague document, mentioned above, that Facebook released in April 2017.

Efforts to avoid publicly discussing Russia’s activities didn’t make the controversy go away, however. In September 2017, months after media reports shed more light on Russia’s use of fake ads and user accounts, the company said it had uncovered a vast Russian campaign to spread propaganda on the site:

In reviewing the ads buys, we have found approximately $100,000 in ad spending from June of 2015 to May of 2017 — associated with roughly 3,000 ads — that was connected to about 470 inauthentic accounts and Pages in violation of our policies. Our analysis suggests these accounts and Pages were affiliated with one another and likely operated out of Russia.

In a statement on Facebook Live at the time, Zuckerberg said the company was cooperating with investigators, and taking steps to make political advertising more transparent. When representatives from Facebook, as well as several other top tech companies, testified before Congress in the fall, lawmakers still criticized them for being slow to address the problem.

Stamos pushed back against reports of his imminent departure on Monday night, tweeting that while his role at the company has changed, “I’m still fully engaged with my work at Facebook.” He also denied that fellow executives hampered his efforts to investigate Russian election meddling (though he did not say whether there was resistance to disclosing those findings).

Even if Stamos stays put, Facebook has plenty of other crises to deal with. Over the weekend, Facebook suspended Cambridge Analytica, which ran data operations for Trump’s 2016 campaign, ahead of media exposés on how the firm harvested data from more than 50 million Facebook profiles. Aleksandr Kogan, a Russian-American academic working with Cambridge Analytica, allegedly violated Facebook’s terms of use by collecting the data on the premise that it would be used for academic purposes, not political ones, and then passing it along to the firm.

Aided by the Trump connection, the stories drew attention to Facebook’s loose rules for accessing users’ data. Kogan’s initial collection was no different from what thousands of developers — from FarmVille to Barack Obama’s 2012 campaign — have done legitimately. (The rules have been tightened since 2015, and allowing third-party apps to access your information no longer gives them access to all your friends’ data as well.)

Now lawmakers in the U.S. and Europe are calling for probes into how Facebook and other social-media companies protect users’ data from third-party companies. Several members of Congress demanded that Zuckerberg testify on Capitol Hill about how Cambridge Analytica obtained its data.

“Facebook, Google, and Twitter have amassed unprecedented amounts of personal data and use this data when selling advertising, including political advertisements,” Senators Amy Klobuchar and John Kennedy wrote in a letter to Senator Chuck Grassley, chairman of the Senate Judiciary Committee. “The lack of oversight on how data is stored and how political advertisements are sold raises concerns about the integrity of American elections as well as privacy rights.”

Grassley’s spokesperson said he has not decided whether to hold a hearing, but the controversies are already having an impact on Facebook. Its stock fell 6.8 percent on Monday, its worst single-day drop in four years. The company lost more than $36 billion in market value, and Zuckerberg personally lost $6.06 billion.

Facebook can certainly survive the loss of a top security chief and another grilling from lawmakers, but it’s a company built on trust. What happens if the ongoing revelations about Facebook’s failure to protect users from Russian disinformation and the inappropriate use of their profile data make people less interested in handing over their personal information?