Facebook founder and CEO Mark Zuckerberg is rightfully facing scrutiny (and condemnation) over his company’s role in the 2016 presidential election, during which his vast global information-distribution network became home to a torrent of hoaxes, misinformation, exaggerations, and outright lies. Facebook cannot and should not be considered the single reason for Donald Trump’s triumph, but it’s also hard to imagine that the social network — on which 44 percent of Americans read news — is blameless. It would, after all, be tantamount to admitting that Facebook has no influence over the users it is attempting to sell to advertisers.
Zuckerberg, for his part, insists that Facebook bears no responsibility at all for the election results. One of his arguments is that fake news on either side of the aisle essentially canceled itself out: “Why would you think there would be fake news on one side and not the other?” he asked last week at the Techonomy conference. But according to a new report from Gizmodo, Facebook itself is intimately aware that fake and misleading news is largely a problem with right-wing news sites, and not their left-wing counterparts — and, in fact, that fear of alienating those right-wing sites might have scared the company away from implementing tools to deal with fake news:
One source said high-ranking officials were briefed on a planned News Feed update that would have identified fake or hoax news stories, but disproportionately impacted right-wing news sites by downgrading or removing that content from people’s feeds. According to the source, the update was shelved and never released to the public. It’s unclear if the update had other deficiencies that caused it to be scrubbed.
That any attempt to stem the tide of fake news on Facebook would affect the conservative media more than any other sector is not news to anyone who’s been paying attention over the last several decades: As we argued in August, the connection between fake news and conservative-leaning news on Facebook is intrinsic and substantial.
Facebook denied the allegations in a statement, saying:
The article’s allegation is not true. We did not build and withhold any News Feed changes based on their potential impact on any one political party. We always work to make News Feed more meaningful and informative, and that includes examining the quality and accuracy of items shared, such as clickbait, spam and hoaxes. Mark himself said “I want to do everything I can to make sure our teams uphold the integrity of our products.” This includes continuously reviewing updates to make sure we are not exhibiting unconscious bias.
(The caveat of “based on their potential impact on any one political party” is important here. The statement does not specifically deny the existence of a scrapped feature to lessen the spread of fake news. It only denies the reasoning.)
But even then, Zuckerberg has insisted that Facebook is not responsible. Following reports about anxiety within the company after the election, Zuckerberg crafted a lengthy statement about the company’s role in a Facebook status. “Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes,” he asserted. Maybe this is true — Facebook is reluctant to give further detail about its precious proprietary data — but the proportion of posts that are fake is not the issue. The issue is that the fake news is being amplified and distributed at a disproportionate rate, programmatically via the News Feed algorithm.
Still, Zuckerberg insists that the fake news doesn’t matter. “Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea.”
He has used this line — that Facebook is but a mere humble tech platform — in the past. In 2011, he claimed Facebook did not have a role in the Arab Spring. “My own opinion is that it would be extremely arrogant for any specific tech company to claim any meaningful role in those,” he said. It’s a shrewd move, using humility as a means of deflection.
But Facebook’s public messaging to its investors betrays the company’s actual ambition: to become the filter through which everyone uses the internet (an ambition at which it has largely already succeeded). From a 2012 letter to investors, just as Facebook was going public:
By helping people form these connections [online], we hope to rewire the way people spread and consume information. We think the world’s information infrastructure should resemble the social graph — a network built from the bottom up or peer-to-peer, rather than the monolithic, top-down structure that has existed to date. We also believe that giving people control over what they share is a fundamental principle of this rewiring.
And further down:
By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible. These voices will increase in number and volume. They cannot be ignored. Over time, we expect governments will become more responsive to issues and concerns raised directly by all their people rather than through intermediaries controlled by a select few.
This piece has been updated with a statement from Facebook.