Facebook’s Tactics to Stop Fake News Work (After It’s Already Been Spreading for 3 Days)

Photo: Facebook

Since the 2016 election debacle, Facebook has made changes in an attempt to stem the tide of misinformation and fake news on its platform. Most prominently, it has introduced a system that uses third-party fact-checkers — from outlets like Snopes, ABC News, and Poynter — to verify stories. Links found to be bogus are labeled with a fake-news flag. Now, ten months later, Facebook says stories with fake-news flags see a significant drop in engagement. But the process takes a while.

According to an internal email obtained by BuzzFeed — and almost certainly not written in order to be leaked, why would you think that? — impressions fall by 80 percent after a story is flagged. The email, sent by Jason White, Facebook’s manager of news partnerships, also said the flagging process takes “over three days,” but that the company is looking into ways to make that turnaround quicker. Which is all well and good, but a lot of information — fake or otherwise — can spread in 72 hours. “We know most of the impressions typically happen in that initial time period,” White wrote. Facebook and its partners need to figure out how to speed things up, because it doesn’t matter how few people click a story after it’s been flagged: three days is ample time for misinformation to disperse and for the Facebook masses to move on to the next big thing in fake news.
