Shockingly, Facebook’s Disputed News-Flagging System Was a Flop

After that whole fake-news-definitely-played-a-big-ol’-role-in-the-2016-election thing, Facebook announced that it would do something to combat the spread of bogus stories. A year ago, the company started asking users to report fake news stories, which Facebook, through a partnership with organizations from Poynter’s International Fact-Checking Network, like Snopes and ABC News, would deem real or not. If a story were deemed fake, it would be — hypothetically — labeled with a “disputed” flag and potentially dinged in Facebook’s News Feed algorithm to stop its spread. Today, Facebook announced that it is doing away with that plan, and will instead show users “related articles” from more trustworthy sources.

An internal email leaked in October showed that Facebook’s disputed tags — though four out of four Select All staffers never saw a single tag in the wild — worked … but only after 72 hours. Which was plenty of time for fake news to spread far and wide. Plus, as Facebook points out in its blog post — and as we noted a year ago — the disputed flags only served to make people who already believed a piece of fake news was real believe so even more strongly. “Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs — the opposite effect to what we intended,” writes Facebook product manager Tessa Lyons. “Indeed, we’ve found that when we show Related Articles next to a false news story, it leads to fewer shares than when the Disputed Flag is shown.”

In a separate post on Medium, three Facebook employees laid out several of their findings from Facebook’s year of using disputed flags. Among the problems were the number of clicks required for a user to actually find out what was disputed about a given story, and the number of fact-checkers required before a story could be labeled as such. The disputed tag could only be applied to stories that were entirely false, which meant that stories containing just a few incorrect claims couldn’t be dinged. The three employees hope that Facebook’s new plan to show related articles will be more effective. “Academic research supports the idea that directly surfacing related stories to correct a post containing misinformation can significantly reduce misperceptions,” the trio writes. We’ll see.