As Facebook continues to reckon with its role as the world’s largest information-distribution system, its security team released a report today outlining how the service plans to nullify information campaigns — coordinated efforts to seed disinformation and dissent, and to cultivate specific public sentiment. The two main drivers of this behavior on Facebook, according to report authors Jen Weedon, William Nuland, and Alex Stamos, are that Facebook is a global network with global reach, and that it affords every one of its users the ability to amplify messages.
Facebook defines “information operations” as “actions taken by governments or organized non-state actors to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome.” Aspects of these campaigns include the much-dreaded fake news (Facebook prefers the term “false news”), disinformation (inaccurate information spread intentionally), and “false amplifiers” (a.k.a. sock-puppet accounts spreading posts and messages).
Much of the effort to nullify these campaigns involves preventing these actors from harvesting data about users and their leanings, and cutting off financial incentives. Both Facebook and Google have updated their policies to prohibit running ads alongside deliberately misleading content.
Perhaps the most interesting part of the paper is tucked in the middle, where Facebook downplays the role of automation in this sort of activity.
There is some public discussion of false amplifiers being solely driven by “social bots,” which suggests automation. In the case of Facebook, we have observed that most false amplification in the context of information operations is not driven by automated processes, but by coordinated people who are dedicated to operating inauthentic accounts. We have observed many actions by fake account operators that could only be performed by people with language skills and a basic knowledge of the political situation in the target countries, suggesting a higher level of coordination and forethought. Some of the lower-skilled actors may even provide content guidance and outlines to their false amplifiers, which can give the impression of automation.
The fight over fake news, then, will be waged on both sides by people. That’s encouraging?