
What to Expect When Tech Companies Meet With Congress

Photo: Mark Wilson/Getty Images

If you’re interested in watching intelligent people talk around an important issue that nobody has a particularly comprehensive grasp on, you should tune into C-SPAN on Wednesday. In a hearing before the congressional intelligence committees, representatives from Facebook, Twitter, and Google (technically, its parent company Alphabet) will talk about foreign use of their various platforms to undermine and manipulate American elections. You won’t see any CEOs, however. All three companies are sending their general counsel. All three companies will talk about the thousands of accounts they found linked to Russian meddlers. Facebook, according to prepared remarks released last night, will reveal that posts from Russia-linked accounts on its network were seen by 126 million users. For comparison, that’s about as many people as voted last year (though that hardly means there’s a direct link between the two).

The main focus of the hearing will likely be ad purchases; it’s been widely reported that the Kremlin-linked Internet Research Agency spent thousands of dollars on ads targeting different parts of the population on divisive issues. The buys, relatively tiny and arguably not effective enough to swing the vote, are still concerning because these platforms operate largely without regulation; traditional political advertising, like what you see on television, is far more stringently policed.

In response to the news, hoping to head off any action from Washington, Facebook and Twitter have announced broader disclosure policies concerning political advertising. Expect the companies to rehash those announcements for Congress. Political ads will now be publicly reviewable by anyone, and the companies will also disclose who is paying for them and what their targeting criteria are. Mark Warner, who helps lead the Senate’s intel committee, is a co-sponsor of the Honest Ads Act, a bill that, if passed, would make similar disclosure requirements the law of the land.

So the question of political ad regulation is largely wrapped up, considering Facebook and Google control the online ad space. More than half of digital advertising revenue goes to these two companies, and they account for almost all of the digital ad market’s growth. Any changes they make affect wide swaths of the internet. A bigger open question is how the companies will address the services that they offer for free: a robust and powerful network for disseminating information widely and across borders behind the cloak of anonymity.

The looming problem of planet-size, centralized tech platforms is that they allow anyone to easily plant seeds of doubt and dissent from miles away. Complicating things, until about 18 months ago this quality was considered their strength. The more concerning aspect of Russian meddling has been the use of accounts posing as American (or at least implying as much through lack of specificity) to spread misinformation or shift the conversation toward virulent partisanship. The grand fear isn’t ads; it’s people pretending to be fellow citizens influencing how you think and feel, and eventually, vote. Earlier this month, Mark Warner called the use of sock puppets, as they’re known, “more problematic” than the ads, saying that they were “used to sow chaos.” Again, according to statements released last night, Facebook plans to disclose that posts made by Russia-linked accounts (to be clear, not ads) reached 126 million users. That figure dwarfs the 10 million users who reportedly saw ads bought by Russian operatives in the same time frame. Google similarly said it found more than 1,100 YouTube videos likely tied to the Internet Research Agency, though only 3 percent of those videos had view counts greater than 5,000.

What is there to do about this? Not a whole lot, according to these companies! The problem in 2016 was less fake news than the harnessing and weaponization of the conservative “attention backbone” by non-Americans: less Pizzagate conspiracy, more fear of the government taking everyone’s guns.

Additionally, Facebook said it had uncovered evidence that members of the Russian cyberespionage group Fancy Bear, also known as APT28, targeted employees of the major American political parties, and that it informed government authorities of that activity before the election. Precisely what methods Fancy Bear used is unclear, but let’s imagine it was social engineering: literally chatting someone up to get personal details that might be relevant to their online accounts. How is one to regulate who can message whom over Facebook?

One method for solving this would require these companies to become non-neutral arbiters of what is allowed on their platforms, closely moderating what people can say, post, and publish. These companies don’t want to be editors, and in a vacuum, I don’t think any of their users want them to be either; Facebook, Twitter, and Google are simply too large to have anything more specific than the most general rules of permissible expression.

Absent strict and universally enforceable rules, all these tech companies can offer is a big shrug. They’ll probably toss out the idea that machine learning might help tackle the problem. “One of the things I did not understand was that these systems can be used to manipulate public opinion in ways that are quite inconsistent with what we think of as democracy,” Alphabet chairman Eric Schmidt told Fast Company in an interview yesterday, adding that “it remains to be seen whether some of these algorithms can be used to prevent bad stuff.” Not the most encouraging attitude from one of the most prominent leaders in the tech industry.

If I were a betting man, I’d wager that Wednesday’s hearing will be courteous. Congress will say, “You need to do something,” and the tech companies will say, “Yes, we are examining different ways to do something.” But nobody knows what that something is.
