Ben: Facebook is receiving a lot of incoming fire for basically giving up on fact-checking its political ads. Mark Zuckerberg’s stance, which he articulated rather poorly during congressional hearings yesterday, is that it is not his platform’s place to be the arbiter of truth and that users can sift through misleading or outright false claims and make up their minds themselves. The site does have some rules about extreme speech, but simply telling a lie doesn’t usually break them. Does Zuckerberg deserve all the flak he’s getting on this, or are the ill feelings many people have about him and his company causing a bit of an overreaction?
Brian: In this particular case, he absolutely deserves the flak. His argument is basically that a powerful distribution channel like Facebook shouldn’t police politicians, which I think is a good stance. But lies are … lies? Like, I don’t know how else to put it. Politicians should not be allowed to spread lies uninhibited.
Max: I think Zuckerberg deserves the flak, yeah. I’m not positive that lies in political ads on Facebook will amount to much in terms of, say, changed votes or minds. But on the other hand, it’s kind of ridiculous that Zuckerberg is even picking this fight. If he doesn’t want to be the arbiter of political statements on his platform — which I think is a cowardly stance, but whatever — why not just refuse to take political ads at all? They can’t make up more than a fraction of a fraction of Facebook’s total revenue.
Brian: Right. My guess is that he has seen how Facebook advertising has benefited small businesses and applied that to politics. He thinks letting people pay him money for ads is giving people a voice, and because he’s completely decimated any organic reach on Facebook, that’s effectively true.
Ben: Why he doesn’t just issue a blanket ban on political ads is a good question. Perhaps he thinks doing so would make the platform much less relevant?
Brian: Maybe. If organic reach is close to zero, and the only way to reach followers/constituents is through payment, then offering that system is, in some twisted sense, a social good.
Max: I think Brian is right that at least part of it is a genuine belief that continuing to allow political ads on Facebook is helping new/local candidates get a voice.
This is going to sound sort of conspiracy-minded, but I think another part of it is that Facebook is institutionally signaling to Republicans that it’s the platform of “free speech,” i.e., of misleading and offensive speech, i.e., Republican speech. (Signaling to them because it is very nervous about a Justice Department antitrust investigation.)
Ben: Putting aside why Zuckerberg allows this in the first place, don’t you find it a bit odd to hold Facebook to a different standard than television or radio or anywhere else people used to see the bulk of political ads? I recall Mitt Romney running a TV spot that was straight-up false about Obama in 2012, and there’s obviously a long, long tradition of that sort of thing. It’s true that some stations rejected a particularly egregious Trump commercial recently, but that’s definitely an exception to the rule of basically allowing candidates to make whatever claims they want. I too don’t want lies showing up in the ads I see, but shouldn’t this be an across-the-board thing if we’re going in that direction?
Brian: I think it’s fine to hold Facebook to a different standard, because it is a different standard. I don’t think Zuckerberg has considered persistence as a part of this.
On TV, an ad runs once, and then it doesn’t have to run ever again. Same with newspapers and radio. On Facebook, ads stay live forever and can be reshared and passed around and don’t really ever expire, in a certain sense.
Ben: So it’s the medium that makes this uniquely dangerous?
Brian: Maybe not dangerous, but I think it’s an aspect that separates it from other older forms of advertising. It’s also kind of why “newsworthiness” exemptions are bad.
Max: As Feldman is saying, different mediums deserve different regulatory apparatuses. But I’d also say that if Facebook were constrained by the same level of regulation that broadcast or even cable television is, I think this conversation would be somewhat different. In some sense it’s okay to hold Facebook to a different standard because it’s already being held to a different standard under the law.
Ben: Can you elaborate?
Max: I mean, I’m not a lawyer, so I can’t speak with a huge amount of specificity, but just on a basic level, TV (and radio) stations are licensed. They’re held responsible by the FCC for their programming. It’s established under law that they have certain obligations and restrictions. Facebook doesn’t have any of that.
Ben: Do you think this kind of thing lends credence to Elizabeth Warren (and others’) idea that the company should be broken up?
Max: The political-ads stuff? IDK. I think it’s a case that Facebook requires regulatory attention of some kind. I mean, sure, break it up! But I also don’t think this particular problem is going to change anyone’s mind one way or the other.
Brian: I mean, at the very least, a breakup/more competition would allow for the existence of a competing platform with stricter rules, which might shake things up.
Ben: Do you think if Facebook just said, “We’re going to judge these ads in-house, and if they don’t pass a test we come up with, we’re not going to display them,” the backlash would be too terrible? Is the company just afraid of conservatives calling it biased again?
Max: I think the backlash would be very bad, yes. From conservatives, yes, but also … I think I would agree with them! I mean, depending on the specifics — if the test and its results were public, open, and transparent, I think that would be the best possible option. But since when has Facebook ever been open or transparent …
Ben: Right. A lot of this is just about the reservoir of bad will the company has built up over so many years.
Brian: Facebook is also too big to be able to commit to any policy without risking blowback. But they’re so risk averse that I don’t think a backbone will ever appear. The strategy just seems to be to ride out this low hum of frustration … forever?
Ben: It’s not very enjoyable for the rest of us.
Brian: But this example is particularly egregious. Letting politicians spread provably false information is a wild hands-off policy.
Ben: There’s a lot of panic about Facebook having a unique power to tilt the electorate one way or another in our democracy — just witness the New York Times’ (rather breathless) report last weekend on the Trump campaign’s domination of the platform. This is all understandable, after what happened in 2016 and, in a much more extreme fashion, in Myanmar and other places. But the progressive stance now seems to view the entire enterprise as a kind of black hole of civic disorder. Do you feel the current backlash is commensurate to what’s actually going on?
Max: Hmm. I think the scale of the backlash is basically “correct,” if somewhat misguided or poorly aimed, if that makes sense.
Ben: What do you mean by “misguided or poorly aimed”?
Max: Like I said above, I’m not super-convinced that the question of lies in political ads is really central to the deleterious effects Facebook has on society. But I think Facebook is very, very bad for the world, in a bunch of obvious and direct ways (ethnic cleansing!) and in a bunch of ways more diffuse and difficult to disentangle (its role in shoring up an economy of misinformation, say).
Ben: And on top of all that, it’s extremely boring.
Max: Lol. And it’s boring!
I mean, more than anything else, and at the risk of getting woo-woo cyberpunk here, I think that we still don’t really understand what Facebook “is,” or what it’s “doing,” or what its effects will be. And I think the first priority here should be to slow down its growth and its entanglement in institutions that we have a clearer grasp of. And if we can do that, and begin to diagnose how it interacts with our institutions and civil society, we can begin to pick out what the right regulation might look like. One major problem we face is that Facebook is so large, and moves so quickly — and secretly — that it’s often not until 18 months or two years or more later that we begin to see the effects of a given change to its operations, at which point it may have reversed the change, or accelerated it — and meanwhile regulators and critics are working with incomplete information.
Ben: Would you say it moves fast and breaks things?
Max: Wow — what a perfectly pithy summation of how Facebook works.
Brian: My final thought is if a politician lies to me, I’m gonna be mad about it. Politicians: You are on notice.
Ben: Strong, strong words.