On Monday, Facebook VP of global affairs and communications Nick Clegg published a “draft charter” for an independent body — currently given the somewhat underwhelming name of “Oversight Board for Content Decisions” — whose purpose is to “provide oversight of Facebook’s content decisions,” “reverse Facebook’s decisions when necessary,” and “be an independent authority outside of Facebook.” In other words: a high court, charged with upholding, interpreting, and applying the law of Facebook. Nearly a year after first endorsing the idea, Facebook’s benevolent dictator Mark Zuckerberg is giving his empire a supreme court.
Nothing is set in stone beyond the board’s purpose, but the draft charter contains a “suggested” framework: 40 “global experts” from diverse backgrounds, chosen at first by Facebook but in the future selected and approved by the board itself, serving three-year, part-time, paid terms and hearing, in rotating panels, “cases” referred to the board by both Facebook users and Facebook itself. Over the next six months, the company will hold “a series of workshops around the world where we will convene experts and organizations” to tweak or reimagine this structure, but it’s hard to imagine it changing too drastically from what Facebook has laid out here. It’s not quite Article III — “The judicial Power of the United States, shall be vested in one supreme Court” — but it’s something. The question is: will it work?
The answer, of course, depends on what you think the oversight board is supposed to do. Certainly, meeting its stated goal of providing “oversight of Facebook’s content decisions” is going to be difficult. The fact that the oversight board is designed specifically to handle what Clegg describes as “our most difficult and contested cases” — like, say, the question of whether or not to allow Alex Jones and Infowars on the platform — means that the vast majority of the company’s content-moderation decisions will continue to be handled unmonitored and uncontested. That’s not necessarily a failure; it will obviously be useful to have a transparent and accountable process by which high-profile cases can be adjudicated. But the millions of smaller instances of content moderation — whether or not to remove a photo or a status update — that can have meaningful effects on individual users’ lives will remain as opaque and unaccountable as they’ve always been.
For this reason, the oversight board seems particularly ill-equipped to handle some of Facebook’s most pressing problems, like its role in ethnic cleansing in Myanmar, where the issue is not a single high-profile bad actor or false article but the accumulation of rumor, conspiracy, and bile that a court-like body is in no position to address in a comprehensive way. Really, a truly comprehensive, transparent, and accountable solution to the problem of platform-assisted misinformation might require not just an open and independent “judiciary,” but a transparent legislature and executive branch, as well.
Those things are unlikely to be implemented, which is why the oversight board’s loftiest implicit goal — to introduce quasi-constitutional checks and balances to one of the planet’s largest quasi-states — is also its most far-fetched. Without more transparency around how and why Facebook sets and executes its policies, and especially without any accountability, even an independent body charged with oversight is only ever, at best, working at the margins — ensuring that decisions are consistent with rules being set elsewhere. (You wouldn’t place much faith in a high court installed by a dictator in the absence of any other political reforms, would you?) And Facebook’s rules and regulations are never going to be made transparent. As John Herrman explained in a recent New York Times Magazine column, the algorithmic rules and byways by which Facebook promotes some content and demotes other content, or the byzantine policies through which it removes some posts and leaves others on the site, are rarely made public, because Facebook, like other platforms, is “built on asymmetrical information.” If everyone knew exactly how Facebook’s News Feed “worked,” Facebook claims, it could no longer function. “We’re living in worlds governed by trade secrets,” Herrman writes. An oversight board intended to give users some window onto the world of Facebook governance is always going to run up against Facebook’s desire to protect the recipe for its secret sauce.
That’s not to say the oversight board is destined to be a complete failure. There is one task in particular at which it’s likely to succeed: reducing pressure on Facebook. Zuckerberg, who controls about 60 percent of Facebook’s voting shares, is an unusually powerful CEO, but he’s lately indicated an eagerness to cede control of content moderation: “We shouldn’t be making so many important decisions about free expression and safety on our own,” he told reporters on a conference call last year. The company has been widely criticized over its status as a vector for conspiracy theories, hate speech, harassment, and “fake news” of various stripes; it’s also frequently (and baselessly) accused by right-wing politicians and pundits of favoring liberal and left-wing content and suppressing conservative voices. By moving the most high-profile and attention-getting cases (“our most difficult and contested decisions,” as Clegg’s post puts it) to a separate, quasi-independent body, Zuckerberg can plausibly tell reporters and legislators that those decisions are out of his hands. I don’t think the oversight board is merely a cynical tool to address bad publicity. I just think that its most cynically construed purpose is the one at which it’s most likely to succeed.