Facebook Punts

Photo: Win McNamee/Getty Images

Earlier today, Facebook caught a very large and entirely reasonable amount of flak over the fact that it had removed a post featuring the famous “napalm girl” from the page of a Norwegian newspaper. As the paper’s editor-in-chief explained in an open letter to Mark Zuckerberg, the service’s inability to distinguish between child porn and newsworthy photography is an embarrassing, arguably dangerous screwup for what is regarded as the most important information network in the world.

This afternoon, Facebook released a bad response:

After hearing from our community, we looked again at how our Community Standards were applied in this case. An image of a naked child would normally be presumed to violate our Community standards, and in some countries might even qualify as child pornography. In this case, we recognize the history and global importance of this image in documenting a particular moment in time. Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed. We will also adjust our review mechanisms to permit sharing of the image going forward. It will take some time to adjust these systems but the photo should be available for sharing in the coming days. We are always looking to improve our policies to make sure they both promote free expression and keep our community safe, and we will be engaging with publishers and other members of our global community on these important questions going forward.

Okay, I am going to attempt to break down this mealymouthed statement. Here is the first problem: Facebook talks about its user base of more than one billion people as “our community.” Facebook is not a single community. It is millions of separate ones that overlap. At Facebook’s gargantuan scale, everybody who uses the service cannot be lumped into a whole.

Facebook, in the very next sentence, explicitly acknowledges this when talking about how different countries have different laws pertaining to child pornography. So it’s one community, which must follow a single set of guidelines, but each user is also subject to the laws of individual countries. Seems like a contradiction! This ill-conceived notion only gets worse because Facebook has, as far as I know, no mechanism in place to prevent content permissible in Norway from being displayed to users who aren’t Norwegian. Facebook wants one set of rules that can be layered over thousands of other regulations. This comes back to what I wrote earlier today: The content that Facebook prefers is so inert as to be completely unobjectionable — videos of food being prepared and moms laughing.

And then we get to this: “Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal.” Facebook reinstated the image because of historical importance — its action today is a one-time exception to its otherwise labyrinthine content guidelines. It has whitelisted this one photo and no others.

Judging images by their pre-Facebook historical impact is a terrible rubric, and one that really no longer applies. In fact, it is now because of Facebook that most of the viral images we see today garner attention. Facebook seems to have no qualms about distributing images of Alan Kurdi, the 3-year-old refugee whose body washed up on the shore in Turkey, or Omran Daqneesh, the devastated Syrian child covered in dust and blood. (Maybe the difference is that they were clothed.)

“We will also adjust our review mechanisms to permit sharing of the image going forward,” Facebook wrote, adding, “We are always looking to improve our policies to make sure they both promote free expression and keep our community safe.” Review mechanisms? Policies? These sure as hell sound like editorial guidelines from Facebook, the publisher that refuses to admit it is one and which believes that, with algorithms and artificial intelligence, it can supersede journalistic outlets.

Earlier today, Facebook’s Trending Topics module promoted a story asserting that 9/11 was an inside job.