Here Are Some Editorial Guidelines That Facebook Should Consider

In what must be — roughly — its five-millionth attempt at removing crud from its most profitable product, the News Feed, Facebook announced today that it’s taking further measures to reduce “clickbait.” In a blog post, employees Alex Peysakhovich and Kristin Hendrix compared the new clickbait-detection system to an email spam filter: it flags phrases commonly used in clickbait articles.

But wait, you’re saying, correctly: Doesn’t “clickbait” on the internet just mean “anything a particular reader hates”? Luckily, Peysakhovich and Hendrix lay out two main criteria for clickbait:

(1) if the headline withholds information required to understand what the content of the article is; and (2) if the headline exaggerates the article to create misleading expectations for the reader.
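Facebook hasn’t published how its detector actually works, but the spam-filter comparison suggests something like a phrase-frequency classifier. Here is a minimal sketch of that idea in Python; the training headlines, the bigram features, and the scoring function are all illustrative assumptions, not Facebook’s data or model.

```python
# A toy version of the spam-filter approach described above, NOT Facebook's
# actual system: score a headline by how much more often its phrases appear
# in known-clickbait examples than in ordinary news headlines.
import math
import re
from collections import Counter

def bigrams(headline):
    """Split a headline into lowercase two-word phrases."""
    words = re.findall(r"[a-z']+", headline.lower())
    return [" ".join(pair) for pair in zip(words, words[1:])]

# Hypothetical labeled headlines, standing in for Facebook's training set.
CLICKBAIT = [
    "you won't believe what happened next",
    "what happened next will shock you",
    "this one weird trick that doctors hate",
]
NORMAL = [
    "senate passes budget bill after long debate",
    "local school board approves new funding plan",
]

clickbait_counts = Counter(p for h in CLICKBAIT for p in bigrams(h))
normal_counts = Counter(p for h in NORMAL for p in bigrams(h))

def clickbait_score(headline):
    """Sum of per-phrase log-likelihood ratios, with crude add-one smoothing
    so phrases unseen in either corpus don't zero out the score."""
    total_click = sum(clickbait_counts.values())
    total_normal = sum(normal_counts.values())
    score = 0.0
    for phrase in bigrams(headline):
        p_click = (clickbait_counts[phrase] + 1) / (total_click + 1)
        p_normal = (normal_counts[phrase] + 1) / (total_normal + 1)
        score += math.log(p_click / p_normal)
    return score

print(clickbait_score("you won't believe this weird trick"))   # positive: clickbait-y
print(clickbait_score("school board approves new funding"))    # negative: news-like
```

A headline built from phrases common in the clickbait column scores high, while a plain news headline scores low, which is roughly how a spam filter separates junk from mail.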

Bad, vague, and withholding headlines came about because sites like Upworthy were able to harness what is known as the “curiosity gap” — the lack of information that makes someone want to click through. Readers, by and large, rarely get past headlines. In asking publishers to be more specific and direct in their headlines, Facebook is also asking them to eliminate the incentives that might cause a user to click through to the full article. In other words, the incentives that might take readers off Facebook and onto other sites.

But while this might be bad for the publishing industry, it’s not necessarily bad for Facebook users. Let’s state it plainly: This is Facebook making an editorial decision on behalf of its users. This makes Facebook not just a middleman between reader and article, but a publisher in itself, issuing edicts on how to present content.

It’s not the first time that Facebook has established a clear role as publisher with editorial responsibilities. Its Trending Topics feature, despite its smooth algorithmic look and tone, is curated and written by a team of editors and writers — an operation that came under fire after a series of Gizmodo articles from earlier this year was seized upon by conservatives as evidence of bias.

So how can Facebook do a better job of acting as a publisher? Here are some guidelines that the site might want to consider.

Own it: Admit that Facebook is making conscious editorial decisions.

When Facebook came under fire in the spring for “excluding” “conservative” “news” topics (Benghazi) and websites (Breitbart) from its Trending Topics, the company made the mistake of doubling down on the idea that it was “neutral.” Rather than defend decisions that were largely correct — “Benghazi” is not news; Breitbart is a terrible news source — and acknowledge that Facebook is built by people, who are not neutral, the company fell back on a blinkered sense of its own neutrality. It needs to stop pretending that it is an automated system in which the machines and algorithms wield absolute power.

Be transparent about how it works and what gets removed.

The modern era of curiosity-gap clickbait is a direct result of publishers trying to take advantage of the mysterious and powerful Facebook News Feed algorithm. Now realizing that it has created a ghastly content Hydra, Facebook is trying to improve its usefulness as a content provider. If it wants to set a minimum standard for quality, it needs to be explicit about what does and does not pass muster.

Hire a full-time staff to work on this issue.

If Facebook is really committed to making its News Feed and Trending Topics good — instead of just “whatever people click on most,” which is garbage — it should staff a full-time team to work on editorial issues. The team responsible for Facebook’s Trending Topics section was made up of contract workers housed in a basement — a pretty solid indication that they weren’t a priority, and a good way to make sure you’re not hiring the most experienced and judicious editors.

Get ready to admit when you screw up.

This is a big one for Facebook. When you consciously make decisions about what users can and can’t see, you’re bound to mess up: you’ll take things down when you shouldn’t have, and let posts propagate even when they’re garbage scams. If you are going to steer users in a certain direction, you have to be willing to publicly acknowledge when you’ve effed up.

Get a public editor.

Good columns on Inverse and Motherboard have addressed this idea, which applies not just to brands and websites on Facebook, but to individual users as well. Facebook needs to be accountable when it removes content at the request of law enforcement, when safety checks get activated during some crises and not others, or when certain articles are classified as “clickbait.” It needs to be more proactive in explaining why it takes the actions it does, when it does. It can’t continue to pretend that nobody notices these things.

Know that you can’t please everyone.

The Sisyphean task that Facebook has created for itself is to design an online ecosystem that satisfies 1.6 billion people from all around the globe. Good luck with that.
