
Does Facebook’s Emotional Manipulation Make It a Bad Source of News?


It’s no secret that Facebook is the 800-pound gorilla of the modern media business. It is by far the biggest source of traffic to digital publishers, and its decisions about which stories to show users in their news feeds can make or break entire news organizations; many publishers now explicitly optimize their content for maximum Facebook impact. Facebook’s algorithms determine not only how news stories are distributed, but which stories get written in the first place.

Which is why this week’s Facebook scandal — that the social network spent part of 2012 studying “emotional contagion” by tinkering with certain users’ news feeds to include more posts with positive or negative sentiment — should cause some soul-searching among both the people who create news and the people who get their news through Facebook.

Facebook’s mood study had some scientific flaws. But its conclusion — that users who are shown positive stories more frequently tend to post more positive updates themselves, and users who are shown more negative stories tend to post negative updates — seems intuitively correct. On social media, as in real life, we take our cues from the people around us and seek to fit in. Amid a sea of smiling-baby photos and feel-good Jimmy Fallon clips, nobody wants to be the person talking about mass graves.

The logic behind Facebook’s news feed is famously inscrutable. The company changes its algorithms constantly, and what you see in your feed depends on thousands of variables (as many as 100,000, according to one Facebook engineer). And since Facebook isn’t under any obligation to disclose its ranking methods, we’ll never know if it considers the mood of news stories when choosing which ones to display. But it seems reasonable to assume that at some point during its social tinkering, Facebook discovered that promoting certain types of news stories on the site can stoke emotions that make for happy, engaged users.

You can see how a positive-news skew would be problematic. News isn’t always happy. Wars happen. Institutions misbehave. People are victimized. The world is full of sad-but-important news, and at traditional news organizations, these stories receive emphasis based on some subjective judgment of their public value.

But Facebook uses different criteria. Or seems to, anyway. The stories that get shared the most on Facebook from this site, for example, are often the ones that tap into strong emotional currents. These stories aren’t always happy. (This story about a cancer doctor losing his wife to cancer, for example, generated a ton of Facebook referrals.) But they almost always give the reader something to celebrate, mourn, laugh at, or rail against, and excessive nuance is rare. This appeal to simple, vivid emotions is obviously not an accident: as John Herrman says, Facebook’s algorithms are designed to promote “content that causes people to use Facebook more vigorously.”

Social media in general has a positivity bias. It’s why the Huffington Post’s “Good News” vertical is one of the site’s most popular, and Upworthy’s saccharine-sweet videos are everywhere (or were, until Facebook changed its algorithm). It’s why news of a friend getting engaged will rocket to the top of your Facebook feed while a recently divorced friend’s status change from “married” to “single” will stay hidden. We post stories that make us look good, and we reward social networks that make us feel good by giving them our attention.

But Facebook isn’t like other social networks. It’s the biggest news wire in the history of the world, with more than a billion people who use it as one of their primary sources of information. What casual users might not understand is that Facebook isn’t a neutral platform that displays a representative sampling of whatever your friends happen to share. It’s the product of a company that wants to maximize the amount of time its users spend looking at its content. Facebook has made overtures to journalists in recent months, but maximizing the reach and visibility of news isn’t its core mission; like a resort or an amusement park, its job is simply to ensure that visitors have a good time.

The mood study at the heart of this week’s backlash included only about 700,000 users, a small fraction of the site’s 1.2 billion active users. But every Facebook user is part of a larger, ongoing social experiment. Every time we look at our Facebook feeds, we’re seeing a highly engineered product that filters information in a very specific way. Like any news organization, Facebook has an ideological tilt, but rather than skewing liberal or conservative, its bias runs toward the poles of the emotional spectrum.

Make no mistake: Facebook isn’t doing anything wrong by filtering its feeds in this way. It’s allowed to promote and demote whatever content it wants. But in the same way we’d read a newspaper in China with the expectation that certain types of stories are being censored, we should keep in mind that Facebook is modifying our experience to keep us engaged and clicking. The only way to combat the power of Facebook’s emotional filter is to remember that it’s there in the first place.
