
Your Customized News Feed Could Be Making You Dumber

At this point, you’re probably aware that a lot of the information you consume on the internet is filtered in some way or another. Google and Facebook, for example, have algorithms that attempt to figure out what sort of stuff you like (and are therefore most likely to click on), which affects which sorts of search results and news-feed items you see.
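
To make that concrete, here’s a toy version of what engagement-driven filtering looks like in principle: rank items by a per-topic affinity score inferred from your past clicks. Everything below (the Item type, the rank_feed function, the affinity numbers) is a hypothetical sketch, not Google’s or Facebook’s actual ranking logic.

```python
# A minimal sketch of engagement-based filtering, assuming a toy model in
# which each user has per-topic affinity scores learned from past clicks.
# Illustrative only -- not how any real feed-ranking system works.

from dataclasses import dataclass

@dataclass
class Item:
    title: str
    topic: str

def rank_feed(items: list[Item], affinities: dict[str, float]) -> list[Item]:
    """Order items by the user's estimated affinity for their topic,
    so the content the user is likeliest to click on surfaces first."""
    return sorted(items, key=lambda it: affinities.get(it.topic, 0.0), reverse=True)

feed = rank_feed(
    [Item("Tax bill advances", "politics"), Item("New phone reviewed", "tech")],
    affinities={"politics": 0.9, "tech": 0.4},  # hypothetically inferred from prior clicks
)
print([it.title for it in feed])  # politics story first, because it scores higher
```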

This has worried people for a long time. If we’re only served up stuff we like, could it mean that the internet is increasingly reinforcing our ideological and other biases, rather than challenging them? If so, it would mean one of the big hopes people had for the internet — that it would operate as a marketplace in which the “best” ideas somehow won out — was being squandered, and that people were instead being effectively herded into homogeneous ideological neighborhoods. (Eli Pariser’s 2011 book The Filter Bubble sparked a big conversation about these fears and attached a clever name to them.)

Over the years, researchers have amassed a fair amount of evidence that people prefer to read material that lines up with their worldviews, and often don’t do a great job of seeking out information that could challenge, rather than confirm, their preexisting theories. But as a team led by Ivan Dylko writes in a new paper in Computers in Human Behavior, there’s actually a dearth of controlled experimental evidence showing what these sorts of filters do and comparing internet users’ behavior with and without them. So Dylko and his colleagues ran such an experiment.

For the experiment, a group of communications and psychology students was asked to provide feedback on a new political website the authors were (supposedly) developing. The students answered some questions about their political ideology and were then placed into one of three groups: a control group, in which the articles they saw were essentially randomized by topic; a user-customizable group, in which “subjects could select the political ideology of the sources for each issue”; and a system-driven group, in which an algorithm invisibly filtered the articles for them.
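
If you want the three conditions spelled out, here’s a hedged sketch in Python. The article fields, the ideology-matching rule, and all the function names are illustrative assumptions, not the paper’s actual implementation.

```python
# A toy sketch of the study's three conditions, under assumed data and rules.

import random

ARTICLES = [
    {"title": "Op-ed A", "topic": "immigration", "ideology": "liberal"},
    {"title": "Op-ed B", "topic": "immigration", "ideology": "conservative"},
    {"title": "Op-ed C", "topic": "taxes", "ideology": "liberal"},
    {"title": "Op-ed D", "topic": "taxes", "ideology": "conservative"},
]

def control_feed(articles):
    # Control: article order is essentially randomized, with no ideological filter.
    return random.sample(articles, len(articles))

def user_customized_feed(articles, choices):
    # Customizable: the subject picks a source ideology for each issue,
    # e.g. choices = {"immigration": "liberal", "taxes": "conservative"}.
    return [a for a in articles if choices.get(a["topic"]) == a["ideology"]]

def algorithmic_feed(articles, user_ideology):
    # System-driven: an invisible filter keeps only ideology-matched articles
    # (the matching rule here is an assumption for illustration).
    return [a for a in articles if a["ideology"] == user_ideology]

print(algorithmic_feed(ARTICLES, "liberal"))
```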

Then the researchers simply recorded the users’ behavior over the five minutes they were allotted on the website (that amount of time was chosen, they write, because other research has shown it’s about how long people spend on political websites in a given session). Here’s what they found (you can ignore the “aggregated customizability group”):

[Chart from Dylko et al.: clicks on, and time spent reading, pro- and counter-attitudinal articles across the control, user-customizable, and system-driven conditions.]

As you can see, filtering of either sort led people to click on, and spend more time reading, “pro-attitudinal” articles (that is, articles most likely to reflect their own opinions right back at them). In a way, the bottom-right graph is the most interesting. It shows that people in the control group spent more than half their time on the site reading articles that challenged their beliefs; that number plummeted in the other conditions.
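
To see what a measure like that looks like in practice, here’s a small sketch of computing the share of reading time spent on counter-attitudinal articles from a session log. The event format is a hypothetical stand-in, not the study’s actual instrumentation.

```python
# A toy computation of the "share of time on counter-attitudinal articles"
# measure, assuming a hypothetical session log of (article_ideology, seconds).

def counter_attitudinal_share(events, user_ideology):
    """events: (article_ideology, seconds_spent) tuples from one session.
    Returns the fraction of reading time spent on articles whose ideology
    conflicts with the user's own."""
    total = sum(sec for _, sec in events)
    counter = sum(sec for ideo, sec in events if ideo != user_ideology)
    return counter / total if total else 0.0

session = [("conservative", 120), ("liberal", 90), ("conservative", 60)]
print(counter_attitudinal_share(session, "liberal"))  # 180/270 = 0.667 in this toy log
```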

Now, this is a single experiment based on just five minutes of web browsing, but these are still striking numbers. And in the case of the most plugged-in news consumers, it’s worrisome to imagine what happens when their propensity for pro-attitudinal content is constantly reinforced by Facebook, Google, Twitter, and networks of ideologically like-minded friends. The way we’re siloing ourselves is not good.
