Political Extremists Are Resistant to One Kind of Bias


We often think of political extremists as deeply biased people, and for good reason: They’re stuck in their views and no amount of evidence is going to sway them. A new study in Psychological Science, though, offers an interesting example of how their certitude might protect them from one particular kind of bias.

For the study, a team led by Mark J. Brandt of Tilburg University in the Netherlands asked a group of people to participate in a so-called “anchoring task.” This is a task in which researchers ask you to estimate a certain value based on a piece of information they give you. For instance, in this study, as the press release explains, some participants were asked, “The distance between New York and San Francisco is greater than 2,000 miles. How far is it?”

That 2,000-mile figure is known as an anchor — a piece of (usually numerical) information that tends to affect how people make an estimate. In the past, these sorts of questions have uncovered a persistent “anchoring bias” in people: We tend to make guesses close to whatever anchor we’re given. “People who start with large anchor numbers end up with overly high estimates,” the release explains, “and vice versa for people who start with small anchor numbers.” Anchoring bias can often cause people to make faulty judgments, since anchors don’t always provide meaningful or accurate information (when you’re told that the sweater at the mall is “usually $70” but on sale for a “limited time only,” that’s an anchor).

In this study, though, the researchers found that people with politically extreme views, and people who had great confidence in the superiority of their political views — there was a lot of overlap between the two groups — were less likely to fall victim to anchoring bias. That is, they made estimates further from the anchors the researchers provided than did people with more politically moderate views. In this one instance, they were able to filter out extraneous information — their overconfidence actually helped them.

In the real world, of course, people aren’t always good at determining which pieces of evidence do and don’t matter, and political extremists are well-known for ignoring evidence that would disconfirm their worldview (ever talked to a 9/11 Truther?). Still, this study is a useful reminder that bias is a complicated, multifaceted thing.