The Internet Isn’t Making Us Dumber — It’s Making Us More ‘Meta-Ignorant’


At five-foot-six and 270 pounds, the bank robber was impossible to miss. On April 19, 1995, he hit two Pittsburgh banks in broad daylight. Security cameras picked up good images of his face — he wore no mask — and showed him holding a gun to the teller. Police made sure the footage was broadcast on the local eleven o’clock news. A tip came in within minutes, and just after midnight, the police were knocking on the suspect’s door in McKeesport. Identified as McArthur Wheeler, he was incredulous. “But I wore the juice,” he said.

Wheeler told police he rubbed lemon juice on his face to make it invisible to security cameras. Detectives concluded he was not delusional, not on drugs — just incredibly mistaken.

Wheeler knew that lemon juice is used as an invisible ink. Logically, then, lemon juice would make his face invisible to cameras. He tested this out before the heists, putting juice on his face and snapping a selfie with a Polaroid camera. There was no face in the photo! (Police never figured that out. Most likely Wheeler was no more competent as a photographer than he was as a bank robber.) Wheeler reported one problem with his scheme: The lemon juice stung his eyes so badly that he could barely see.

Wheeler went to jail and into the annals of the world’s dumbest criminals. It was one such feature, in the 1996 World Almanac, that brought Wheeler’s story to the attention of David Dunning, a Cornell psychology professor. He saw in this tale of dim-witted woe something universal: those most lacking in knowledge and skills are least able to appreciate that lack. This observation would eventually become known as the Dunning-Kruger effect.

Dunning and a graduate student, Justin Kruger, embarked on a series of experiments testing this premise. They quizzed undergraduate psychology students on grammar, logic, and jokes, then asked the students to estimate their scores and also how well they had done relative to others (on a percentile basis). The students who scored lowest had greatly exaggerated notions of how well they did. Dunning had expected that, but not the magnitude of the effect. His first reaction to the results was “Wow.” Those who scored near the bottom estimated that their skills were superior to those of two-thirds of the other students.

Later research went far beyond the university. For one experiment, Dunning and Kruger recruited gun hobbyists at a trapshooting and skeet-shooting competition. Volunteers took a ten-question gun safety and knowledge quiz adapted from one published by the National Rifle Association. Again, the gun owners who knew the least about firearm safety wildly overestimated their knowledge.

Like most rules, this one has exceptions. “One need not look far,” Dunning and Kruger wrote, “to find individuals with an impressive understanding of the strategies and techniques of basketball, for instance, yet who could not ‘dunk’ to save their lives. (These people are called coaches.)” But of course coaches understand their own physical limitations. Similarly, “most people have no trouble identifying their inability to translate Slovenian proverbs, reconstruct a V-8 engine, or diagnose acute disseminated encephalomyelitis.”

The Dunning-Kruger effect requires a minimal degree of knowledge and experience in the area about which you are ignorant (and ignorant of your ignorance). Drivers, as a group, are subject to the effect — bad drivers usually think they’re good drivers — but those who have never learned how to drive are exempt.

Since Dunning and Kruger first published their results in the 1999 paper “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments,” the effect named for them has become a meme. It strikes a universal chord: As Dunning put it, the overconfident airhead “is someone we’ve all met.” Actor John Cleese concisely explains the Dunning-Kruger effect in a much-shared YouTube video: “If you’re very, very stupid, how can you possibly realize that you’re very, very stupid? You’d have to be relatively intelligent to realize how stupid you are … And this explains not just Hollywood but almost the entirety of Fox News.” But the 1999 paper makes clear the authors’ opinion that the first place to look for a Dunning-Kruger ignoramus is in the mirror.

***

There is now an active field of research into how the internet is changing what we learn and remember. In a 2011 experiment helmed by Daniel Wegner of Harvard, volunteers were presented with a list of 40 trivia facts — short, pithy statements such as “An ostrich’s eye is bigger than its brain.” Each person was instructed to type all 40 statements into a computer. Half the volunteers were told to remember the facts; the other half weren’t. Independently, half were informed that their work would be saved on the computer; the other half were told that it would be erased immediately after the task’s completion.

The volunteers were later given a quiz on the facts they’d typed. Those instructed to remember the information scored no better than those who hadn’t been told to do so. But those who believed that their work would be erased scored much better than those who believed it would be saved. This was true whether they were trying to remember the facts or not.

The conscious mind exercises little choice in remembering and forgetting. Nobody decides to forget a client’s name or to remember forever the lyrics of a detested pop tune. It just happens.

The Harvard experiment’s results are consistent with a pragmatic system of memory. It is impossible to remember everything. The brain must constantly be doing triage on memories, without conscious intervention. And apparently it recognizes that there is less need to stock our minds with information that can be readily retrieved. (It may be a very long time before you need to know how big an ostrich’s eyeball is.) So facts are more often forgotten when people believe the facts will be archived. This phenomenon has earned a name — the Google effect — describing the automatic forgetting of information that can be found online.

Taken to the point of absurdity, the Google effect would mean that selfies cause amnesia. Yet a 2013 study conducted by Linda Henkel of Fairfield University pointed in that direction. Henkel noticed that visitors to art museums are obsessed with taking cell-phone shots of artworks and often are less interested in looking at the art itself. So she performed an experiment at the university’s Bellarmine Museum of Art. Undergraduates took a docent tour in which they were directed to view specific artworks. Some were instructed to photograph the art, and others were simply told to take note of it. The next day both groups were quizzed on their knowledge of the artworks. The visitors who had snapped pictures were less able to identify the works and to recall visual details.

Our unconscious curators of memory must be aware of how quickly and easily any needed fact can be called up. This implies that our broadband networks have created a new regime of learning and memory, one in which facts are less likely to be retained and are more quickly forgotten. In a few years, we’ll probably all be wearing devices that shoot a 24-7 video stream of our lives. Will social media make amnesiacs of us all?

Uploaded keystrokes are just one of many ways we have of storing information outside our brains. Long before our virtual social networks, we shared memory, knowledge, and expertise among our real social networks. I’m not a foodie, but I have friends who can recommend interesting new restaurants. I don’t know any specialists, but I have a general practitioner who can refer me to one. We get by in the world not by knowing everything but by knowing people.

Distributed memory can counteract misinformation — to a degree, anyway. Surveys have shown that most people think antibiotics will fight viruses. Wrong. But as Dan M. Kahan of Yale points out, it hardly matters. “Most people” are not going to self-prescribe azithromycin. The important thing is to know that it’s a good idea to go to a doctor when we’re sick and to follow that doctor’s instructions.

The Google effect is another adaptation to distributed memory. The cloud is a friend who happens to know everything. It’s always available, provides the answer in seconds, and never gets upset with dumb questions. It’s little wonder we depend on it to the point of absurdity. Economist Seth Stephens-Davidowitz noted that the third-most-common Google search containing the phrase “my penis” is “How big is my penis?” You’d think a ruler would have a better answer.

***

Most millennials — more than 50 percent — can’t name anyone who shot a U.S. president or discovered a planet; they don’t know the ancient city celebrated for its hanging gardens, the one destroyed by Mount Vesuvius, or the emperor said to have fiddled while Rome burned; nor can they name the single word uttered by the raven in Edgar Allan Poe’s poem.

The conventional reaction to such reports is a blend of shock and amusement. It’s terrible how little young people/ordinary citizens know — right? It’s worth asking how we know it’s so terrible and whether it’s terrible at all.

Ignorance can be rational. Economist Anthony Downs made that claim in the 1950s. He meant that there are many situations in which the effort needed to acquire knowledge outweighs the advantage of having it. Maybe you got your diploma and a high-paying job without ever learning about that poem with the raven in it. Why learn it now?

The contemporary world regards knowledge with ambivalence. We admire learning and retain the view that it is a desirable end in itself. But our more entitled side sees learning as a means to an end — to social advancement, wealth, power, something. We are suspicious of education that lacks an ulterior motive; we click on listicles entitled “8 College Degrees With the Worst Return on Investment.”

Ours is the golden age of rational — and rationalized — ignorance. Information is being produced, devalued, and made obsolete at a prodigious rate. Every day the culture changes beneath our feet. It is harder than ever to keep up or even to be sure that keeping up matters anymore. We are left speculating about how important it is to stay current on the Middle East, contemporary novels, local politics, wearable technology, and college basketball. A friend recently wondered aloud whether it was okay to not know anything about Game of Thrones. The observation that you can look up any needed bit of information dodges the issue. You can’t Google a point of view.

The poorly informed don’t necessarily know less. They just know different things. A gamer who spends all his free time playing video games will have an encyclopedic understanding of those games. He is ill-informed only by arbitrary standards of what’s important. Not everyone agrees that there is a fixed set of facts that all should know. But absent such a set, the concept of being well-informed becomes a hopelessly relative one.

Today’s mediascape does not provide much guidance. It encourages us to create personal, solipsistic filters over information, making it unprecedentedly easy to gorge on news of favorite celebrities, TV shows, teams, political ideologies, and tech toys. This leaves less time and attention for everything else. The great risk isn’t that the internet is making us less informed or even misinformed. It’s that it may be making us meta-ignorant — less cognizant of what we don’t know.

Adapted from Head in the Cloud: Why Knowing Things Still Matters When Facts Are So Easy to Look Up by William Poundstone. Copyright © 2016 by William Poundstone.
