I. The Poverty of Attention
I’m going to pause here, right at the beginning of my riveting article about attention, and ask you to please get all of your precious 21st-century distractions out of your system now. Check the score of the Mets game; text your sister that pun you just thought of about her roommate’s new pet lizard (“iguana hold yr hand LOL get it like Beatles”); refresh your work e-mail, your home e-mail, your school e-mail; upload pictures of yourself reading this paragraph to your “me reading magazine articles” Flickr photostream; and alert the fellow citizens of whatever Twittertopia you happen to frequent that you will be suspending your digital presence for the next twenty minutes or so (I know that seems drastic: Tell them you’re having an appendectomy or something and are about to lose consciousness). Good. Now: Count your breaths. Close your eyes. Do whatever it takes to get all of your neurons lined up in one direction. Above all, resist the urge to fixate on the picture, right over there, of that weird scrambled guy typing. Do not speculate on his ethnicity (German-Venezuelan?) or his backstory (Witness Protection Program?) or the size of his monitor. Go ahead and cover him with your hand if you need to. There. Doesn’t that feel better? Now it’s just you and me, tucked like fourteenth-century Zen masters into this sweet little nook of pure mental focus. (Seriously, stop looking at him. I’m over here.)
Over the last several years, the problem of attention has migrated right into the center of our cultural attention. We hunt it in neurology labs, lament its decline on op-ed pages, fetishize it in grassroots quality-of-life movements, diagnose its absence in more and more of our children every year, cultivate it in yoga class twice a week, harness it as the engine of self-help empires, and pump it up to superhuman levels with drugs originally intended to treat Alzheimer’s and narcolepsy. Everyone still pays some form of attention all the time, of course—it’s basically impossible for humans not to—but the currency in which we pay it, and the goods we get in exchange, have changed dramatically.
Back in 1971, when the web was still twenty years off and the smallest computers were the size of delivery vans, before the founders of Google had even managed to get themselves born, the polymath economist Herbert A. Simon wrote maybe the most concise possible description of our modern struggle: “What information consumes is rather obvious: It consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.” As beneficiaries of the greatest information boom in the history of the world, we are suffering, by Simon’s logic, a correspondingly serious poverty of attention.
If the pundits clogging my RSS reader can be trusted (the ones I check up on occasionally when I don’t have any new e-mail), our attention crisis is already chewing its hyperactive way through the very foundations of Western civilization. Google is making us stupid, multitasking is draining our souls, and the “dumbest generation” is leading us into a “dark age” of bookless “power browsing.” Adopting the Internet as the hub of our work, play, and commerce has been the intellectual equivalent of adopting corn syrup as the center of our national diet, and we’ve all become mentally obese. Formerly well-rounded adults are forced to MacGyver worldviews out of telegraphic blog posts, bits of YouTube videos, and the first nine words of Times editorials. Schoolkids spread their attention across 30 different programs at once and interact with each other mainly as sweatless avatars. (One recent study found that American teenagers spend an average of 6.5 hours a day focused on the electronic world, which strikes me as a little low; in South Korea, the most wired nation on earth, young adults have actually died from exhaustion after multiday online-gaming marathons.) We are, in short, terminally distracted. And distracted, the alarmists will remind you, was once a synonym for insane. (Shakespeare: “poverty hath distracted her.”)
This doomsaying strikes me as silly for two reasons. First, conservative social critics have been blowing the apocalyptic bugle at every large-scale tech-driven social change since Socrates’ famous complaint about the memory-destroying properties of that newfangled technology called “writing.” (A complaint we remember, not incidentally, because it was written down.) Second, and more practically, the virtual horse has already left the digital barn. It’s too late to just retreat to a quieter time. Our jobs depend on connectivity. Our pleasure-cycles—no trivial matter—are increasingly tied to it. Information rains down faster and thicker every day, and there are plenty of non-moronic reasons for it to do so. The question, now, is how successfully we can adapt.