Why You Truly Never Leave High School

Sarah and Jim, 1988 & 2011.
For the past three years, Argentine photographer Irina Werning has been staging reenactments of old snapshots. The project, “Back to the Future,” includes 270 photographs made in 29 countries.
Photo: Irina Werning

Throughout high school, my friend Kenji had never once spoken to the Glassmans. They were a popular, football-playing, preposterously handsome set of identical twins (every high school must have its Winklevii). Kenji was a closeted, half-Japanese orchestra nerd who kept mainly to himself and graduated first in our class. Yet last fall, as our 25th high-school reunion was winding down, Kenji grabbed Josh Glassman by his triceps—still Popeye spinach cans, and the subject of much Facebook discussion afterward—and asked where the after-party was. He was only half-joking.

Psychologically speaking, Kenji carries a passport to pretty much anywhere now. He’s handsome, charming, a software engineer at an Amazon subsidiary; he radiates the kind of self-possession that earns instant respect. Josh seemed to intuit this. He said there was an after-party a few blocks away, at the home of another former football player. And when Kenji wavered, Josh wouldn’t take no for an answer. “I could see there was no going back,” Kenji explained the next morning, over brunch. “It was sort of like the dog who catches the car and doesn’t know what to do with it.”

The party was fine. For a while, Kenji wondered if he’d been brought along as a stunt guest—a suspicion hardly allayed by Josh’s announcement “I brought the valedictorian!” as they were descending the stairs to their host’s living room—though Kenji’s attendance was in the same spirit, really, just in reverse. (“This is the party I never got invited to in high school,” he told Josh at one point, and Josh didn’t disagree.) At any rate, Kenji didn’t care. His curiosities were anthropological: He had no idea what it was like “to be a football player or a cheerleader, get out of high school, marry someone from your local area, and settle in the same area.” And his conclusion, by the end of the night, was: Nothing special. “It was just an ordinary party, one that might have been a little uncomfortable if we all hadn’t been a little drunk.”

You’d think Kenji’s underwhelmed reaction would have been reassuring. But another classmate of ours, also at that brunch, didn’t take it that way. Like Kenji, Larry was brilliant, musically gifted, and hidden behind awkward glasses during most of his adolescence; like Kenji, he too is attractive and successful today. He received a Tony nomination for the score of Legally Blonde, he has a new baby, he married a great woman who just happens to be his collaborator. Yet his reaction was visceral and instantaneous. “Literally?” he said. “Your saying this makes me feel I wish I’d been invited to that.”

“Well, right,” said Kenji. “Because that’s the way high school is.”

“And maybe the way life is, still, sometimes,” said Larry. “About wanting to be invited to things.” He’s now working on a musical adaptation of Heathers, the eighties classic that culminates, famously, in Christian Slater nearly blowing up a high school.

Not everyone feels the sustained, melancholic presence of a high-school shadow self. There are some people who simply put in their four years, graduate, and that’s that. But for most of us adults, the adolescent years occupy a privileged place in our memories, which to some degree is even quantifiable: Give a grown adult a series of random prompts and cues, and odds are he or she will recall a disproportionate number of memories from adolescence. This phenomenon even has a name—the “reminiscence bump”—and it’s been found over and over in large population samples, with most studies suggesting that memories from the ages of 15 to 25 are most vividly retained. (Which perhaps explains Ralph Keyes’s observation in his 1976 classic, Is There Life After High School?: “Somehow those three or four years can in retrospect feel like 30.”)

To most human beings, the significance of the adolescent years is pretty intuitive. Writers from Shakespeare to Salinger have done their most iconic work about them; and Hollywood, certainly, has long understood the operatic potential of proms, first dates, and the malfeasance of the cafeteria goon squad. “I feel like most of the stuff I draw on, even today, is based on stuff that happened back then,” says Paul Feig, the creator of Freaks and Geeks, which had about ten glorious minutes on NBC’s 1999–2000 lineup before the network canceled it. “Inside, I still feel like I’m 15 to 18 years old, and I feel like I still cope with losing control of the world around me in the same ways.” (By being funny, mainly.)

Yet there’s one class of professionals who seem, rather oddly, to have underrated the significance of those years, and it just happens to be the group that studies how we change over the course of our lives: developmental neuroscientists and psychologists. “I cannot emphasize enough the amount of skewing there is,” says Pat Levitt, the scientific director for the National Scientific Council on the Developing Child, “in terms of the number of studies that focus on the early years as opposed to adolescence. For years, we had almost a religious belief that all systems developed in the same way, which meant that what happened from zero to 3 really mattered, but whatever happened thereafter was merely tweaking.”

Pancho, 1983 & 2010. Photo: Irina Werning

Zero to 3. For ages, this window dominated the field, and it still does today, in part for reasons of convenience: Birth is the easiest time to capture a large population to study, and, as Levitt points out, “it’s easier to understand something as it’s being put together”—meaning the brain—“than something that’s complex but already formed.” There are good scientific reasons to focus on this time period, too: The sensory systems, like hearing and eyesight, develop very early on. “But the error we made,” says Levitt, “was to say, ‘Oh, that’s how all functions develop, even those that are very complex. Executive function, emotional regulation—all of it must develop in the same way.’ ” That is not turning out to be the case. “If you’re interested in making sure kids learn a lot in school, yes, intervening in early childhood is the time to do it,” says Laurence Steinberg, a developmental psychologist at Temple University and perhaps the country’s foremost researcher on adolescence. “But if you’re interested in how people become who they are, so much is going on in the adolescent years.”

In the past couple of decades, studies across the social sciences have been designed around this new orientation. It has long been known, for instance, that male earning potential correlates rather bluntly with height. But it was only in 2004 that a trio of economists thought to burrow a little deeper and discovered, based on a sample of thousands of white men in the U.S. and Britain, that it wasn’t adult height that seemed to affect their subjects’ wages; it was their height at 16. (In other words, two white men measuring five-foot-eleven can have very different earning potential in the same profession, all other demographic markers being equal, just because one of them was shorter at 16.) Eight years later, Deborah Carr, a sociologist at Rutgers, observed something similar about adults of a normal weight: They are far more likely to have higher self-esteem if they were a normal weight, rather than overweight or obese, in late adolescence (Carr was using sample data that tracked weight at age 21, but she notes that heavy 21-year-olds were also likely to be heavy in high school). Robert Crosnoe, a University of Texas sociologist, will be publishing a monograph with a colleague this year that shows attractiveness in high school has lingering effects, too, even fifteen years later. “It predicted a greater likelihood of marrying,” says Crosnoe, “better earning potential, better mental health.” This finding reminds me of something a friend was told years ago by Frances Lear, head of the eponymous, now defunct magazine for women: “The difference between you and me is that I knew in high school I was beautiful.”

Our self-image from those years, in other words, is especially adhesive. So, too, are our preferences. “There’s no reason why, at the age of 60, I should still be listening to the Allman Brothers,” Steinberg says. “Yet no matter how old you are, the music you listen to for the rest of your life is probably what you listened to when you were an adolescent.” Only extremely recent advances in neuroscience have begun to help explain why.

It turns out that just before adolescence, the prefrontal cortex—the part of the brain that governs our ability to reason, grasp abstractions, control impulses, and self-reflect—undergoes a huge flurry of activity, giving young adults the intellectual capacity to form an identity, to develop the notion of a self. Any cultural stimuli we are exposed to during puberty can, therefore, make more of an impression, because we’re now perceiving them discerningly and metacognitively as things to sweep into our self-concepts or reject (I am the kind of person who likes the Allman Brothers). “During times when your identity is in transition,” says Steinberg, “it’s possible you store memories better than you do in times of stability.”

At the same time, the prefrontal cortex has not yet finished developing in adolescents. It’s still adding myelin, the fatty white substance that speeds up and improves neural connections, and until those connections are consolidated—which most researchers now believe is sometime in our mid-twenties—the more primitive, emotional parts of the brain (known collectively as the limbic system) have a more significant influence. This explains why adolescents are such notoriously poor models of self-regulation, and why they’re so much more dramatic—“more Kirk than Spock,” in the words of B. J. Casey, a neuroscientist at Weill Medical College of Cornell University. In adolescence, the brain is also buzzing with more dopamine activity than at any other time in the human life cycle, so everything an adolescent does—everything an adolescent feels—is just a little bit more intense. “And you never get back to that intensity,” says Casey. (The British psychoanalyst Adam Phillips has a slightly different way of saying this: “Puberty,” he writes, “is everyone’s first experience of a sentient madness.”)

Patrick, 1996 & 2011. Photo: Irina Werning

Those feelings of intensity aren’t just associated with good experiences. Casey and two of her colleagues, Francis Lee and Siobhan Pattwell, were part of a team that published a startling paper last year showing that adolescents—both mice and humans—were far less capable of dialing back their fear response than children or adults. They showed this by designing two very simple experiments: In mice, they paired a neutral tone with a shock; in humans, they paired a neutral color with a horrible noise. Both populations learned to associate one with the other. The mice froze as soon as they heard the tone; the humans, when seeing the color, would sweat more. Over the next few days, the researchers again played the neutral tone for the mice and showed the neutral color to the humans, but this time without the horrible outcome (no shock, no loud noise). And over the course of those few days, both the adults and the children—whether mice or human—learned to dissociate the two.

But not the adolescents. Whether they were pubescent mice or high-school students, the adolescents remained as fear-stricken as ever. Their systems remained on high alert, as if a threat were just around the corner.

These studies could have sobering implications. If, as the researchers say, adolescents have an exaggerated sense of fear when faced with certain triggers, isn’t it possible they could carry that exaggerated panic into adulthood, because they never developed the tools at the time to beat it back? I phoned Pattwell and Lee to ask this question. The press release accompanying the study notes that an estimated 75 percent of people with fear-related disorders “can trace the roots of their anxiety to earlier ages.” Doesn’t this suggest that the fears of adolescence are harder to overcome?

“It’s funny you say that,” said Pattwell. “We actually checked in with the mice 30 days later, once they’d reached adulthood.”

And?

“Their level of fear was just as high,” she said. “It was as if the experiment had just been done.”

Now, people are not mice, and there are limits to what one can learn from a single experiment. But if humans really do feel things most intensely during adolescence, and if, at this same developmental moment, they also happen to be working out an identity for the first time—“sometimes morbidly, often curiously, preoccupied with what they appear to be in the eyes of others as compared with what they feel they are,” as the psychoanalyst Erik Erikson wrote—then it seems safe to say this: Most American high schools are almost sadistically unhealthy places to send adolescents.

Until the Great Depression, the majority of American adolescents didn’t even graduate from high school. Once kids hit their teen years, they did a variety of things: farmed, helped run the home, earned a regular wage. Before the banning of child labor, they worked in factories and textile mills and mines. All were different roads to adulthood; many were undesirable, if not outright Dickensian. But these disparate paths did arguably have one virtue in common: They placed adolescent children alongside adults. They were not sequestered as they matured. Now teens live in a biosphere of their own. In their recent book Escaping the Endless Adolescence, psychologists Joseph and Claudia Worrell Allen note that teenagers today spend just 16 hours per week interacting with adults and 60 with their cohort. One century ago, it was almost exactly the reverse.

Something happens when children spend so much time apart from adult company. They start to generate a culture with independent values and priorities. James Coleman, a renowned mid-century sociologist, was among the first to analyze that culture in his seminal 1961 work, The Adolescent Society, and he wasn’t very impressed. “Our society has within its midst a set of small teen-age societies,” he wrote, “which focus teen-age interests and attitudes on things far removed from adult responsibilities.” Yes, his words were prudish, but many parents have had some version of these misgivings ever since, especially those who’ve consciously opted not to send their kids into the Roman amphitheater. (From the website of the National Home Education Network: “Ironically, one of the reasons many of us have chosen to educate our own is precisely this very issue of socialization! Children spending time with individuals of all ages more closely resembles real life than does a same-age school setting.”)

In fact, one of the reasons that high schools may produce such peculiar value systems is precisely because the people there have little in common, except their ages. “These are people in a large box without any clear, predetermined way of sorting out status,” says Robert Faris, a sociologist at UC Davis who’s spent a lot of time studying high-school aggression. “There’s no natural connection between them.” Such a situation, in his view, is likely to reward aggression. Absent established hierarchies and power structures (apart from the privileges that naturally accrue from being an upperclassman), kids create them on their own, and what determines those hierarchies is often the crudest common-denominator stuff—looks, nice clothes, prowess in sports—rather than the subtleties of personality. “Remember,” says Crosnoe, who spent a year doing research in a 2,200-student high school in Austin, “high schools are big. There has to be some way of sorting people socially. It’d be nice if kids could be captured by all their characteristics. But that’s not realistic.”

Riff Raff, 1976 & 2011. Photo: Irina Werning

The result, unfortunately, is a paradox: Though adolescents may want nothing more than to be able to define themselves, they discover that high school is one of the hardest places to do it. Crosnoe mentions the 1963 classic Stigma: Notes on the Management of Spoiled Identity, in which the sociologist Erving Goffman devastatingly defines the term in his title as “a trait that can obtrude itself upon attention … breaking the claim that other attributes have on us.” For many people, that’s the high-school experience in a nutshell. At the time they experience the most social fear, they have the least control; at the time they’re most sensitive to the impressions of others, they’re plunked into an environment where it’s treacherously easy to be labeled and stuck on a shelf. “Shame,” says Brené Brown, a researcher at the University of Houston, “is all about unwanted identities and labels. And I would say that for 90 percent of the men and women I’ve interviewed, their unwanted identities and labels started during their tweens and teens.”

Out of all the researchers who think about high-school-related topics, Brené Brown may be the one whose work interests me most. Since 2000, she has studied shame in pointillist detail. She’s written both academic papers and general-interest books on the subject; her TED lecture on shame was one of the most popular of all time. Because that’s what high school—both at the time and as the stuff of living memory—is about, in its way: shame. And indeed, when Brown and I met for breakfast this fall, she told me that high school comes up all the time in her work. “When I asked one of the very first men I ever interviewed, ‘What does shame mean to you?’ ” she recalled, “he answered, ‘Being shoved up against the lockers.’ High school is the metaphor for shame.”

The academic interest in shame and other emotions of self-consciousness (guilt, embarrassment) is relatively recent. It’s part of a broader effort on the part of psychologists to think systematically about resilience—which emotions serve us well in the long run, which ones hobble and shrink us. Those who’ve spent a lot of time thinking about guilt, for example, have come to the surprising conclusion that it’s pretty useful and adaptive, because it tends to center on a specific event (I cannot believe I did that) and is therefore narrowly focused enough to be constructive (I will apologize, and I will not do that again).

Shame, on the other hand, is a much more global, crippling sensation. Those who feel it aren’t energized by it but isolated. They feel unworthy of acceptance and fellowship; they labor under the impression that their awfulness is something to hide. “And this incredibly painful feeling that you’re not lovable or worthy of belonging?” asks Brown. “You’re navigating that feeling every day in high school.”

Most of us, says Brown, opt for one of three strategies to cope with this pain. We move away from it, “by secret-keeping, by hiding”; we move toward it, “by people-pleasing”; or we move against it “by using shame and aggression to fight shame and aggression.” Whichever strategy we choose, she says, the odds are good we’ll use that strategy for life, and those feelings of shame will heave to the surface, unbidden and unannounced, in all sorts of unfortunate settings down the road.

Like among our future families, for instance. Brown says it’s remarkable how many parents of teenagers talk to her about reexperiencing the shame of high school once their own kids start to experience the same familiar scenarios of rejection. “The first time our kids don’t get a seat at the cool table, or they don’t get asked out, or they get stood up—that is such a shame trigger,” she says. “It’s like a secondary trauma.” So paralyzing, in fact, that she finds parents often can’t even react with compassion. “Most of us don’t say, ‘Hey, it’s okay. I’ve been there.’ We say, ‘I told you to pull your hair back and wear some of those cute clothes I bought you.’ ”

And it’s not just the bullied who carry the shame of those years. Rosalind Wiseman, author of Queen Bees and Wannabes (subsequently transformed into the movie Mean Girls), points to the now-legendary Washington Post story that ran last spring, which documented Mitt Romney’s escapades as a prep-school ogre: pinning down an outcast and cutting his hair; shouting “Atta girl” to a closeted boy when he tried to speak; leading a teacher with poor eyesight into a set of closed doors. Years later, one of the victims carried that pain with him still (“It’s something I have thought about a lot since then,” he said). But even more telling, she notes, was that Romney’s co-conspirators in thuggery felt so awful about their misdeeds as boys in 1965 that they talked about them openly, on the record, as grown men in 2012. “To this day, it troubles me,” Thomas Buford, a retired prosecutor, told the Post. He carried around that shame for almost half a century.

In the fall of 2011, Tavi Gevinson, the 16-year-old force behind the web magazine Rookie, solicited a wide variety of celebrities for advice about how to survive high school. Among the wisest essays came from Winnie Holzman, the creator of My So-Called Life. “In high school,” she wrote, “we become pretty convinced that we know what reality is: We know who looks down on us, who is above us, exactly who our friends and our enemies are.” The truth of the matter, wrote Holzman, is that we really have no clue. “[W]hat seems like unshakable reality,” she concluded, “is basically just a story we learned to tell ourselves.”

There happens to be a body of contemporary research that suggests Holzman is right. Adolescents often do take a highly distorted view of their social world. In 2007, for instance, Steinberg and two colleagues surveyed hundreds of adolescents in two midwestern communities, asking them to decide which category they most identified with: Jocks, Populars, Brains, Normals, Druggie/Toughs, Outcasts, or None. They also asked a subsample of those kids to make the same assessment of their peers. Then they compared results.

Some were predictable. The kids who were identified as Druggies, Normals, or Jocks, for example, tended to see themselves in the same way. What was surprising was the self-assessment of the kids others thought were popular. Just 27 percent in one study, and 37 percent in a second, similar study in the same paper, saw themselves as campus celebrities. Yes, a few declared themselves Jocks, perhaps just as prestigious. But more were inclined to view themselves either as normal or none of the above.

Faris’s research on aggression in high-school students may help account for this gap between reputation and self-perception. One of his findings is obvious: The more concerned kids are with popularity, the more aggressive they are. But another finding isn’t: Kids become more vulnerable to aggression as their popularity increases, unless they’re at the very top of the status heap. “It’s social combat,” he explains. “Think about it: There’s not much instrumental value to gossiping about a wallflower. There’s value to gossiping about your rivals.” The higher kids climb, in other words, the more precariously balanced they feel, unless they’re standing on the square head of the totem pole. It therefore stands to reason that many popular kids don’t see themselves as popular, or at least feel less powerful than they loom. Their perch is too fragile.

It’s also abundantly, poignantly clear that during puberty, kids have absolutely no clue how to assess character or read the behavior of others. In 2005, the sociologist Koji Ueno looked at one of the largest samples of adolescents in the United States, and found that only 37 percent of their friendships were reciprocal—meaning that when respondents were asked to name their closest friends, the results were mutual only 37 percent of the time. One could argue that this heartbreaking statistic is just further proof that high school is a time of unrequited longings. But these statistics also suggest that teenagers cannot tell when they are being rejected (Hey, guys, wait for me!) or even accepted (I thought you hated me). So much of what they think they know about others’ opinions of them is plain wrong.

Deborah Yurgelun-Todd, director of the Cognitive Neuroimaging Laboratory at the University of Utah, did a well-known pilot study at McLean Hospital a few years ago asking teenagers to look at a picture of a face and identify the emotion they saw. Every adult who looked at that picture—100 percent of them—saw fear in that face. Not the teenagers. Half of them saw anger or confusion, even sadness.

It was a really small study. I wouldn’t necessarily read too much into it. But its results sum up the entire high-school experience, in my view: mistaking people’s fear for something else.

Kurt Vonnegut wrote that high school “is closer to the core of the American experience than anything else I can think of.” And it is, certainly, in the sense that it’s the last shared cultural experience we have before choosing different paths in our lives. But for years, I’d never quite understood why high-school values are so different from adult ones. In fact, whenever I spoke to sociologists who specialized in the rites and folkways of this strange institution, I’d ask some version of this question: Why is it that in most public high schools across America, a girl who plays the cello or a boy who plays in the marching band is a loser? And even more fundamentally: Why was it such a liability to be smart?

The explanations tended to vary. But among the most striking was the one offered by Steinberg, who conjectured that high-school values aren’t all that different from adult values. Most adults don’t like cello or marching bands, either. Most Americans are suspicious of intellectuals. Cellists, trumpet players, and geeks may find their homes somewhere in the adult world, and even status and esteem. But only in places that draw their own kind.

Robert Faris puts an even finer point on this idea. “If you put adults in a similar situation”—meaning airlifted into a giant building full of strangers with few common bonds—“you’d find similar behaviors.” Like reality television, for instance, in which people literally divide into tribes, form alliances, and vote one another off the island. “And I think you see it in nursing homes,” says Faris. “In small villages. And sometimes in book clubs.” And then I realized, having covered politics for many years: Congress, too. “It’s not adolescence that’s the problem,” insists Faris. “It’s the giant box of strangers.”

As adults, we spend a lot of time in boxes of strangers. “I have always referred to life as ‘perpetual high school,’ ” Paul Feig wrote me in our first e-mail exchange, later adding, when we spoke, that his wife’s first order when she landed her Hollywood dream job was to go fire her predecessor. Brown tells me she frequently hears similar things from men in finance—as a reward for outstanding quarterly earnings, they get to pick their new office, which means displacing someone else. (The corresponding shame led one to consider quitting: “I didn’t sign up to terrorize people,” he tells her in her latest book, Daring Greatly.) Today, we also live in an age when our reputation is at the mercy of people we barely know, just as it was back in high school, for the simple reason that we lead much more public, interconnected lives. The prospect of sudden humiliation once again trails us, now in the form of unflattering photographs of ourselves or unwanted gossip, virally reproduced. The whole world has become a box of interacting strangers.

Maybe, perversely, we should be grateful that high school prepares us for this life. The isolation, the shame, the aggression from those years—all of it readies us to cope. But one also has to wonder whether high school is to blame; whether the worst of adult America looks like high school because it’s populated by people who went to high school in America. We’re recapitulating the ugly folkways of this institution, and reacting with the same reflexes, because that’s where we were trapped, and shaped, and misshaped, during some of our most vulnerable years.

High school itself does something to us, is the point. We bear its stripes. Last October, the National Bureau of Economic Research distributed a study showing a compelling correlation between high-school popularity—measured by how many “friendship nominations” each kid received from their peers—and future earnings in boys. Thirty-five years later, the authors estimated, boys who ranked in the 80th percentile of popularity earned, on average, 10 percent more than those in the 20th. There are obvious chicken-and-egg questions in all studies like this; maybe these kids were already destined for dominance, which is why they were popular. But Gabriella Conti, an economist and first author of the paper, notes that she and her colleagues took into consideration the personality traits of their subjects, measuring their levels of openness, agreeableness, extroversion, and so forth. “And adolescent popularity is predictive beyond them,” she says, “which tells me this is about more than just personality. It’s about interpersonal relations. High school is when you learn how to master social relationships—and to understand how, basically, to ‘play the game.’ ” Or don’t. Joseph Allen and his colleagues at the University of Virginia just found that kids who suffer from mild depression at 14, 15, and 16 have worse odds in the future—in romance, friendship, competency assessments by outsiders—even if their depression disappears and they become perfectly happy adults. “Because that’s their first template for adult interaction,” says Allen when asked to offer an explanation. “And once they’re impaired socially, it carries forward.”

Yet even the most popular kids, the effortlessly perfect ones, the ones who roamed the halls as if their fathers had built them especially in their honor, may not entirely benefit from the experiences of the high-school years. In 2000, three psychologists presented a paper titled “Peer Crowd-Based Identities and Adjustment: Pathways of Jocks, Princesses, Brains, Basket-Cases, and Criminals,” which asked a large sample of tenth-graders which of the five characters from The Breakfast Club they most considered themselves to be, and then checked back in with them at 24. The categories were “immensely predictive,” according to Jacquelynne Eccles, one of the authors. (Criminals were still most apt to smoke pot; male jocks still had the highest self-esteem.) But one datum was interesting: At 24, the princesses had lower self-esteem than the brainy girls, which certainly wasn’t true when they were 16. But Eccles sees no inconsistency in this finding. In fact, she suspects it will hold true when she completes her follow-up with the same sample at 40. “Princesses are caught up in this external world that defines who they are,” says Eccles, “whereas if brainy girls claim they’re smart, that probably is who they are.” While those brainy girls were in high school, they couldn’t rely on their strengths to gain popularity, perhaps, but they could rely on them as fuel, as sources of private esteem. Out of high school, they suddenly had agency, whereas the princesses were still relying on luck and looks and public opinion to carry them through, just as they had at 16. They’d learned passivity, and it’d stuck.

Whether it’s for vindication or validation, whether out of self-punishment or self-appeasement, many of us choose to devote a lot of time to revisiting our high-school years. That’s the crazy thing. In 2011, the Pew Research Center found that the largest share of our Facebook friends—22 percent—come from high school. Keith Hampton, a Rutgers sociologist and one of the researchers who did the analysis, says this is true for college- and non-college-educated Americans alike. In fact, Hampton suspects that Facebook itself plays a role. “Before Facebook, there was a real discontinuity between our high-school selves and the rest of our lives.” Then Mark Zuckerberg came along. “Social ties that would have gone dormant now remain accessible over time, and all the time.”

Maybe that’s what ultimately got me to that nondescript bar near Times Square last fall. Until Facebook, the people from my high-school years had undeniably occupied a place in my unconscious, but they were ghost players, gauzy and green at the edges. Now here they were, repeatedly appearing in my news feed, describing their plans to attend our reunion. And so I went, curious about whom they’d become. There were the former football players, still acting like they owned the joint, but as much more generous proprietors. There were the beautiful girls, still beautiful, but looking less certain about themselves. There was my former best pal, who’d blown past me on her way to cheerleaderhood, but nervous in a way I probably hadn’t recognized back then. I was happy to see her. And to see a lot of them, truth be told. We’d all grown more gracious; many of us had bloomed; and it was strangely moving to be among people who all shared this shameful, grim, and wild common bond. I found myself imagining how much nicer it’d have been to see all those faces if we hadn’t spent our time together in that redbrick, linoleum-tiled perdition. Then again, if we hadn’t—if we’d been somewhere more benign—I probably wouldn’t have cared.
