The Bad Things That Happen When People Can’t Deal With Ambiguous Situations


Just about everyone dislikes the feeling of not knowing the answer to an important question about what’s going to happen in the future. Waiting to hear whether you’ll get an important job, or to find out about a loved one’s diagnosis after a medical test, is a uniquely anxiety-provoking experience. It’s not uncommon to hear people mired in these sorts of limbos say that they’d rather hear about a terrible outcome than continue waiting to find out. That way, at least they’d know, and a potent form of psychological tension would be released.

In his book Nonsense: The Power of Not Knowing, which was published last week by Crown, Jamie Holmes, a Future Tense Fellow at the New America Foundation, takes readers on a tour of this human tendency, highlighting the voluminous research into both its upsides and downsides. “If there’s any takeaway, it’s that we’re programmed to get rid of ambiguity, and yet if we engage with it we can make better decisions, we can be more creative, and we can even be a little more empathetic,” Holmes told Science of Us. 

Holmes was quick to point out that the feeling of ambiguity isn’t an inherently negative one. “It’s not always unpleasant,” he said. It’s more that uncertainty is “an emotional amplifier,” as he put it. “A lot of times it is threatening, just because of the content of what we happen to be facing: whether I’m going to be fired, or a physical threat, or the uncertainty of a medical diagnosis. But there’s also great research that shows that if we’re uncertain about whether someone’s romantically interested in us, or if we’re uncertain about whether something good or really good might happen to us, then those experiences are even more pleasurable than they usually are.” His book also highlights the many ways embracing ambiguity can make people more creative and better problem-solvers.


But it’s the other side of the ledger that feels a bit more urgent: history, recent and otherwise, is replete with examples of catastrophic blunders made as a result of leaders’ inability to deal with ambiguity. As Holmes explains in Nonsense, it’s when we’re stressed out — and particularly when we’re faced with what we feel are existential threats — that our resistance to ambiguity grows strongest. For Americans, 9/11 appears to have a unique ability to ramp up our need to resolve ambiguity: As Holmes explains in the book, one study showed that simply reminding Americans of 9/11 increased their need for (cognitive) closure — an important psychological concept measuring, to put it roughly, one’s level of comfort with ambiguity (as I wrote last year, need for closure can partially explain why some people were driven crazy by the first season of the blockbuster podcast “Serial”). “I feel like the entire need for closure of the country went up, and any uncertainty in general became more unpleasant,” said Holmes. He doesn’t view it as a stretch to say that the invasion of Iraq, which was broadly supported by Americans when it first occurred, can be partly attributed to the country’s need for a simple story of good and evil at a time of great fear and uncertainty.


Perhaps the most carefully laid-out example of the real-world consequences of the need for closure in Holmes’s book is his recounting of the U.S. government’s disastrous handling of the Branch Davidians, a breakaway Christian sect whose leader, David Koresh, was wanted by the government on weapons charges. In February of 1993, the ATF attempted to raid Koresh’s Mount Carmel compound outside Waco, Texas, so as to capture the fugitive, but Koresh was tipped off beforehand and the raid failed spectacularly, leading to the deaths of five Davidians and four ATF agents. The FBI was called in, kicking off 50 days of negotiations between the Bureau and Koresh — an attempt to resolve the siege without further bloodshed. On April 19, the FBI, its decision-makers tired of negotiating, followed through with a plan approved by then attorney general Janet Reno, punching holes in the compound and filling it with tear gas in an attempt to drive the occupants out. Instead, the Davidians set the compound on fire and it burned to the ground on national TV. As Holmes writes, “over seventy Davidians died, including twenty-five children,” leading to a tragic “national embarrassment.”


This terrible outcome, writes Holmes, was partly the result of two very different attitudes held by those working to resolve the dispute during the 50 days between the initial raid and the fire. On one side was Gary Noesner, a talented hostage negotiator who had been called down to Waco to speak with Koresh. The two struck up a relationship, and Koresh appeared to gradually grow to trust Noesner, which helped lead to Koresh’s agreement to allow a steady stream of Davidians, mostly children, to leave the compound (each Davidian who left peacefully, of course, made the government’s job easier and reduced the risk of Davidians being harmed).


Unfortunately, as Noesner worked to resolve the conflict peacefully, the head of the FBI’s Hostage Rescue Team, Dick Rogers, was growing impatient. Rogers, aka “Sergeant Severe,” grew particularly incensed when Koresh reneged on an agreement to leave the compound on March 2 if the Christian Broadcasting Network aired a message he had recorded about the Book of Revelation, which it did. Noesner “was used to negotiating with all sorts of people, and he knew not to overreact. The bottom line was that the negotiations were working. He and his team had gotten a steady stream of people out of Mount Carmel safely.” Rogers and those working with him, however, saw things differently — the episode further confirmed their views that Koresh was an evil snake who was not to be trusted, and who could only be dislodged from the compound by force.


These views gradually won out, and even as Noesner continued negotiating with Koresh he found himself regularly undermined by Rogers’s decisions to, for example, cut power to the compound, angering Koresh. The eventual results were catastrophic for everyone involved. Holmes explains that Noesner, whom he interviewed for the book, agreed to take a shortened version of the standard test psychologists use to measure individuals’ need for closure, and scored extremely low, which isn’t a surprise — he was able to hold in his head the conflicting ideas that Koresh was a bad guy who had hurt people, but also that he was worth negotiating with for a protracted period, even if there were slipups along the way. It’s safe to say Rogers sits at the other end of the spectrum.


It’s one thing to identify the problems posed by humanity’s distaste for ambiguity, Holmes writes in Nonsense, but it’s another to actually overcome these hurdles. At the organizational level, there are some (in theory) simple fixes. “Beyond hiring more people like Noesner,” writes Holmes, “organizations can … also create a culture that respects ambiguity” by, among other things, “underscor[ing] the consequences of bad decisions” and, in times of crisis, making sure everyone envisions a wide range of responses and outcomes rather than quickly narrowing the scope of discussion. What all these recommendations have in common is that they help prevent the sort of black-or-white thinking that so often leads to bad decisions, particularly during periods of heightened fear or more general emotional arousal.


Nudging people away from poor, need-for-closure-driven decisions at the individual level is tougher, but Holmes had some suggestions there as well. One is to simply be deliberate in your decision-making — not just writing down, say, pros and cons, but listing as many potential consequences of different decisions as possible. It’s also important to realize that your need for closure can vary depending on the circumstances at the time. “You can kind of have a rough self-check — what is my need for closure today, this week? Have I been under a lot of pressure? Is there a lot of uncertainty in my life?”


Finally, there are a couple of broader options for people who want to better handle ambiguity, although Holmes admits they’re a bit less practical. One is to simply read fiction: “Reading fiction has been shown to lower people’s need for closure. I think that’s partially because it’s safe, and you go into this other world, and it’s kind of broadening our categories because we’re thinking about how other people make decisions.” And the other is “positive multicultural experiences,” which appear to have the effect of lowering need for closure for similar reasons.


None of these personal or organizational tweaks is a panacea, of course. Humans aren’t going to stop making knee-jerk decisions out of a resistance to ambiguity anytime soon. But Nonsense is still an extremely useful primer for anyone who wants to better understand the complicated ways ambiguity affects human decision-making.