Streetlights and Shadows:
Searching for the Keys to Adaptive Decision Making
by Gary Klein
352 pages, softbound, 9x6
Reviewed by Gila Hayes
Making decisions without all the facts is an issue for self-defense, especially when called on later to justify use of force to people who were not present. Rules about use of force get codified into law, and society expects armed citizens to adapt those rules to the situation. I picked up cognitive psychologist Gary Klein’s book, Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making, seeking guidance on overcoming the uncertainty of incomplete facts and on later describing the perceptions that drove such a decision.
Klein challenges ten popular axioms about making decisions. These have worked in predictable, structured situations, he suggests, but fail in unpredictable, fast-breaking circumstances. Many emergencies are a mix of the predictable and unpredictable, he asserts. This book studies where those domains overlap, shows the value of experience over analysis, and defines why rules and procedures often fall short.
People want simple rules to guide decisions, but rules and procedures which rely on explicit knowledge rarely suffice, Klein writes. Good decisions draw from both rules and intuition. “For many types of complex work we need both procedures and the judgment to interpret and work around the procedures,” he writes. SWAT teams, for example, operate under procedures and rules, but the unpredictability of their callouts perfectly illustrates “adapt[ing] as the mission unfolds.” Klein adds, “Skilled performance depends on…how we interpret, modify, and replace the standard procedures when they don’t work.” While sometimes rather academic, the book is full of examples illustrating real-life applications of his premises.
Defining what he calls tacit knowledge and its role in perception, pattern recognition, problem solving and judgment calls, Klein acknowledges that intuitive knowledge “is hard to articulate or even to notice. We depend on unconscious processes to carry out tasks. That’s how our experience gets translated into our actions. Because most of the time we don’t think about this background knowledge, it stays hidden under the surface of our lives.”
Foundations of Intuition
Klein diagrams a triad of decision-making, sense-making and adapting at the center of which is a core of experience. Recognized patterns from earlier experiences “let us judge what category of situation we are facing,” he explains. “We draw on dozens and hundreds of experiences to sense when something seems familiar, or to pick up anomalies.”
Experts struggle to describe perceptions and mental models when asked to explain their decisions. He reports, “I have had several expert decision makers explain to me that they felt they had made specific life-and-death decisions on the basis of extrasensory perception. It had felt like ESP to them because their decisions weren’t based on conscious reasoning. Eventually I got to the bottom of their strategies and showed them that they hadn’t in fact used ESP, but getting there took a lot of work.”
Intuition, Klein says, applies “experience without consciously thinking things out,” and is just as important as analysis. “We need both logic and intuition. Either one, by itself, can get us in trouble,” he adds. Decisions are only as accurate as the knowledge–tacit or explicit–on which they’re based, he explains. Filling in unknown details through our own biases is another impediment to accurate decisions, he continues, comparing decision making to vision. We exaggerate the contrast between objects to determine what we are looking at and unconsciously “adjust for this distortion.” He suggests that we accept and adjust around this kind of inaccuracy in many aspects of daily life.
Klein blames information overload and overthinking for judgment errors. Intuitive knowledge defies verbal explanation. Over-analysis encourages wrong choices by favoring explicit knowledge over an intuitive response that can’t be explained. Analysis of explicit information alone “keyholes” our focus to obvious elements and often overlooks vital context, Klein asserts. “The notion of favoring logic and statistics over intuition leads to overthinking. It pretends that we can make sense of facts without taking context into account,” he explains.
How can we reduce how much data we have to consider? Klein suggests, “The more skilled one is, the fewer options one thinks about.” Experience eliminates poor options without wasting time. He interviewed experienced firefighters who insisted they didn’t decide how to fight fires by comparing options, but Klein concluded that the firefighters were actually imagining how options would play out, quickly eliminating the poor choices and acting on the best.
The academic description for matching a known pattern and imagining likely outcomes is the Recognition-Primed Decision process. “Good decision makers use their experience to recognize an effective option and evaluate it through mental simulation,” Klein stresses. Experienced people–like the firefighters–imagine possible outcomes so smoothly that each option is not consciously evaluated. Experience creates hunches about how to react. He illustrates the same process in use during an offshore oil drilling emergency, Captain Sullenberger’s 2009 Hudson River landing, and by nurses and even chess experts.
All this assumes being able to acquire applicable experience, Klein acknowledges in chapter seven of Streetlights and Shadows. “People become experts by the lessons they draw from their experiences, and by the sophistication of their mental models about how things work,” he introduces. “Mental models are developed through experience—individual experience, organizational experience, and cultural experience,” he writes, offering an example of connecting numerous facts to predict a horse race winner. “Our mindsets frame the cues in front of us and the events that are unfolding so we can make sense of everything. Experience and patterns produce mindsets. The more experience we have, the more patterns we have learned, the larger and more varied our mindsets and the more accurate they are.”
The next section, titled Making Sense of Situations, promised more about emergency decisions. Klein builds on examples sketched out at the beginning of the book to illustrate developing theories, so I’d absorbed a lot of information by the time I reached this part. Fortunately, the foundational material was interesting and worthwhile.
Experience makes it easier to resolve contradictory information or to digest large quantities of data. “Sensemaking is not just a matter of connecting the dots. Sensemaking determines what counts as a dot. Jumping to conclusions is sometimes the right thing to do even before all the dots have been collected,” Klein introduces.
Klein warns against information overload with examples of hesitation to react to numerous early warnings in the face of impending natural disasters. That’s particularly likely when a heavy load of data fails to point to a uniform conclusion, he explains. More data is not needed, he argues; better analysis is needed. When plagued with too much or incongruent data, we must focus on what is known, not the unknown, he advises.
In the book’s summary chapter, Klein advises readers to embrace uncertainty instead of struggling to gain control. When trouble strikes, “People with the control mentality get frustrated and discouraged. However, when the routines break down, those with a resilience mindset switch gears,” he reports. This leads into a section titled Anticipatory Thinking, which teaches advancing from one task to the next and allowing for surprises, detecting discrepancies, absorbing meanings and confidently resolving problems. Of all of Streetlights and Shadows, this segment is most applicable to the daily safety decisions of armed citizens. “Anticipatory thinking describes how we actively speculate about what might happen next,” he continues.
Using combat pilots as an example, Klein explains that accurately assessing change requires quickly recognizing what’s applicable and putting it into perspective for present circumstances. Of all the details a pilot sees, what is trivial? What is immediately important and what will become crucial in moments? What is not yet apparent? “Sensemaking seems to depend heavily on seeing such connections. We can’t find the connections by exhaustively linking all the elements because we don’t know in advance which are the relevant elements,” he explains.
The third section of Streetlights and Shadows is entitled Adapting. “In complex situations, our attempts to make adaptations may fail if we persist in pursuing the goals we started with, if we rely too heavily on identifying and minimizing risks, and if we maintain the ground rules we set at the beginning of an activity. Adapting means revising our goals, becoming resilient to threats we cannot predict, and changing the way we work together. To adapt, we have to learn, but we also have to unlearn,” he introduces.
Adapting can require compromises that fall short of the desired outcome, Klein continues. When goals shift or are in conflict, seek the best tradeoff. Look for new connections; don’t doggedly pursue the original goal, he advises. In a chapter on risk management he challenges the notion that with sufficient effort we can eliminate risks. If we’ve never tackled a problem of this sort before, how can we fully anticipate the risks? He calls risk assessments created “purely on the basis of statistics drawn from previous events…like driving while looking only through the rear-view mirror.” He adds, “In complex situations, we should give up the delusion of managing risks. We cannot foresee or identify risks, and we cannot manage what we can’t see or understand.”
Klein explains some of our denial when he quotes Nassim Nicholas Taleb’s theory about black swans–catastrophes that happen so rarely as to be statistically impossible. “By definition, these kinds of events are hard to comprehend and so we explain them away.” Risk management encourages denial because we erroneously believe disaster planning protects us. “Plans sensitize us to expect some things, but that can mean ignoring other things that we don’t expect—precisely the kinds of black swans that can catch us off guard…When working in an unfamiliar and shadowy landscape, we can’t neatly identify, prioritize, and cauterize all the risks in advance.”
Instead, Klein advises, “Learn from the near-misses rather than wait to learn from accidents.” Cultivating an adaptive mindset increases ability to “anticipate, avoid and manage risks.” Build in resilience instead of trying to “predict and control unpredictable risks.”
The next chapter, which addresses unlearning outdated or erroneous information, is also applicable to armed citizens. Klein discusses getting fixated on an initial explanation despite evidence to the contrary, an issue he raised earlier in Streetlights and Shadows. “Fixation isn’t a type of defective reasoning. It’s a natural outgrowth of the way we use our mental models to guide our attention and make sense of events,” he introduces.
To correct fixation, we must first recognize it. Symptoms include repeated efforts to explain away mounting pieces of contrary data, Klein suggests. Unfortunately, “People usually have to lose faith in their old mental models before they can seriously consider new ones,” he writes. Other solutions include fostering curiosity about anomalies, seeking an outsider’s viewpoint and rephrasing the situation as an analogy or metaphor.
It is easy to chuckle at the doubters who spoke out against Copernicus and Galileo’s discoveries, but how often–even when presented with convincing evidence–do we fail to adjust our own mental maps? Mindset needs to be like snakeskin, Klein suggests: in order to grow, we have to shed the old skin to make room for the new.
Streetlights and Shadows requires a lot of concentration to adapt the lessons to our concerns. The lessons are present, though, and I thought the book was well worth the effort to read and learn.