For decades, psychologists and economists have been piling up evidence that people are alarmingly irrational in their decision making. Many of us make bad choices about crucial matters such as investing our retirement savings. One documented reason: The pain of losses is greater than the pleasure of gains. When faced with an unusual situation, we tend to fall back on simple precedents, often choosing those that don’t apply or fumbling to identify any relevant experience at all. And few of us are able to see very far ahead when considering the consequences of our decisions. It gets pretty scary when you ask how all of this applies to the people who manage America’s national security and other high-level tasks.
Calm down, say political scientists Emilie M. Hafner-Burton, D. Alex Hughes, and David G. Victor, writing in Perspectives on Politics. Most of the studies behind the “cognitive revolution” in our understanding of human rationality relied on willing undergraduates as their subjects. A smaller number of experiments involving CEOs, politicians, doctors, and other people in positions of great responsibility present a more complicated picture. Experienced elites, including leaders who decide the fate of nations, may be better able to avoid the cognitive errors that plague everyone else. However, they also have characteristic flaws of their own.
Humans rely on mental models to navigate complex events and make snap decisions. In new and fast-changing situations, most of us grope for one of these models, known to psychologists as heuristics. It’s different for elites with “domain-specific” experience. These Jedi Masters settle on appropriate heuristics much faster than most people. Veteran physicians, one study found, swiftly alight on diagnoses “by applying a small set of rules to the data and sorting for the right decision pattern.” Greener doctors labor through routine cases by working through every possible diagnosis.
Elites’ “metacognition” also appears to be superior. They “revise (or even jettison) their heuristics” with much greater ease when things aren’t working out.
Elites are also better than novices at anticipating the reactions of competitors. They consider future rounds of strategic interactions and tailor their choices accordingly. Economists have even developed a measure called the “k-level” to gauge the number of steps ahead that a person can think in a game-playing exercise. In one study, three-quarters of the subjects betrayed “a strategically simple view of the world.”
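The k-level idea is easiest to see in the classic "p-beauty contest," the standard laboratory game in which level-k reasoning is measured (the game and parameter names here are a textbook illustration, not taken from the article itself):

```python
# Level-k reasoning in the p-beauty contest: each player picks a number in
# [0, 100], and the winner is whoever comes closest to p (typically 2/3) of
# the group's average. A level-0 player guesses at random (mean 50); a
# level-k player best-responds to a population of level-(k-1) players, so
# each extra step of strategic thinking shrinks the guess by a factor of p.

def level_k_guess(k: int, p: float = 2 / 3, level0_mean: float = 50.0) -> float:
    """Return the guess of a level-k player in a p-beauty contest."""
    guess = level0_mean
    for _ in range(k):
        guess *= p  # best response to the previous level's expected guess
    return guess

if __name__ == "__main__":
    for k in range(4):
        print(f"level-{k} guess: {level_k_guess(k):.1f}")
    # level-0: 50.0, level-1: 33.3, level-2: 22.2, level-3: 14.8
```

A subject with "a strategically simple view of the world" plays at level 0 or 1; thinking further ahead means iterating the best response more times.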
In the realm of risk and reward, the average person typically errs by risking more to avoid losses than to attain gains — a phenomenon first identified in 1979 by psychologists Daniel Kahneman and Amos Tversky. But as explained by Hafner-Burton and Victor, who are professors of international relations at the University of California, San Diego, and Hughes, a UCSD doctoral student, seasoned decision makers “are less prone to loss aversion, which makes them better gamblers.”
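Kahneman and Tversky formalized this asymmetry in a value function that weights losses more heavily than equal-sized gains. The sketch below uses the parameter estimates they published in 1992 (alpha = beta = 0.88, lambda = 2.25); it illustrates the general theory, not any calculation in the article under discussion:

```python
# Prospect-theory value function (Kahneman & Tversky). Gains of size x are
# valued as x**alpha; losses as -lam * (-x)**beta. With lam > 1, a loss
# "looms larger" than a gain of the same size -- the core of loss aversion.

def prospect_value(x: float, alpha: float = 0.88, beta: float = 0.88,
                   lam: float = 2.25) -> float:
    """Subjective value of a gain or loss x, measured from a reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

if __name__ == "__main__":
    gain, loss = prospect_value(100), prospect_value(-100)
    print(f"value of +$100: {gain:.1f}")  # roughly +57.5
    print(f"value of -$100: {loss:.1f}")  # roughly -129.5: the loss hurts more
```

Because the pain of losing $100 outweighs the pleasure of gaining $100 by more than two to one under these parameters, people gamble to avoid losses they would never gamble to secure as gains.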
Elites do exhibit one potentially dangerous tendency: They’re flush with confidence. Compared to novice players, for example, chess grandmasters are more likely to place undue value on one of their most precious assets, their ability to recall past moves. And NFL executives “routinely overestimate the abilities of their draft picks and pay above a talent-adjusted market wage.” The national security implication: “What looks like bombastic nationalistic pride — for example, the refusal of a leader to back down in the face of overwhelming odds of failure — might simply be the result of improper self-assessment.”
But there are upsides even to this weakness. Confidence goes hand in hand with willpower, an indispensable element in international affairs. And, paradoxically, because of their lower fear of losses, elites appear to be more cooperative than others in game-playing experiments.
Elites aren’t superhuman. Everyone has “the hardware needed for politically sophisticated tasks,” Hafner-Burton and her colleagues note. Top decision makers just have the opportunity to cultivate it.
Indeed, the authors argue, the George W. Bush administration’s dealings with North Korea between 2002 and 2006 show how such learning can occur. At first, the fledgling administration angrily and clumsily confronted North Korea about its nuclear program. Heedless of how Pyongyang might react, it abruptly cut off aid shipments of fuel. Kim Jong Il’s regime responded by kicking international weapons inspectors out of the country, withdrawing from the Nuclear Non-Proliferation Treaty, and redoubling its efforts to build the Bomb.
No surprise, Hafner-Burton and her colleagues say. At the time, National Security Adviser Condoleezza Rice, Secretary of State Colin Powell, and Secretary of Defense Donald Rumsfeld had scant experience negotiating with Pyongyang. Showing signs of loss aversion, “the Bush administration made its most aggressive move first and seemed to have no strategy for the next iterations.”
Four years later, North Korea tested its first nuclear weapon. This time the White House used nimble diplomacy, enlisting China and other regional players to help, and North Korea agreed to gradually dismantle its nuclear facilities (though it later reneged). “In 2002 the heuristics were drawn from how parents deal with children throwing tantrums,” the authors write, citing evidence from the memoirs of Bush administration officials. In 2006, the Bush team showed that it had quickly learned from experience.
There’s plenty more to be learned about how experienced elites make decisions. “Unfortunately,” the authors write, “experienced elites are difficult to obtain as subjects because they are generally busy, wary of clinical poking, and skittish about revealing information about their decision-making processes.”
THE SOURCE: “The Cognitive Revolution and the Political Psychology of Elite Decision Making” by Emilie M. Hafner-Burton, D. Alex Hughes, and David G. Victor. Perspectives on Politics, June 2013 (pdf).