Hardbound

Bias Detectors

Michael Lewis on the friendship that changed the world

Published Nov 18, 2017 · 3 minute read

In late 1973 or early 1974, Danny gave a talk, which he would deliver more than once, and which he called “Cognitive Limitations and Public Decision Making.” It was troubling to consider, he began, “an organism equipped with an affective and hormonal system not much different from that of the jungle rat being given the ability to destroy every living thing by pushing a few buttons.” Given the work on human judgment that he and Amos had just finished, he found it further troubling to think that “crucial decisions are made, today as thousands of years ago, in terms of the intuitive guesses and preferences of a few men in positions of authority.” The failure of decision makers to grapple with the inner workings of their own minds, and their desire to indulge their gut feelings, made it “quite likely that the fate of entire societies may be sealed by a series of avoidable mistakes committed by their leaders.”

Before the war, Danny and Amos had shared the hope that their work on human judgment would find its way into high-stakes real-world decision making. In this new field called decision analysis, they could transform high-stakes decision making into a sort of engineering problem. They would design decision-making systems. Experts on decision making would sit with leaders in business, the military, and government and help them to frame every decision explicitly as a gamble; to calculate the odds of this or that happening; and to assign values to every possible outcome. If we seed the hurricane, there is a 50% chance we lower its wind speed but a 5% chance that we lull people who really should evacuate into a false sense of security: What do we do? In the bargain, the decision analysts would remind important decision makers that their gut feelings had mysterious powers to steer them wrong. “The general change in our culture toward numerical formulations will give room for explicit reference to uncertainty,” Amos wrote, in notes to himself for a talk of his own. Both Amos and Danny thought that voters and shareholders and all the other people who lived with the consequences of high-level decisions might come to develop a better understanding of the nature of decision making. They would learn to evaluate a decision not by its outcomes—whether it turned out to be right or wrong—but by the process that led to it. The job of the decision maker wasn’t to be right but to figure out the odds in any decision and play them well. As Danny told audiences in Israel, what was needed was a “transformation of cultural attitudes to uncertainty and to risk.”
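The decision-analysis framing described above, treating every decision as a gamble with explicit odds and assigned values, can be sketched in a few lines. This is only an illustration of the general technique: the 50% and 5% probabilities come from the hurricane example in the passage, but the utility numbers (and the residual no-change outcome) are hypothetical assumptions added here to make the arithmetic concrete.

```python
# Decision analysis, as the passage describes it: frame each option as a
# gamble over outcomes, each with a probability and an assigned utility,
# then compare expected utilities instead of trusting gut feelings.
# The 0.50 and 0.05 probabilities echo the article's hurricane-seeding
# example; all utility values below are purely illustrative.

def expected_utility(gamble):
    """Expected utility of a list of (probability, utility) pairs."""
    return sum(p * u for p, u in gamble)

seed = [
    (0.50, +100),  # wind speed lowered, damage reduced
    (0.05, -200),  # people lulled into a false sense of security
    (0.45, 0),     # no measurable effect (assumed residual outcome)
]
no_seed = [
    (1.00, -50),   # storm takes its course (illustrative value)
]

# The decision maker's job, on this view, is not to be right but to
# figure out the odds and play them well: pick the better gamble.
print("seed:", expected_utility(seed))
print("don't seed:", expected_utility(no_seed))
```

With these made-up utilities, seeding has the higher expected utility, but the point of the framing is the process: the decision is judged by whether the odds and values were weighed explicitly, not by how the storm happens to turn out.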

Exactly how some decision analyst would persuade any business, military, or political leader to allow him to edit his thinking was unclear. How would you even persuade some important decision maker to assign numbers to his “utilities”? Important people didn’t want their gut feelings pinned down, even by themselves. And that was the rub.