Steve Cole, the VP of research and development at HopeLab, a non-profit that fights to improve kids’ health using technology, said, “Any time in life you’re tempted to think, ‘Should I do this or that?’ instead, ask yourself, ‘Is there a way I can do this and that?’ It’s surprisingly frequent that it’s feasible to do both things.”
For one major project, Cole and his team at HopeLab wanted to find a design partner, a firm that could help them design a portable device capable of measuring the amount of exercise that kids were getting. There were at least seven or eight design firms in the Bay Area that were capable of doing the work. In a typical contracting situation, HopeLab would have solicited a proposal from each firm and then given the winner a giant contract.
But instead of choosing a winner, Cole ran a “horse race.” He shrank down the scope of the work so that it covered only the first step of the project, and then he hired five different firms to work on the first step independently. (To be clear, he wasn’t quintupling his budget — as a non-profit, HopeLab didn’t have unlimited resources. Cole knew that what he’d learn from the first round would make the later rounds more efficient.)
With his horse race, Cole ensured that he’d have multiple design alternatives for the device. He could either pick his favourite or combine the best features of several. Then, in round two of the design, he could weed out any vendors who were unresponsive or ineffective.
Cole is fighting the first villain of decision making — narrow framing, which is the tendency to define our choices too narrowly, to see them in binary terms. We ask, “Should I break up with my partner or not?” instead of “What are the ways I could make this relationship better?” We ask ourselves, “Should I buy a new car or not?” instead of “What’s the best way I could spend some money to make my family better off?”
In the introduction, when we asked the question “Should Shannon fire Clive or not?” we were stuck in a narrow frame. We spotlighted one alternative at the expense of all others.
Cole, with his horse race, is breaking out of that trap. It wasn’t an obvious move; he had to fight for the concept internally. “At first, my colleagues thought I was insane. At the beginning, it costs some money and takes some time. But now everybody here does it. You get to meet lots of people. You get to know lots of different kinds of things about the industry. You get convergence on some issues, so you know they are right, and you also learn to appreciate what makes the firms different and special. None of this can you do if you’re just talking to one person. And when all of those five firms know that there are four other shops involved, they bring their best game.”
Notice the contrast with the pros-and-cons approach. Cole could have tallied up the advantages and disadvantages of working with each vendor and then used that analysis to make a decision. But that would have reflected narrow framing. Implicitly, he would have been assuming that there was one vendor that was uniquely capable of crafting the perfect solution, and that he could identify that vendor on the basis of a proposal.
There’s a more subtle factor involved too — Cole, in meeting with the teams, would inevitably have developed a favourite, a team he clicked with. And though intellectually he might have realised that the people he likes personally aren’t necessarily the ones who are going to build the best products, he would have been tempted to jigger the pros-and-cons list in their favour. Cole might not even have been aware he was doing it, but because pros and cons are generated in our heads, it is very, very easy for us to bias the factors. We think we are conducting a sober comparison but, in reality, our brains are following orders from our guts.
Our normal habit in life is to develop a quick belief about a situation and then seek out information that bolsters our belief. And that problematic habit, called the “confirmation bias,” is the second villain of decision making.
Here’s a typical result from one of the many studies on the topic: Smokers in the 1960s, back when the medical research on the harms of smoking was less clear, were more likely to express interest in reading an article headlined “Smoking Does Not Lead to Lung Cancer” than one with the headline “Smoking Leads to Lung Cancer.” (To see how this could lead to bad decisions, imagine your boss staring at two research studies headlined “Data That Supports What You Think” and “Data That Contradicts What You Think.” Guess which one gets cited at the staff meeting?)
Researchers have found this result again and again. When people have the opportunity to collect information from the world, they are more likely to select information that supports their pre-existing attitudes, beliefs, and actions. Political partisans seek out media outlets that support their side but will rarely challenge their beliefs by seeking out the other side’s perspective. Consumers who covet new cars or computers will look for reasons to justify the purchase but won’t be as diligent about finding reasons to postpone it.
The tricky thing about the confirmation bias is that it can look very scientific. After all, we’re collecting data. Dan Lovallo, the professor and decision-making researcher cited in the introduction, said, “Confirmation bias is probably the single biggest problem in business, because even the most sophisticated people get it wrong. People go out and they’re collecting the data, and they don’t realise they’re cooking the books.”
At work and in life, we often pretend that we want truth when we’re really seeking reassurance: “Do these jeans make me look fat?” “What did you think of my poem?” These questions do not crave honest answers.
Or pity the poor contestants who try out to sing on reality TV shows, despite having no discernible ability to carry a tune. When they get harsh feedback from the judges, they look shocked. Crushed. And you realise: This is the first time in their lives they’ve received honest feedback. Eager for reassurance, they’d locked their spotlights on the praise and support they received from friends and family. Given that affirmation, it’s not hard to see why they’d think they had a chance to become the next American Idol. It was a reasonable conclusion drawn from a wildly distorted pool of data.
And this is what’s slightly terrifying about the confirmation bias: When we want something to be true, we will spotlight the things that support it, and then, when we draw conclusions from those spotlighted scenes, we’ll congratulate ourselves on a reasoned decision.