Meet my friend Bart. As a surgeon, every day at work he’s entrusted with the lives of others, and he handles the job well. He’s a genuinely gifted fellow. He’s also fit, healthy, and well-rounded.
In other words, Bart has made a lot of great decisions in his life, and continues to do so every day.
Except that some time ago, he got engaged. And none of his friends thought it was a good idea. We all predicted disaster, of the Hindenburg up-in-flames variety.
Bart did get separated a few years later, and you probably know someone plenty smart who made a similarly disastrous decision. Whether it was taking the wrong job, buying a Hummer, selling off Microsoft stock in 1989 or launching into a destructive affair, this kind of thing happens all the time. Perhaps it’s even happened to you.
It’s easy to see all of this in hindsight. But what if you could see the faulty decision-making while it was happening? Then, instead of an “I told you so” story that helps little and irritates much, we might actually accomplish something useful, like helping avoid the error in the first place.
Psychologists who’ve studied our decision-making processes have observed cognitive biases that tend to get us in trouble.
Remember that these biases don’t make you a bad person; they just make you human. As far as we can tell, they’re deeply ingrained features of our brain function. The more you’re aware of them, the better chance you have of avoiding them. There’s a slew of them, so I’ll highlight some of the big ones:
1) The fundamental attribution error.
This bias makes us attribute the failures of others to character and our own failures to circumstance. “Jenkins lost his job because he was incompetent; I lost mine because of the recession.” It also attributes our own successes to our competence, discounting luck, while seeing others’ successes as products of mere luck.
This lands you in hot water when you assume that bad stuff only happens to other people: you’re not going to be part of the 50 percent of people who get divorced, and the price of your house will go up even though 90 percent of house prices have dropped. I’m going to marry Charlie Sheen and make it work because I’m different; those 4,000 other women were just stupid. They did something wrong, but I know what I’m doing. The fundamental attribution error’s a pernicious one, and it nails all of us at some point.
2) The confirmation bias.
This one has two parts. First, we tend to gather and rely upon information that confirms our existing views. Second, we avoid or downplay information that goes against our pre-existing hypothesis.
Say you suspect that your computer has been hacked. Then every time it stalls or has a little glitch, you blame it on the hackers. Or you think that your boss has it in for you. Then you interpret everything she says or does as part of her plan to undermine you. It’s a bit like a self-fulfilling prophecy.
If you identify with a political party, you probably do this all the time. If you’re a scientist, you do this inadvertently as part of the scientific method. And if you’re a trial lawyer, it’s your job to do this.
If you’re interested in moving an agenda forward, then the confirmation bias works in your favor. If you’re subject to this agenda and don’t like it, recognize the confirmation bias for the fallacy it is. And if you’re interested in the truth, start without preconceptions. Outwitting the confirmation bias means exploring both sides of an argument with equal diligence.
3) The overconfidence bias.
I call this the ‘my guess is better than yours’ bias. People’s confidence in their own decisions tends to outstrip the accuracy of those decisions. Your friend will say he’s “100 percent positive” about something, e.g. his choice of wife, and only be right 50 percent of the time. A disastrous form of this happened on the doomed 1996 Mt. Everest expedition described in Jon Krakauer’s Into Thin Air, resulting in the deaths of many climbers.
4) The availability bias.
We tend to estimate what’s more likely by how easily we can come up with an example from memory. The availability of our memories is biased toward vivid, unusual, or emotionally charged examples. So those examples loom larger than they should, and we end up making skewed decisions based on them.
As a result, you may cancel your trip to the Canary Islands because mom tells you the biggest plane crash in history happened there. Or you stop going to hockey games because you heard someone in the stands got thwacked on the head with a puck last week. Or you avoid investing in stocks because they crashed last year.
To bypass the availability bias, be sure to look at all the evidence around a particular decision, not just the stuff that jumps to mind first. If only 1 out of 100,000 plane landings results in a crash, it’s safe to fly to the Canary Islands. If one out of ten million hockey fans gets nailed by a puck, you can watch a hockey game.
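If you like to see the arithmetic spelled out, here’s a minimal sketch in Python. The crash and puck rates are just the round numbers from the paragraph above, and the lifetime exposure counts are invented for illustration:

```python
# Base-rate check: weigh the boring odds, not the vivid story.
# The rates are the article's illustrative round numbers; the lifetime
# exposure counts (200 flights, 500 games) are assumptions for the example.

plane_crash_rate = 1 / 100_000      # crashes per landing (illustrative)
puck_injury_rate = 1 / 10_000_000   # injuries per fan per game (illustrative)

def lifetime_risk(rate_per_exposure, exposures):
    """Expected number of bad outcomes over a lifetime of exposures."""
    return rate_per_exposure * exposures

print(lifetime_risk(plane_crash_rate, 200))   # about 0.002, roughly 1 chance in 500
print(lifetime_risk(puck_injury_rate, 500))   # about 0.00005, roughly 1 chance in 20,000
```

However you set the exposure numbers, the exercise is the same: count the denominator, not the memorable headline.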
5) The sunk cost fallacy.
I call this the slot-machine effect. You put a quarter in a one-armed bandit, and pull the lever. You win nothing. No big deal – you put in another quarter. And another. This goes on for a while, and you start thinking, “Well, I’m invested in this machine now. It’s going to belch an avalanche of quarters any second!”
The truth is that every pull of the lever has the same nearly zero probability of winning, regardless of how much money you’ve put in. The money is effectively gone forever; it’s a sunk cost. It buys you no claim on future winnings, so it’s not an investment.
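For the numerically inclined, here’s a tiny Python sketch of that point. The win probability and jackpot are made-up numbers; what matters is that the quarters you’ve already lost never enter the calculation:

```python
# Sunk cost, slot-machine edition: the expected value of the next pull
# depends only on the machine's odds, never on what you've already lost.
# The odds and payout below are invented for illustration.

WIN_PROB = 1 / 1000   # assumed chance that any single pull pays out
JACKPOT = 50.00       # assumed payout when it does
COST = 0.25           # one quarter per pull

def expected_value_of_next_pull(quarters_already_lost):
    # quarters_already_lost is deliberately unused -- that's the whole point.
    return WIN_PROB * JACKPOT - COST

print(expected_value_of_next_pull(0))     # about -0.20
print(expected_value_of_next_pull(400))   # about -0.20, same after losing $100
```

However you tune the numbers, the answer to “should I keep pulling?” never improves because of what’s already in the machine.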
This is a big one in jobs and relationships. You can be stuck in a crappy situation for a while, and then think, “But I’ve invested three years in this! I can’t just throw that away!” The fact is that those three years are never coming back – you’ve already thrown them away, so don’t worry about it! The sooner you cut bait and go for a better situation, the better off you are.
So next time you have smart friends who are about to make an unbelievably dumb decision, follow this five-step plan:
a) Look through this list, or an even more comprehensive one.
b) Empathize with them for being human, coming up with an example of a time when you made a similarly boneheaded choice – “Boy, was I a goober!”
c) Instead of saying “What the hell are you thinking?” say “I have a lot of faith in your judgment, so help me understand how you came up with this decision.”
d) If you’re still convinced they’re smoking something funny, only then gently offer some insight on cognitive biases, and see what happens.
e) If they still don’t get it, take the frying pan from behind your back and give them a compassionate but bracing thwack upside the head. It probably won’t change their mind, but it’ll feel pretty satisfying.