NOTES / How We Make Decisions And Why They Often Suck

1. Intro

[SLIDE] [SLIDE]

Thesis statement: We've evolved to look for patterns, to extrapolate quickly from small amounts of information and make snap judgments about the best course of action. Why? This made sense for a hunter-gatherer 100,000 years ago trying to decide if the rustling in the bushes was a tiger or just the wind. Most decisions now don't involve tigers, but we still don't have time to research everything, weigh every factor, and make choices based on complete information. So we develop models of the world, or heuristics: a kind of shortcut to intuitively understanding our environment and making predictions. The problem is that this process has a lot of variables and traps that can lead to extremely irrational - and therefore bad - decisions.

So basically we have 2 ways of deciding - the emotional brain and the rational brain. Both have advantages and disadvantages. The key is to know the difference. For example, humans love being certain and hate being uncertain, so we tend to filter the world based on what we already believe. This is called confirmation bias. [SLIDE] i.e., we only notice what reinforces our pre-existing ideas and assumptions (the other lane always seems to be moving faster because you don't notice when it's not). On the other hand, the emotional brain can also take in far more information than you're consciously aware of. This is part of why people think they have good instincts or intuition.

[SLIDE]

So there's a form of brain injury where people essentially lose their emotions (they become Spock, completely rational) and the result is that they can't make a damn decision. They endlessly go over all the different factors of a choice and never get that emotional bump that says this is the way to go. These people will literally spend 2 hours deciding between a blue pen and a black pen. So we know that we need emotion in some sense, but it can also lead us astray.

[SLIDE]

2. Loss Aversion

So I'm going to make you a bet. We'll flip a coin and if it's heads you pay me a dollar. If it's tails, how much do I have to pay you for you to accept the bet? $1.25? $1.50? $2? An economist might say $1.01 is the correct answer. You should be willing to take anything over a dollar because over time, it'll pay off. But most people won't take the bet until you offer them $1.75 or $2. And that's because losses hurt more than gains feel good. This is called loss aversion. We'll fight harder if we stand to lose something (even harder than if we stood to gain the same amount).
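The economist's reasoning above is just an expected-value calculation. A minimal sketch (the function name is mine, the numbers are from the bet as described):

```python
def expected_value(payout: float, loss: float = 1.00) -> float:
    """Expected value of a fair coin flip: lose `loss` on heads, win `payout` on tails."""
    return 0.5 * payout - 0.5 * loss

# Any payout over $1.00 makes the bet profitable in the long run...
print(round(expected_value(1.01), 3))  # 0.005 - tiny, but positive
# ...yet most people hold out for something like this:
print(round(expected_value(1.75), 3))  # 0.375
```

The gap between the $1.01 a rational actor should accept and the $1.75 or so people actually demand is, roughly, the price of loss aversion.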

This is also why people will spend more money using credit cards than cash. Using a card is abstract, so our brains don't register the loss. It's also behind procrastination and what is known as time inconsistency: we have trouble connecting today's actions with tomorrow's goals.

This is also related to something called the disposition effect [SLIDE], which is how people are more likely to sell stocks that have gone up in value and hold onto stocks that have gone down, because they don't want to recognize those losses. Instead of doing the rational thing and cutting their losses, they postpone making the losses tangible by not selling, and end up incurring even more. And all because of loss aversion.

Next we have the fairness impulse [SLIDE] - there's an experiment conducted around the world with the same results every time. An economist takes 2 subjects and presents them with a significant amount of money that they are to split between themselves and can then keep. It works like this: person A decides how to divide the money, 50-50 or something less equitable. Person B then decides either to accept the cut, in which case the experiment ends and the money is paid out accordingly, or reject the cut, in which case the experiment also ends but no one gets anything. You might expect the self-interested, rational person B to accept almost any cut, since a little free money is better than none at all. But the average accepted split is 43%. Offers of 25% or lower are almost never accepted, even when the amount is months' worth of salary.
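The payoff rule of this game (usually called the ultimatum game) is simple enough to write down. A minimal sketch; the function and variable names are mine:

```python
def ultimatum(total: float, offer_to_b: float, b_accepts: bool) -> tuple:
    """Return (A's payout, B's payout) under the ultimatum game's rules."""
    if b_accepts:
        return (total - offer_to_b, offer_to_b)
    return (0.0, 0.0)  # rejection: the experiment ends and nobody gets anything

# A purely self-interested B should accept even a tiny cut:
print(ultimatum(100, 1, True))    # (99, 1) - better than nothing
# But real subjects routinely reject low offers, forfeiting both payouts:
print(ultimatum(100, 25, False))  # (0.0, 0.0)
```

The puzzle is that rejecting is never in B's narrow financial interest, yet people pay real money to punish offers they perceive as unfair.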

[SLIDE]

3. Consistency Bias

Also called the principle of congruency: people are more willing to do something if they see it as consistent with what they have already said or done. And there are ways we often get tricked into this.

So imagine it's 20 years ago. You go into this new place called Starbucks just out of curiosity and are shocked by the prices. But you have a cup anyway. 2 days later you walk by again. You don't remember the circumstances from 2 days ago, but you remember that you went in and bought a cup of coffee. So you think, "I must have had a good reason. I must be the type of person who drinks at Starbucks." This is anchoring [SLIDE] - the process of seeding a thought in people's minds and having that thought influence their later actions. Whatever you thought you knew about how much coffee should cost, whatever your historical precedent for price was, it is now irrelevant, and you now have a reason to pay more - because you already did.

[SLIDE]

Divide a room in 2 and tell one group to write down 10 reasons why they love their significant other, and tell the other group to write down 3 reasons. Then ask each group how much they love their significant other, along with related questions: how likely they are to stay together, how likely they are to have an affair, etc. Do you think there'd be a difference? Which group would profess more love? Turns out there is a difference: the group that had to write down only 3 reasons professes more love for their significant other. The reason is that if you're trying to think of 10 reasons, you usually run out at 6 or 7 and start thinking, maybe there's not that much to love.

We also have a sort of "yeah, whatever" heuristic. [SLIDE] People tend to stick with the default or starting point. You can see it in how the order of TV shows matters, as well as in retirement fund enrollment and organ donation. I'd like to talk about that last one. [SLIDE] Look at this graph. The number of people who commit to organ donation in a country has little to do with religion, culture, health, need, altruism or medicine. It really only depends on what the default is. When you get a national ID card or a driver's license, are you automatically enrolled and have to check a box to opt out, or are you not automatically enrolled and have to check a box to opt in? It's just checking a box, but people will not do it. It's not that they don't care; it's that organ donation is a difficult and complex thing to think about, so people choose not to decide. If you asked them about their preference, they might have an opinion, but because of this human tic, they act as if they don't.

[SLIDE]

Another interesting case study: researchers took a group of doctors and told them, "We have a patient with hip pain. We've already recommended hip replacement surgery. But it's just been discovered that the patient never tried ibuprofen. What do you do? Ibuprofen or surgery?" The majority logically chose to first try ibuprofen, the anti-inflammatory drug, over an expensive, irreversible, dangerous surgery. A second group was told that there were 2 medications the patient had never tried, ibuprofen and piroxicam. Now what do you do? The conflict between piroxicam and ibuprofen was sufficiently large that 75% of the doctors chose hip replacement. It's not that this was a conscious process of throwing up their hands between 2 medications; rather, the second option introduced just enough noise into the decision that they went with what felt like the decision that was already made.

[SLIDE]

4. Weakness for Authority

We all know that another shortcut people use is submitting to authority. This doesn't need to mean "the man"; it just means going with what someone else says who is presumably more experienced or credible than you, or maybe just someone you like. There's even technology that can determine your favorite friends on Facebook, create an amalgamation of their faces, and use that vaguely familiar and weirdly likable fake person in targeted advertising. So you'll have your own spokesperson, assigned to only you, algorithmically designed to sell one thing just to you. It's coming.

Sometimes our deference to others is rational and sometimes it isn't. But instead of giving you more examples, I'm going to share a funny trick I learned for exploiting this in other people.

Turns out you are afforded a moment of persuasive power immediately after admitting a weakness or a drawback in the case you are making. That's the moment to deploy your strongest argument, because you've just established yourself as a credible source of information.

So consider this - take 2 identical reference letters, both filled with praise and glowing reviews of the candidate, but mention a flaw of the person in one of the letters. More often than not, the letter with the flaw will be more effective. This exact study was conducted using applications for managerial positions at Fortune 500 companies: significantly more interviews came from reference letters that mentioned a candidate's weakness within an otherwise positive description than from letters that were uniformly positive.

[SLIDE]

5. Conclusion

In conclusion: you have loss aversion, consistency bias, and weakness for authority. There's also a weakness for scarcity (why people don't want to let bad relationships die; once something is in danger of being lost, we value it more, even if we otherwise wouldn't care at all). These are just a few examples of the irrational brain. Though we have evolved a well-tuned ability to anticipate and weigh very clear, very short-term dangers, this focus can mislead us when things get more complex. So something to think about next time you're making a decision.

[SLIDE]

Outro

