58 Cognitive Biases

That Screw Up
Everything We Do

DRAKE BAER AND GUS LUBIN
JUN. 18, 2014

We like to think we're rational human beings. In fact, we are prone to hundreds of
proven biases that cause us to think and act irrationally. Even believing we're more
rational than everyone else is itself a bias, known as the bias blind spot.
Hoping to clue you (and ourselves) in to the biases that frame our decisions, we've
collected a long list of the most notable ones.



1. Affect heuristic
The way you feel filters the way you interpret the world. Imagine, for instance, that the
words rake, take, and cake blinked on a computer screen for 1/30 of a second. Which
would you recognize? If you're hungry, research suggests that all you'd see is cake.


2. Anchoring bias
People are over-reliant on the first piece of information they hear. In a salary negotiation,
for instance, whoever makes the first offer establishes a range of reasonable possibilities
in each person's mind. Any counteroffer will naturally react to or be anchored by that
opening offer.

"Most people come with the very strong belief they should never make an opening
offer," says Leigh Thompson, a professor at Northwestern University's Kellogg School of
Management. "Our research and lots of corroborating research shows that's completely
backwards. The guy or gal who makes a first offer is better off."




3. Confirmation bias
We tend to listen only to the information that confirms our preconceptions, which is one
of the many reasons it's so hard to have an intelligent conversation about climate change.



4. Observer-expectancy effect
A cousin of confirmation bias, here our expectations unconsciously influence how we
perceive an outcome. Researchers looking for a certain result in an experiment, for
example, may inadvertently manipulate or interpret the results to reveal their
expectations. That's why the "double-blind" experimental design was created for
scientific research.


5. Bandwagon effect
The probability of one person adopting a belief increases based on the number of people
who hold that belief. This is a powerful form of groupthink, and it's one reason meetings
are so unproductive.



6. Bias blind spots
Failing to recognize your cognitive biases is a bias in itself. Notably, Princeton
psychologist Emily Pronin has found that "individuals see the existence and operation of
cognitive and motivational biases much more in others than in themselves."


7. Choice-supportive bias
When you choose something, you tend to feel positive about it, even if the choice has
flaws. You think that your dog is awesome even if it bites people every once in a while
and that other dogs are stupid, since they're not yours.


8. Clustering illusion
This is the tendency to see patterns in random events. It is central to various gambling
fallacies, like the idea that red is more or less likely to turn up on a roulette table after a
string of reds.
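The independence at the heart of that roulette fallacy is easy to check with a quick
simulation. This is a hypothetical sketch that simplifies the wheel to red/black with
equal odds, ignoring the green zeros:

```python
import random

random.seed(42)

def spin():
    # Simplified roulette: red or black with equal probability (no green zeros)
    return random.choice(["red", "black"])

history = [spin() for _ in range(200_000)]

# Count how often red follows a streak of three reds
streaks = 0
red_after_streak = 0
for i in range(3, len(history)):
    if history[i - 3:i] == ["red"] * 3:
        streaks += 1
        if history[i] == "red":
            red_after_streak += 1

print(f"P(red | three reds in a row) = {red_after_streak / streaks:.3f}")
```

However long the streak, the conditional frequency stays near the base rate of 0.5,
because each spin is independent: the wheel has no memory.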
9. Conservatism bias
Where people believe prior evidence more than new evidence or information that has
emerged. People were slow to accept the fact that the Earth was round because they
maintained their earlier understanding the planet was flat.


10. Conformity
This is the tendency of people to conform with other people. It is so powerful that it may
lead people to do ridiculous things, as shown by the following experiment by Solomon
Asch:

Ask one subject and several fake subjects (who are really working with the experimenter)
which of lines B, C, D, and E is the same length as line A. If all of the fake subjects say
that D is the same length as A, the real subject will agree with this objectively false
answer a shocking three-quarters of the time.

"That we have found the tendency to conformity in our society so strong that reasonably
intelligent and well-meaning young people are willing to call white black is a matter of
concern," Asch wrote. "It raises questions about our ways of education and about the
values that guide our conduct."


11. Curse of knowledge
When people who are well-informed cannot see things from the perspective of people who know less. For
instance, in the TV show "The Big Bang Theory," it's difficult for scientist Sheldon Cooper
to understand his waitress neighbor Penny.


12. Decoy effect
A phenomenon in marketing where consumers have a specific change in preference
between two choices after being presented with a third choice. Offer two sizes of soda and
people may choose the smaller one; but offer a third even larger size, and people may
choose what is now the medium option.


13. Denomination effect
People are less likely to spend large bills than their equivalent value in small bills or
coins.
14. Duration neglect
When the duration of an event doesn't factor enough into the way we consider it. For
instance, we remember momentary pain just as strongly as long-term pain.


15. Availability heuristic
When people overestimate the importance of information that is available to them.
For instance, a person might argue that smoking is not unhealthy on the basis that his
grandfather lived to 100 and smoked three packs a day, an argument that ignores the
possibility that his grandfather was an outlier.


16. Empathy gap
Where people in one state of mind fail to understand people in another state of mind. If
you are happy you can't imagine why people would be unhappy. When you are not
sexually aroused, you can't understand how you act when you are sexually aroused.


17. Frequency illusion
Where a word, name or thing you just learned about suddenly appears everywhere.


18. Fundamental attribution error
This is where you attribute a person's behavior to an intrinsic quality of her identity
rather than the situation she's in. For instance, you might think your colleague is an
angry person, when she is really just upset because she stubbed her toe.


19. Galatea Effect
Where people succeed or underperform because they think they should. It is also
referred to as a self-fulfilling prophecy.

20. Halo effect
Where we take one positive attribute of someone and associate it with everything else
about that person or thing.

21. Hard-Easy bias
Where someone is overconfident on easy problems and not confident enough on hard
problems.


22. Herding
People tend to flock together, especially in difficult or uncertain times.


23. Hindsight bias
Of course Apple and Google would become the two most important companies in
phones. Tell that to Nokia, circa 2003.


24. Hyperbolic discounting
The tendency for people to want an immediate payoff rather than a larger gain later on.
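One standard way to model this is Mazur's hyperbolic discount function, V = A / (1 + kD),
where A is the amount, D the delay, and k a fitted discount rate. The sketch below uses an
illustrative k = 0.2 per day (an assumption, not a measured value) to show the
characteristic preference reversal:

```python
def hyperbolic_value(amount, delay_days, k=0.2):
    """Mazur's hyperbolic discounting: V = A / (1 + k * D)."""
    return amount / (1 + k * delay_days)

# Near-term choice: the smaller, sooner payoff wins
now = hyperbolic_value(100, 0)        # 100.00
tomorrow = hyperbolic_value(110, 1)   # 110 / 1.2 = ~91.67
print(now > tomorrow)   # True: take $100 today over $110 tomorrow

# Shift both options 30 days out and the preference reverses
later = hyperbolic_value(100, 30)       # 100 / 7.0 = ~14.29
later_plus = hyperbolic_value(110, 31)  # 110 / 7.2 = ~15.28
print(later_plus > later)  # True: now $110 in 31 days looks better
```

The same two rewards, separated by the same one-day gap, flip in attractiveness as both
move further into the future, which is exactly the impulsiveness this bias describes.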



25. Ideomotor effect
Where an idea causes you to have an unconscious physical reaction, like a sad thought
that makes your eyes tear up. This is also how Ouija boards seem to have minds of their
own.


26. Illusion of control
The tendency for people to overestimate their ability to control events, like when a sports
fan thinks his thoughts or actions had an effect on the game.


27. Information bias
The tendency to seek information when it does not affect action. More information is not
always better. Indeed, with less information, people can often make more accurate
predictions.


28. Inter-group bias
We view people in our group differently from how we see someone in another group.

29. Irrational escalation
When people make irrational decisions based on past rational decisions. It may happen
in an auction, when a bidding war spurs two bidders to offer more than they would
otherwise be willing to pay.


30. Negativity bias
The tendency to put more emphasis on negative experiences rather than positive ones.
People with this bias feel that "bad is stronger than good" and will perceive threats more
than opportunities in a given situation. Psychologists argue it's an evolutionary
adaptation: it's better to mistake a rock for a bear than a bear for a rock.


31. Omission bias
The tendency to prefer inaction to action, in ourselves and even in politics. Psychologist
Art Markman gave a great example back in 2010:
Omission bias creeps into our judgment calls on domestic arguments, work mishaps, and
even national policy discussions. In March, President Obama pushed Congress to enact
sweeping health care reforms. Republicans hope that voters will blame Democrats for any
problems that arise after the law is enacted. But since there were problems with health
care already, can they really expect that future outcomes will be blamed on Democrats,
who passed new laws, rather than Republicans, who opposed them? Yes, they can: the
omission bias is on their side.


32. Ostrich effect
The decision to ignore dangerous or negative information by "burying" one's head in the
sand, like an ostrich.


33. Outcome bias
Judging a decision based on the outcome rather than how exactly the decision was
made in the moment. Just because you won a lot in Vegas doesn't mean gambling your
money was a smart decision.


34. Overconfidence
Some of us are too confident about our abilities, and this causes us to take greater risks in
our daily lives.
35. Over-optimism
When we believe the world is a better place than it is, we aren't prepared for the danger
and violence we may encounter. The inability to accept the full breadth of human nature
leaves us vulnerable.


36. Pessimism bias
This is the opposite of the over-optimism bias. Pessimists overweight the negative
consequences of their own and others' actions.


37. Placebo effect
Where believing that something is happening helps cause it to happen. This is a basic
principle of stock market cycles, as well as a supporting feature of medical treatment in
general.


38. Planning fallacy
The tendency to underestimate how much time it will take to complete a task.


39. Post-purchase rationalization
Making ourselves believe that a purchase was worth the value after the fact.


40. Priming
Where, if you're introduced to one idea, you'll more readily identify related ideas. Let's
take an experiment as an example, from Less Wrong:
Suppose you ask subjects to press one button if a string of letters forms a word, and
another button if the string does not form a word. (e.g., "banack" vs. "banner".) Then
you show them the string "water". Later, they will more quickly identify the string
"drink" as a word. This is known as "cognitive priming"
Priming also reveals the massive parallelism of spreading activation: if seeing "water"
activates the word "drink", it probably also activates "river", or "cup", or "splash"


41. Pro-innovation bias
When a proponent of an innovation tends to overvalue its usefulness and undervalue its
limitations.
42. Procrastination
Deciding to act in favor of the present moment over investing in the future.


43. Reactance
The desire to do the opposite of what someone wants you to do, in order to prove your
freedom of choice.


44. Recency
The tendency to weigh the latest information more heavily than older data.


45. Reciprocity

The belief that fairness should trump other values, even when it's not in our economic or
other interests.


46. Regression bias
People take action in response to extreme situations. Then when the situations become
less extreme, they take credit for causing the change, when a more likely explanation is
that the situation was reverting to the mean.
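A small simulation makes the statistical point: if scores are underlying skill plus
independent noise, the follow-up to an extreme score drifts back toward the mean with no
intervention at all. The skill level and noise spread here are illustrative assumptions:

```python
import random

random.seed(0)

MEAN_SKILL = 50

def performance():
    # Each observed score is true skill plus independent random noise
    return MEAN_SKILL + random.gauss(0, 10)

# Pairs of consecutive, independent performances
first = [performance() for _ in range(100_000)]
second = [performance() for _ in range(100_000)]

# Select only the cases where the first score was extremely bad
bad_first = [(a, b) for a, b in zip(first, second) if a < 35]
avg_bad = sum(a for a, _ in bad_first) / len(bad_first)
avg_next = sum(b for _, b in bad_first) / len(bad_first)

print(f"after a bad score ({avg_bad:.1f}), the next one averages {avg_next:.1f}")
```

The follow-up scores cluster back around 50 even though nothing was done between the two
measurements, which is why "the intervention worked" is often the wrong conclusion.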


47. Restraint bias
Overestimating one's ability to show restraint in the face of temptation.


48. Salience
Our tendency to focus on the most easily-recognizable features of a person or concept.


49. Scope insensitivity
This is where your willingness to pay for something doesn't correlate with the scale of the
outcome.
From Less Wrong:
Once upon a time, three groups of subjects were asked how much they would pay to save
2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The
groups respectively answered $80, $78, and $88. This is scope insensitivity or scope
neglect: the number of birds saved, the scope of the altruistic action, had little effect
on willingness to pay.


50. Seersucker Illusion
Over-reliance on expert advice. This has to do with the avoidance of responsibility. We
call in "experts" to forecast when typically they have no greater chance of predicting an
outcome than the rest of the population. In other words, "for every seer there's a sucker."


51. Selective perception
Allowing our expectations to influence how we perceive the world.


52. Self-enhancing transmission bias
Everyone shares their successes more than their failures. This leads to a false perception
of reality and inability to accurately assess situations.


53. Status quo bias
The tendency to prefer things to stay the same. This is similar to loss-aversion bias, where
people prefer to avoid losses instead of acquiring gains.


54. Stereotyping
Expecting a group or person to have certain qualities without having real information
about the individual. This explains the snap judgments Malcolm Gladwell refers to in
"Blink." While there may be some value to stereotyping, people tend to overuse it.


55. Survivorship bias
An error that comes from focusing only on surviving examples, causing us to misjudge a
situation. For instance, we might think that being an entrepreneur is easy because we
haven't heard of all of the entrepreneurs who have failed.

It can also cause us to assume that survivors are inordinately better than failures, without
regard for the importance of luck or other factors.


56. Tragedy of the commons
We overuse common resources because it's not in any individual's interest to conserve
them. This explains the overuse of natural resources, opportunism, and any acts of self-
interest over collective interest.


57. Unit bias
We believe that there is an optimal unit size, or a universally-acknowledged amount of a
given item that is perceived as appropriate. This explains why when served larger
portions, we eat more.


58. Zero-risk bias
The preference to reduce a small risk to zero versus achieving a greater reduction in a
greater risk.

This plays to our desire to have complete control over a single, more minor outcome, over
the desire for more but not complete control over a greater, more unpredictable
outcome.
