57 Cognitive Biases That Screw Up How We Think

Gus Lubin

From attentional bias — where someone focuses on only one or two of several possible outcomes — to zero-risk bias — where we place too much value on reducing a small risk to zero — the sheer number of cognitive biases that affect us every day is staggering.

Understanding these biases is key to suppressing them — and needless to say, it is good to try to be rational in most cases. How else can you have any sort of control over investments, purchases, and all other decisions that you make in your life?

To convey the breadth of cognitive biases, we've picked out 57 of the most notable ones from a much longer list on Wikipedia. [Aimee Groth contributed to an earlier version of this article.]

Attentional bias

When someone focuses on only one or two choices despite there being several possible outcomes. 

Read more about attentional bias.

Availability heuristic

Where people overestimate the importance of information that is available to them.

One example would be a person who argues that smoking is not unhealthy on the basis that his grandfather lived to 100 and smoked three packs a day, an argument that ignores the possibility that his grandfather was an outlier.

Read more about the availability heuristic.

Backfire effect

When you reject evidence that contradicts your point of view, and the contradiction ends up strengthening your original belief instead of changing your mind.

Read more about the backfire effect.

Bandwagon effect

The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink.

Read more about the bandwagon effect.

Belief bias

A bias where people make faulty conclusions based on what they already believe or know. For instance, one might conclude that all tiger sharks are sharks, and all sharks are animals, and therefore all animals are tiger sharks.

Read more about belief bias.

Bias blind spots

If you fail to recognize your own cognitive biases, you have a bias blind spot. Everyone tends to think they're less biased than other people, which is itself a cognitive bias.

Read more about bias blind spot.

Choice-supportive bias

A bias in which you think positive things about a choice once you've made it, even if that choice has flaws. You may say positive things about the dog you just bought and ignore the fact that it bites people.

Read more about choice-supportive bias.

Clustering illusion

This is the tendency to see streaks or clusters in random events. A gambler who watches red come up multiple times in a row on a roulette table may erroneously conclude that red is hot. In a related bias, known as the gambler's fallacy, the gambler may conclude that black is due to come up because it hasn't come up in a while. In fact, each spin is random and independent of the last.

Read more about the gambler's fallacy.
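Random sequences produce longer streaks than intuition expects, which is why the roulette player reads meaning into them. Here is a quick illustrative simulation in Python (our sketch, not part of the original article):

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

# Simulate 100 fair roulette-style outcomes, red (R) or black (B).
flips = [random.choice("RB") for _ in range(100)]

# Find the longest run of a single color.
longest = current = 1
for prev, cur in zip(flips, flips[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)

print("longest streak of one color:", longest)
```

Streaks of six or seven identical outcomes are routine in a sequence this short, yet they look "hot" to an observer primed to see patterns.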

Confirmation bias

The tendency to seek out and believe information that confirms what you already think or believe.

Read more about confirmation bias.

Conservatism bias

Where people give more weight to prior evidence than to new evidence or information that has emerged. People were slow to accept that the earth was round because they clung to the earlier belief that it was flat.

Read more about conservatism.

Curse of knowledge

When people who are smarter or better informed cannot understand the common man. For instance, in the TV show "The Big Bang Theory," it's difficult for scientist Sheldon Cooper to understand his waitress neighbor Penny.

Read more about the curse of knowledge.

Decoy effect

A phenomenon in marketing where consumers have a specific change in preference between two choices after being presented with a third choice.

Read more about the decoy effect.

Denomination effect

People are less likely to spend large bills than their equivalent value in small bills or coins.

Read more about the denomination effect.

Duration neglect

When the duration of an event doesn't factor enough into the way we value it. For instance, we may remember momentary displeasure just as strongly as protracted displeasure.

You can learn more about these biases on The Psy-Fi Blog and Wikipedia.

Empathy gap

Where people in one state of mind fail to understand people in another. If you are happy, you can't imagine why people would be unhappy. When you are not sexually aroused, you can't understand how you act when you are.

Read more about the empathy gap.

Frequency illusion

Where a word, name or thing you just learned about suddenly appears everywhere. Now that you know what that SAT word means, you see it in so many places!

Read more about the frequency illusion.

Galatea effect

Where people succeed because they think they should.

Read more about the Galatea effect.

Halo effect

Where we take one positive attribute of someone and associate it with everything else about that person or thing.

Read more about the halo effect.

Hard-easy bias

Where people are overconfident on easy problems and not confident enough on hard ones.

Read more about the hard-easy bias.


Herd behavior

People tend to flock together, especially in difficult or uncertain times.

Read more about herd behavior.

Hindsight bias

The tendency to see past events as predictable. "I knew all along Philip Phillips would win American Idol." Sure you did...

Read more about hindsight bias.

Hyperbolic discounting

The tendency for people to want an immediate payoff rather than a larger gain later on. Most people would rather take $5 now than $7 in a week.

Read more about hyperbolic discounting.
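The $5-now-versus-$7-later preference can be modeled with a hyperbolic discount curve, V = A / (1 + kD), where D is the delay. A short illustrative sketch in Python (the discount rate k is an assumption for demonstration, not a measured value):

```python
def hyperbolic_value(amount, delay_weeks, k=0.5):
    # Perceived value falls off hyperbolically with delay:
    # V = A / (1 + k * D), with k an illustrative discount rate.
    return amount / (1 + k * delay_weeks)

# Today, $5 now feels worth more than $7 in a week...
print(hyperbolic_value(5, 0) > hyperbolic_value(7, 1))    # True

# ...but push both payoffs a year out and the preference reverses.
print(hyperbolic_value(5, 52) < hyperbolic_value(7, 53))  # True
```

This preference reversal, choosing the smaller-sooner reward only when it is immediate, is the signature of hyperbolic rather than exponential discounting.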

Ideomotor effect

Where an idea causes you to have an unconscious physical reaction, like a sad thought that makes your eyes tear up. This is also how Ouija boards seem to have minds of their own.

Read more about the ideomotor effect.

Illusion of control

The tendency for people to overestimate their ability to control events, like when a sports fan thinks his thoughts or actions had an effect on the game.

Read more about illusion of control.

Illusion of validity

When weak but consistent data leads to confident predictions. Like one commenter noted on the MIT admissions blog:

Why is MIT's admissions process better than random? Say you weeded out the un-qualified (the fewer-than-half of applicants insufficiently prepared to do the work at MIT) and then threw dice to stochastically select among the remaining candidates. Would this produce a lesser class?

Read more about illusion of validity.

Information bias

The tendency to seek information when it does not affect action. More information is not always better.

Read more about information bias.

Inter-group bias

We view people in our group differently than we would someone in another group.

Read more about inter-group bias.

Irrational escalation

Investing more money or resources into something based on prior investment, even if you know it's a bad one. "I already have 500 shares of Lehman Brothers, let's buy more even though the stock is tanking."

Read more about irrational escalation.

Less-is-more effect

With less knowledge, people can often make more accurate predictions.

Read more about less-is-more effect.

Negativity bias

The tendency to put more emphasis on negative experiences rather than positive ones. People with this bias feel that "bad is stronger than good" and will perceive threats more than opportunities in a given situation.

This leads toward loss aversion.

Read more about negativity bias.

Observer-expectancy effect

Our expectations unconsciously influence how we perceive an outcome. Researchers, for example, looking for a certain result in an experiment, may inadvertently manipulate or interpret the results to reveal their expectations. That's why the "double-blind" experimental design was created for the field of scientific research. 

Read more about observer-expectancy effect.

Omission bias

The tendency to judge harmful actions as worse than equally harmful inactions. For example, we consider it worse to crash a car while drunk than to let one's friend crash his car while drunk.

Read more about omission bias.

Ostrich effect

The decision to ignore dangerous or negative information by "burying" one's head in the sand, like an ostrich.  

Read more about the ostrich effect.

Outcome bias

Judging a decision based on its outcome rather than on the quality of the decision at the time it was made. This fails to account for the role luck plays in outcomes.

Read more about outcome bias.


Overconfidence

We are too confident about our abilities, and this causes us to take greater risks in our daily lives.

Read more about overconfidence.


Optimism bias

When we believe the world is a better place than it is, we aren't prepared for the danger and violence we may encounter. The inability to accept the full breadth of human nature leaves us vulnerable.

Read more about optimism bias.

Pessimism bias

This is the opposite of optimism bias. Pessimists overweigh the negative consequences of their own and others' actions.

Read more about pessimism bias.

Placebo effect

A self-fulfilling prophecy, in which believing that something will work causes it to be effective. The same principle drives stock market cycles.

Read more about placebo effect.

Planning fallacy

The tendency to underestimate how much time it will take to complete a task.

Read more about planning fallacy.

Post-purchase rationalization

Making ourselves believe that a purchase was worth it after the fact.

Read more about post-purchase rationalization.

Pro-innovation bias

When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations.

Read more about pro-innovation bias.


Procrastination

Deciding to act in favor of the present moment over investing in the future.

Read more about procrastination.


Reactance

The desire to do the opposite of what someone wants you to do, in order to prove your freedom of choice.

Read more about reactance.


Recency bias

The tendency to weight the latest information more heavily than older data.

Read more about recency.


Reciprocity

The belief that fairness should trump other values, even when it's not in our economic and/or other interests.

Read more about reciprocity.

Regression bias

People take action in response to extreme situations. Then when the situations become less extreme, they take credit for causing the change, when a more likely explanation is that the situation was reverting to the mean.

Read more about regression bias.
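Regression to the mean is easy to demonstrate with simulated scores that are pure luck: pick the worst performers in one round and, with no intervention at all, they improve in the next. An illustrative sketch in Python (the numbers are arbitrary):

```python
import random

random.seed(42)  # fixed seed for repeatability

# Everyone has identical skill (50); all score variation is luck.
round1 = [50 + random.gauss(0, 10) for _ in range(10_000)]
round2 = [50 + random.gauss(0, 10) for _ in range(10_000)]

# Select the 100 worst performers from round 1 and "intervene" by doing nothing.
worst = sorted(range(10_000), key=lambda i: round1[i])[:100]

before = sum(round1[i] for i in worst) / 100
after = sum(round2[i] for i in worst) / 100
print(f"worst group average, round 1: {before:.1f}, round 2: {after:.1f}")
```

The "treated" group improves dramatically, purely because extreme scores revert toward the mean; anyone who intervened between rounds would be tempted to take credit.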

Restraint bias

Overestimating one's ability to show restraint in the face of temptation.

Read more about restraint bias.


Salience

Our tendency to focus on the most easily recognizable features of a person or concept.

Read more about salience.

Seersucker illusion

Over-reliance on expert advice. This has to do with the avoidance of responsibility. We call in "experts" to forecast when, in fact, they have no greater chance of predicting an outcome than the rest of the population. In other words, "for every seer there's a sucker."

Read more about seersucker illusion.

Selective perception

Allowing our expectations to influence how we perceive the world.

Read more about selective perception.

Self-enhancing transmission bias

Everyone shares their successes more than their failures. This leads to a false perception of reality and an inability to accurately assess situations.

Read more about self-enhancing transmission bias.

Status quo bias

The tendency to prefer things to stay the same. This is similar to loss-aversion bias, where people prefer to avoid losses instead of acquiring gains. 

Read more about status quo bias.


Stereotyping

Expecting a group or person to have certain qualities without having real information about the individual. This explains the snap judgments Malcolm Gladwell refers to in "Blink."

Read more about stereotyping.

Survivorship bias

An error that comes from focusing only on surviving examples, causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven't heard about all of the entrepreneurs who have failed.

It can also cause us to assume that survivors are inordinately better than failures, without regard for the importance of luck.

Read more about survivorship bias.
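Survivorship bias can be made concrete with a classic coin-flip thought experiment: give enough people a purely random "strategy" and a few will compile perfect records by luck alone. An illustrative simulation in Python (the numbers are made up):

```python
import random

random.seed(0)  # fixed seed for repeatability

MANAGERS = 10_000
YEARS = 10

# Each "fund manager" beats the market in a given year with coin-flip odds.
perfect_records = sum(
    all(random.random() < 0.5 for _ in range(YEARS))
    for _ in range(MANAGERS)
)

print(perfect_records, "managers beat the market ten years running by luck")
```

A handful of the ten thousand managers end up looking like geniuses, and those are the only track records we ever hear about.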

Tragedy of the commons

We overuse common resources because it's not in any individual's interest to conserve them. This explains the overuse of natural resources, opportunism, and any acts of self-interest over collective interest. 

Read more about the tragedy of the commons.

Unit bias

We believe that there is an optimal unit size, a universally acknowledged amount of a given item that is perceived as appropriate. This explains why we eat more when served larger portions.

Read more about unit bias.

Zero-risk bias

The preference to reduce a small risk to zero versus achieving a greater reduction in a greater risk. 

This plays to our desire to have complete control over a single, more minor outcome, over the desire for more — but not complete — control over a greater, more unpredictable outcome. 

Read more about zero-risk bias.
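The trade-off is easy to quantify. Suppose (with hypothetical numbers) option A eliminates a 1% risk entirely while option B cuts a 50% risk to 30%; zero-risk bias pulls people toward A even though B prevents far more expected harm:

```python
# Hypothetical cleanup options for two independent hazards,
# expressed in percentage points of risk removed.
reduction_a = 1 - 0    # option A: a 1% risk eliminated entirely
reduction_b = 50 - 30  # option B: a 50% risk cut to 30%

# Option B prevents 20x as much expected harm as option A,
# yet the certainty of "zero" makes A feel more attractive.
print(reduction_b // reduction_a)  # prints 20
```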

That's just behavioral biases. There are all sorts of weird things in your head.
