Heuristics & Cognitive Biases
First, a Brief History
While people like to believe that they are rational and logical, the fact is that people are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.
Sometimes these biases are fairly obvious, and you might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.
Why do these biases happen? Attention is a limited resource. This means we cannot evaluate every detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but sometimes lead to bias.
A heuristic is a mental shortcut that allows people to solve problems and make judgments quickly and efficiently. These rule-of-thumb strategies shorten decision-making time and allow people to function without constantly stopping to think about their next course of action. Heuristics are helpful in many situations, but they can also lead to cognitive biases.
During the 1950s, the Nobel Prize-winning psychologist Herbert Simon suggested that while people strive to make rational choices, human judgment is subject to cognitive limitations. Purely rational decisions would involve weighing such factors as potential costs against possible benefits.
But people are limited by the amount of time they have to make a choice as well as the amount of information they have at their disposal. Other factors, such as overall intelligence and accuracy of perceptions, also influence the decision-making process.
As a result of these limitations, we are forced to rely on mental shortcuts to help us make sense of the world. During the 1970s, psychologists Amos Tversky and Daniel Kahneman presented their research on cognitive biases, proposing that these biases influence how people think and the judgments people make. Simon’s research demonstrated that humans are limited in their ability to make rational decisions, but it was Tversky and Kahneman’s work that identified the specific ways of thinking people rely on to simplify the decision-making process.
Common Uses
Heuristics play important roles in both problem-solving and decision-making. When we are trying to solve a problem or make a decision, we often turn to these mental shortcuts when we need a quick solution. Psychologists have suggested a few different theories for the reasons that we rely on heuristics.
- Attribute substitution: According to this theory, people substitute a simpler but related question in place of a more complex and difficult one.
- Effort reduction: According to this theory, people utilize heuristics as a type of cognitive laziness. Heuristics reduce the mental effort required to make choices and decisions.
- Fast and frugal: Still other theories argue that heuristics are more often accurate than biased. In other words, we use heuristics because they are fast and usually correct.
The world is full of information, yet our brains are only capable of processing a certain amount. If you tried to analyze every single aspect of every situation or decision, you would never get anything done.
In order to cope with the tremendous amount of information we encounter and to speed up the decision-making process, the brain relies on these mental strategies to simplify things so we don’t have to spend endless amounts of time analyzing every detail.
You probably make hundreds or even thousands of decisions every day. What should you have for breakfast? What should you wear today? Should you drive or take the bus? Should you go out for drinks later with your co-workers? The list of decisions you make each day is endless and varied. Fortunately, heuristics allow you to make such decisions with relative ease without a great deal of agonizing.
For example, when trying to decide if you should drive or ride the bus to work, you might suddenly remember that there is road construction along the bus route. You realize that this might slow the bus and cause you to be late for work. So you leave earlier and drive to work on an alternate route. Heuristics allow you to think through the possible outcomes quickly and arrive at a solution.
Types of Heuristics
The following are three examples of cognitive biases that have a powerful influence on how you think, how you feel, and, most importantly, how you behave.
Availability
The availability heuristic involves making decisions based upon how easy it is to bring something to mind. When you are trying to make a decision, you might quickly remember a number of relevant examples. Since these are more readily available in your memory, you will likely judge these outcomes as being more common or frequently occurring.
For example, if you are thinking of flying and suddenly think of a number of recent airline accidents, you might feel like air travel is too dangerous and decide to travel by car instead. Because those examples of air disasters came to mind so easily, the availability heuristic leads you to think that plane crashes are more common than they really are.
Similarly, after seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they actually are.
It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.
Representativeness
The representativeness heuristic involves making a decision by comparing the present situation to the most representative mental prototype. It is a mental shortcut used when making judgments about the probability of an event under uncertainty. When you are trying to decide if someone is trustworthy, you might compare aspects of the individual to other mental examples you hold. A sweet older woman might remind you of your grandmother, so you might immediately assume that she is kind, gentle, and trustworthy.
If you meet someone who is into yoga, spiritual healing and aromatherapy you might immediately assume that she works as a holistic healer rather than something like a school teacher or nurse. Because her traits match up to your mental prototype of a holistic healer, the representativeness heuristic causes you to classify her as more likely to work in that profession.
Affect
The affect heuristic involves making choices that are influenced by the emotions that an individual is experiencing at that moment. For example, research has shown that people are more likely to see decisions as having benefits and lower risks when they are in a positive mood. Negative emotions, on the other hand, lead people to focus on the potential downsides of a decision rather than the possible benefits.
The affect heuristic is at work when someone makes a snap judgment based on a quick impression, sizing up a situation and deciding without further research whether something is good or bad. Naturally, this heuristic can be helpful, but it can also be hurtful when applied in the wrong situation.
How Heuristics Can Lead to Bias
While heuristics can speed up problem-solving and decision-making, they can also introduce errors. As the examples above show, heuristics can lead to inaccurate judgments about how frequently things occur and about how representative certain things may be.
Just because something has worked in the past does not mean that it will work again, and relying on an existing heuristic can make it difficult to see alternative solutions or come up with new ideas.
Heuristics can also contribute to things such as stereotypes and prejudice. Because people use mental shortcuts to classify and categorize people, they often overlook more relevant information and create stereotyped categorizations that are not in tune with reality.
The Confirmation Bias
The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.
Examples include:
- Only paying attention to information that confirms your beliefs about issues such as gun control and global warming.
- Refusing to listen to the opposing side.
- Not considering all of the facts in a logical and rational manner.
- Only following people on social media who share your viewpoints.
- Choosing news sources that present stories that support your views.
There are a few reasons why this happens. One is that seeking only confirming information limits the mental resources we need to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.
People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validate their existing point of view. This is often a sign that confirmation bias is at work, “biasing” their opinions.
The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.
The Hindsight Bias
The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It’s also commonly referred to as the “I knew it all along” phenomenon.
Some examples of the hindsight bias include:
- Insisting that you knew who was going to win a football game once the event is over.
- Believing that you knew all along that one political candidate was going to win an election.
- Saying that you knew you weren’t going to win after losing a coin flip with a friend.
- Looking back on an exam and thinking that you knew the answers to the questions you missed.
- Believing you could have predicted which stocks would become profitable.
In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.
Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas’s confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.
The hindsight bias occurs for a combination of reasons, including our ability to “misremember” previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.
The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.
The Anchoring Bias
The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:
- The first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based.
- Hearing a random number can influence estimates on completely unrelated topics.
- Perhaps the most profound impact of this bias occurs when doctors diagnose patients. A physician’s first impressions of the patient often create an anchoring point that can incorrectly influence all subsequent diagnostic assessments.
Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.
The Misinformation Effect
The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.
For example:
- Research has shown that simply asking questions about an event can change someone’s memories of what happened.
- Watching television coverage may change how people remember the event.
- Hearing other people talk about a memory from their perspective may change your memory of what transpired.
In one classic experiment by memory expert Elizabeth Loftus, people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”
When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.
There are a few factors that may play a role in this phenomenon. New information may get blended with older memories. In other cases, new information may be used to fill in “gaps” in memory.
The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead someone to identify the wrong suspect in a criminal case.
The Actor-Observer Bias
The actor-observer bias is the tendency to attribute our actions to external influences and other people’s actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.
When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:
- You might complain that you botched an important meeting because you had jet lag.
- You might say you failed an exam because the teacher posed too many trick questions.
When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:
- A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
- A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).
While many factors may play a part, perspective is key. When we are the actors in a situation, we can observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves but guess at the internal characteristics that cause other people’s actions.
The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.
The False-Consensus Effect
The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:
- Thinking that other people share your opinion on controversial topics.
- Overestimating the number of people who are similar to you.
- Believing that the majority of people share your preferences.
Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion, even when we are with people outside our circle of family and friends.
Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel “normal” and maintain a positive view of ourselves in relation to other people.
This can lead people not only to incorrectly think that everyone else agrees with them but also to overvalue their own opinions. It also means that we sometimes fail to consider how other people might feel when making choices.
The Halo Effect
The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the “physical attractiveness stereotype” or the “what is beautiful is good” principle, the halo effect influences us, and we use it to influence others, almost every day.
For example:
- Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people.
- Believing that products marketed by attractive people are also more valuable.
- Thinking that a political candidate who is confident must also be intelligent and competent.
One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance, which involves holding contradictory beliefs.
This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.
The Self-Serving Bias
The self-serving bias is the tendency for people to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it’s because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.
Some examples of this:
- Attributing good grades to being smart or studying hard.
- Believing your athletic performance is due to practice and hard work.
- Thinking you got the job because of your merits.
The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.
This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.
The Optimism Bias
The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.
For example, we may assume that negative events won’t affect us such as:
- Divorce
- Job loss
- Illness
- Death
The optimism bias has roots in the availability heuristic. Because you can probably think of examples of bad things happening to other people, it seems more likely that others, rather than you, will be affected by negative events. This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found this optimism bias incredibly difficult to overcome.