8 powerful ways to overcome thinking errors and cognitive biases

Disclaimer: Links to some products earn us a commission.

Cognitive biases and thinking errors are everywhere. Our brains have evolved tendencies that sometimes, unfortunately, lead us astray: our perception is compromised, our decisions rest on wrong or partial information, and our reactions are based on a tiny slice of what is actually going on. But there are things you can do to counter biased thinking.

These can be errors in how the brain absorbs information – interpreting conversations, reading articles, understanding behavior, judging others’ motivations, deciding rights and wrongs. Cognitive biases are errors in thinking and perception that push us to think in a particular way even when that way is inappropriate or wrong. They systematically warp our picture of reality, and they emerge from the heuristics our brains acquired through evolution.

The brain evolved these quick tendencies for a reason, but many of them are out of context today. For example, jumping to a conclusion such as ‘this food is toxic because of its color’ may have been useful back in the day – you know, a million years ago – but it is far less useful now. These decision-making shortcuts, called “heuristics,” are often useful but not always accurate. One reason we have so many of them is that heuristics demand less mental effort, which is economical for the brain.

These ‘heuristics’ can highlight day-to-day information in unproductive ways, especially when you already hold strong assumptions about yourself or others. If you tend to be self-critical, you might interpret social cues unfavorably. You might read a lack of party invitations, for example, as proof that you aren’t cool enough to hang out with the group. It isn’t easy to counter such thinking, and quick fixes are unlikely to work.

Cognitive biases arise from limited attention to information in the environment, poor memory for events and details, heuristics that simplify information, and a narrow range of experiences. One study proposes that biased attention can lead to biased interpretation, which in turn can lead to biased memory, at least in subclinically depressed people. If we are intolerant of uncertainty and ambiguity, we may be more prone to a negative interpretation bias. Anxiety, likewise, comes with a bias to interpret ambiguous events negatively, overestimate the likelihood of negative events, and focus on threatening information in the environment.

The advantage of these jump-to-conclusion tendencies is SPEED and EASE of decision-making. The trade-off is accuracy. Biology is a product of evolution, not design: it optimizes under constraints, so we didn’t get both speed and accuracy by default, the way a well-engineered computer might.

A short recap of cognitive biases

Some conclusions are more likely to be wrong than right because they merely confirm beliefs we already hold, by selecting bits of information and giving those bits undue importance. This is the confirmation bias: we notice and remember information that supports our belief and ignore or dismiss information that goes against it. It is the queen of the biases.

Other biases include the gambler’s fallacy: we somehow believe that the world likes to balance itself out. If you toss a fair coin 5 times and get heads every time, what do you think the next toss will yield? Heads? Tails? Most people believe it will be tails. This is wrong. Previous coin tosses have no causal relationship with the next toss; they are independent events. People assume that when something happens many times in a row, the opposite becomes due. This error leads to heavy monetary losses in gambling.
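If you want to convince yourself of this independence, a quick simulation helps. Below is a minimal sketch in Python (purely illustrative, not taken from any study cited here) that flips a fair virtual coin a million times and checks how often heads comes up immediately after a run of five heads. The rate stays near 50%, not tilted toward tails.

```python
import random

def heads_rate_after_streak(trials=1_000_000, streak_len=5):
    """Estimate P(heads) on the flip that immediately follows a run of 5 heads."""
    run = 0           # current run of consecutive heads
    after_streak = 0  # flips observed right after a 5-heads run
    heads_after = 0   # how many of those flips came up heads
    for _ in range(trials):
        flip = random.random() < 0.5   # True = heads (fair coin)
        if run >= streak_len:
            after_streak += 1
            heads_after += flip
        run = run + 1 if flip else 0
    return heads_after / after_streak

print(f"P(heads right after 5 heads in a row) = {heads_rate_after_streak():.3f}")  # ~0.500
```

However many heads precede it, the next toss is still a coin flip; the “balance” people expect simply never shows up in the tally.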

Another pervasive bias is the anchoring effect. Nobel Laureate Daniel Kahneman and his colleague Amos Tversky conducted an experiment (Kahneman, Thinking, Fast and Slow, 2011) in which people were asked the following question: What percentage of African countries are members of the United Nations? Two equal groups were created. One group was first asked whether the figure was greater or less than 10%; the other, whether it was greater or less than 65%. The first group’s answers averaged 25%, the second’s 45%. The two preliminary questions contained anchors – 10% and 65% – numbers that gave people a starting point to think around, build assumptions on, and then answer.

This post isn’t about the biases themselves; it is about thinking clearly. Regardless of what the biases and errors are called or where they show up, there are ways to counter them. There are over 100 documented cognitive biases, and I highly recommend The Art of Thinking Clearly and Thinking, Fast and Slow to learn more about them; you won’t be disappointed. The following 8 strategies help you think clearly and objectively in spite of these tendencies to jump to wrong conclusions. They counter some of the “parent” cognitive biases like the confirmation bias, the interpretation bias, the survivorship bias, and the anchoring bias.

8 strategies to think clearly and objectively: How to overcome thinking mistakes that we make

We have a powerful multi-purpose instrument called the brain, and it can be trained with just a little practice. To overcome thinking errors, you need to let go of many assumptions and adopt a new one: you have probably already missed meaningful information in your perception, and you have to put in effort to fill in the missing pieces. The gaps could be due to your biases, the limits of your reasoning, or simply the way things work in the world.

1. Focus on the data: In any situation that demands a decision, focus on the evidence or information available – even the unflattering kind. Data can be hard to spot, but it can be found with a little effort, and once this becomes a habit, it’s nearly effortless. Be careful, though: the anchoring bias occurs because the information itself biases you. To counter it, deliberately consider the opposite information and see whether it also makes sense.

2. Seek out contrary data and conclusions: Keep an eye on bad reviews and see whether they matter to you. A hundred good reviews are great, but a hundred good reviews alongside a few bad ones are more informative. This is your best weapon against the confirmation bias, and perhaps the most important technique on this list: if there is data that supports a notion, look for data that doesn’t, or at least try to think in that direction. You’ll have a much clearer picture of everything. And I really mean EVERYTHING. In fact, this is at the core of scientific investigation; it is how accurate knowledge builds.

3. Understand the noise: Focus on the important aspects of a problem, not every single aspect. Noise is background information that is of no use to you, and it is hard to filter out, but even noise has a use once you learn to recognize it. Suppose you have to read a huge textbook for an exam. How do you know what to focus on? A lot of the book’s information may be useless for the exam, yet it can feel important simply because it is in the textbook. Once you know what the exam is really about, you can learn to tell what merely seems important from what IS important. At the same time, the extra material can be a gateway to exploring and figuring out what matters, so you sometimes need to see the noise in order to recognize the signal, and only then separate the two. An expert may tell you what to leave out and ignore, but you won’t become an expert yourself until you learn to identify the noise.

4. Test and re-test: Consider an example. You are talking with a friend, and he is not being friendly. You wonder why: perhaps the friendship is changing, perhaps he had a bad day, perhaps you said something unpleasant. Instead of settling on one of these conclusions, test and re-test. Try having a similar conversation again, or simply ask how his day was. Or suppose you’ve concluded that your boss is cranky on Wednesdays. Don’t just test the hypothesis on Wednesdays; test it on all days. Maybe your boss is always cranky, or the crankiness was random – work stress, perhaps? This is tricky, because if done wrong, you walk right into the confirmation bias. (A hypothetical tally sketch after this list shows what testing all days could look like.)

5. Make educated guesses: Look for anchors. People ask leading questions that contain information which primes others to think in a certain way. Say someone offers, ‘He isn’t that bad a guy.’ The likely response is ‘Yeah, he isn’t that bad’, but the answer could very well have been ‘He is an awesome guy’ if the prompt had been ‘He is a pretty good guy.’ When trying to make a genuinely educated guess, rethink your assumptions, spot the anchors, use the data, and work your way to an answer.

6. Avoid misattributions: Sometimes we are drawn to advertisements by things unrelated to the advertised product, such as images that evoke emotional responses. Try to isolate that emotion; its purpose may be to compensate for a lack of useful content or to amplify a desirable feature. We often misattribute emotions to a false cause even when the true cause is right there. Take mobile apps. Sometimes great restaurants make terrible apps, and reviewers rate the app for what it is – an app. When you see a low rating, can you tell whether it is for the food or for the app? The app may deserve its low rating while the food is excellent, yet we misattribute the rating and assume the food is bad. Is the food bad because the app is bad? No. The food can be brilliant AND the app can be terrible; the app shouldn’t change the perceived quality of the food.

7. Have multiple perspectives: You can look at a situation from another person’s point of view (empathy), or even literally look at something from a different angle. In both cases you gain new information, and your opinions may change. It’s easier to think from someone else’s perspective than from a purely imaginary one. Flying, for example, can be considered from a pilot’s point of view, a passenger’s, or a technician’s. But there are many more vantage points: a customer-care agent’s, a bird’s, even an alien’s. The point is that you don’t need to know exactly how someone else thinks; your brain will conjure approximations and assumptions that shift your perspective, and that shift is what matters.

8. Assume you don’t know what you don’t know: In many situations it is impossible to understand all the clockwork behind a phenomenon. Let go of your assumptions and accept that there may be factors at play beyond your current comprehension – the unknown unknowns. You can’t always know what you don’t know, though sometimes you can find out. For example, there is a common debate among audiophiles, laypeople, musicians, and audio technicians about a song’s file size and quality. Someone who knows what a waveform looks like can argue that a larger file contains more content. That premise is valid, but it hides an assumption: that all of that content actually translates into better perceived quality. It doesn’t. The unknown unknown for many people is that some content can be removed because our auditory system doesn’t register it; quieter sounds that closely follow louder ones, for instance, get masked and go unheard. For all practical purposes, that content is useless to listeners. (The toy sketch just below illustrates the idea of discarding content that won’t be missed.)
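To make the audio example concrete, here is a toy sketch assuming Python with numpy. It is nothing like a real codec (MP3 and friends rely on detailed psychoacoustic masking models); it only throws away every spectral component that is more than 60 dB quieter than the loudest one, and then shows that almost all of the raw data can be dropped while the reconstructed waveform stays essentially identical to the original.

```python
import numpy as np

# Toy illustration of "discard what won't be missed": keep only spectral
# components within 60 dB of the loudest one. This is NOT how real codecs
# work; it just shows that most raw content can go without changing much.
rate = 44_100                      # samples per second
t = np.arange(rate) / rate         # one second of audio
signal = np.sin(2 * np.pi * 440 * t) + 1e-4 * np.sin(2 * np.pi * 9000 * t)

spectrum = np.fft.rfft(signal)
threshold = np.abs(spectrum).max() / 1000      # 60 dB below the peak
kept = np.where(np.abs(spectrum) >= threshold, spectrum, 0)

reconstructed = np.fft.irfft(kept, n=len(signal))
error = np.max(np.abs(signal - reconstructed))

print(f"kept {np.count_nonzero(kept)} of {len(spectrum)} spectral components")
print(f"max difference from the original: {error:.6f}")  # tiny vs. amplitude ~1.0
```

The point isn’t the code; it’s that “more data” and “more perceived quality” are not the same thing, and unless you know about masking, that is exactly the kind of thing you wouldn’t know you don’t know.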

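And returning to strategy 4’s cranky-boss example: the sketch below uses a small, made-up log of observations (hypothetical data, plain Python) to show why the hypothesis has to be tested on every weekday, not just on Wednesdays.

```python
from collections import defaultdict

# Hypothetical log of (weekday, was_cranky) observations -- invented data,
# purely to illustrate testing a hunch against ALL conditions, not just
# the ones that would confirm it.
observations = [
    ("Mon", True), ("Mon", True), ("Mon", False),
    ("Tue", True), ("Tue", False), ("Tue", True),
    ("Wed", True), ("Wed", True), ("Wed", False),
    ("Thu", False), ("Thu", True), ("Thu", True),
    ("Fri", True), ("Fri", True), ("Fri", False),
]

counts = defaultdict(lambda: [0, 0])   # day -> [cranky days, total days]
for day, cranky in observations:
    counts[day][0] += int(cranky)
    counts[day][1] += 1

for day, (cranky, total) in counts.items():
    print(f"{day}: cranky on {cranky}/{total} days")
# Every day comes out at 2 cranky days out of 3 -- the boss isn't especially
# cranky on Wednesdays, just cranky often. Logging only Wednesdays would
# have "confirmed" the hunch.
```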
The benefits of overcoming cognitive biases and thinking errors

Short answer – better thinking, decision making, and perception.

Longer answer – Humans have advanced, technologically and socially, largely thanks to the prefrontal cortex and the wider frontal lobe, which are implicated in executive functions. Executive functions guide decision-making, planning, problem-solving, complex analysis of situations, and so on. Cognitive biases interfere with these functions: experts can fail to see good, novel solutions to problems, and people can misjudge how well their superiors at work actually perform. By bringing these errors into awareness and mitigating them, you will process and understand the information around you better. Simply by acknowledging the survivorship bias, for instance, you can avoid bad productivity advice and shield yourself from misleading success stories. You will know how to make better decisions in stressful and relaxed situations alike. You will shop better, manage your resources better, and have healthier conversations with fewer misunderstandings. Personal, professional, and social interactions will improve significantly as you learn to make better judgments.



Did you like this article? If yes, you will love The Art of Thinking Clearly by Rolf Dobelli.

You now have a few strategies in your quiver for making good decisions by overcoming cognitive biases. Have fun thinking objectively!


Read more: 4 cognitive biases that you should be aware of

Additional Resources:

Dobelli, Rolf. 2013. The Art of Thinking Clearly.

Kahneman, Daniel. 2011. Thinking, Fast and Slow.

Tversky, Amos, and Daniel Kahneman. 1974. “Judgment under Uncertainty: Heuristics and Biases.” Science 185 (4157): 1124–1131.


2 thoughts on “8 powerful ways to overcome thinking errors and cognitive biases”

  1. Bro.. The first item here says focus on data. That’s true, but don’t you think sometimes we need to be really optimistic and hold unrealistic beliefs to achieve success? Here I am referring to UPSC exams, where the data says 1 in 790 applicants gets selected, but those who got selected say they just had faith and never worried about the competition. How do we move ahead at such crossroads?

    • Hey, yes, focusing on data is important to overcome biases. However, it doesn’t mean one shouldn’t be optimistic. Both are compatible with each other. In your example of UPSC, it is good to be optimistic but it is also important to know what to expect.

      The catch is a cognitive bias known as the survivorship bias: those who were selected (the “survivors”) say they had faith and never worried. But that’s only half the story. On the other side are people who also had faith and never worried, and still failed. Having faith and not worrying didn’t cause the survivors’ success.

      And that’s not all: some people, in retrospect, downplay the negative emotions they felt on the way to a positive outcome.

      In competitive situations like these, regulating emotions is important – whether it’s the pressure, fear of failure, stress, anxiety, lack of confidence, the thought of disappointing others, and so on. These factors do affect performance, and sometimes they hamper the preparation process too.

      Regarding unrealistic beliefs: they aren’t bad at all. It’s nice to dream and think big. But there is a point where over-optimism becomes a problem, because it can slide into overconfidence, lack of effort, or relying on luck. Thinking big also requires some amount of strategy, planning, healthy coping with failure, luck, good decisions, help, and so on.

      The optimism you are talking about sounds like a great way to approach and seize an opportunity. Many times, it is exactly this optimism that puts people in the right place at the right time.

