Our brains evolved tendencies that sometimes, unfortunately, lead to errors: perception gets compromised, decisions rest on wrong or partial information, and reactions are based on a tiny slice of what is available. These are cognitive biases and thinking errors. The good news is that there are things you can do to counter biased thinking.
These errors can occur in how the brain absorbs information – interpreting conversations, reading articles, understanding behavior, judging others’ motivations, and weighing rights and wrongs. Cognitive biases are systematic errors in thinking and perception that push us toward a particular conclusion even when it is inappropriate or wrong; they warp our picture of reality. They emerge from the heuristics our brains acquired through evolution.
The brain evolved quick tendencies for a reason, but many of them are out of context today. For example, jumping to a conclusion such as ‘this food is toxic because of its color’ may have been useful a million years ago, but it serves us poorly now.
These heuristics make us weigh day-to-day information in unproductive ways, especially when we hold ingrained beliefs about ourselves or others. If you tend to be self-critical, you might interpret social cues unfavorably – reading a lack of party invitations as proof that you are not cool enough to hang out with the group. It isn’t easy to counter such thinking, and quick fixes are unlikely to work.
The advantage of such tendencies to jump to conclusions is SPEED & EASE of decision-making. The cost is accuracy. Biology is a product of evolution, not design: evolution optimizes under constraints, so we didn’t get both speed and accuracy by default, the way a well-designed computer might.
A short recap of cognitive biases
Some of these conclusions are more likely to be wrong than right. Take the confirmation bias, the queen of the biases: we notice and remember information that confirms a belief we already hold, give those bits undue importance, and ignore or dismiss information that goes against the belief.
Other biases include the gambler’s fallacy – the belief that the world likes to balance itself out. If you toss a coin 5 times in a row and get heads every time, what do you think the next toss will yield? Heads? Tails? Most people say tails. This is wrong: each toss is an independent event, and previous tosses have no causal relationship with the next one. Believing that a streak of one outcome makes the opposite outcome ‘due’ is exactly this fallacy, and it leads to heavy monetary losses in gambling.
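The independence of coin tosses is easy to check numerically. The sketch below (plain Python, with a made-up number of simulated tosses) records the outcome of every toss that immediately follows a run of five heads – if the gambler’s fallacy were right, tails would dominate; in fact, heads still comes up about half the time:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Simulate a long sequence of fair coin tosses. Whenever a toss
# follows a streak of 5 (or more) heads, record its outcome.
next_after_streak = []
streak = 0  # current run of consecutive heads
for _ in range(1_000_000):
    toss = random.choice("HT")
    if streak >= 5:
        next_after_streak.append(toss)
    streak = streak + 1 if toss == "H" else 0

heads_rate = next_after_streak.count("H") / len(next_after_streak)
print(f"P(heads after 5 heads in a row) ~ {heads_rate:.3f}")  # close to 0.5
```

A streak of five heads occurs before roughly 1 in 32 tosses, so tens of thousands of cases are sampled – more than enough to see that the streak has no effect on the next toss.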
Another pervasive bias is the anchoring effect. Nobel laureate Daniel Kahneman and his colleague Amos Tversky conducted an experiment (Kahneman, Thinking, Fast and Slow, 2011) in which they asked people: what percentage of African countries are members of the United Nations? One group was first asked whether the figure was greater or less than 10%; the other group, whether it was greater or less than 65%. The first group’s answers averaged 25%; the second’s, 45%. The two preliminary questions contained anchors – 10% and 65% – which gave people a starting point to think around, adjust from, and then answer.
This post isn’t about the biases themselves; it is about thinking clearly. Regardless of what the biases are called or where they manifest, there are ways to counter them. There are over 100 documented cognitive biases – books like Thinking, Fast and Slow and The Art of Thinking Clearly cover them in depth, and you won’t be disappointed. The following 8 strategies help you think clearly and objectively in spite of the tendency to jump to wrong conclusions. They counter some of the “parent” biases: the confirmation bias, the interpretation bias, the survivorship bias, and the anchoring bias.
8 strategies to think clearly and objectively: How to overcome thinking mistakes that we make
We have a powerful multi-purpose instrument called the brain, and it can be trained with just a little practice. To overcome thinking errors, you need to let go of many assumptions and accept a new one: you may have already missed meaningful information in your perception, and you have to put in effort to fill in the missing pieces. The gaps could stem from your biases, from faulty logic, or simply from how things work in the world.
1. Focus on the data: In any situation that demands a decision, focus on the evidence – even the unflattering kind. Relevant data might be hard to spot, but it can be found with a little effort, and once this becomes a habit it is nearly effortless. One caveat: the anchoring bias occurs because the first piece of information itself biases you. To counter it, deliberately consider the opposite information and see whether it makes sense.
2. Seek out contrary data and conclusions: Keep an eye on bad reviews and see if they matter to you. A hundred good reviews are great, but a hundred good reviews plus a few bad ones are more informative. This is your best weapon against the confirmation bias, and perhaps the most important technique in this list: if there is data supporting a notion, look for data that doesn’t, or at least try to think in that direction. You’ll have a much clearer picture of everything – and I really mean EVERYTHING. This is, in fact, at the core of scientific investigation; it is how accurate knowledge is built.
3. Understand the noise: Focus on the important aspects of a problem, not every single aspect. Noise is background information that is of no use to you, and filtering it out is hard. Suppose you have to read a huge textbook for an exam. How do you know what to focus on? Much of the book may be useless for the exam, yet it feels important simply because it is in the textbook. Once you know what the exam is about, you can learn to distinguish what seems important from what IS important. The extra information is still a gateway for exploring and discovering what matters – you need to see the noise before you can separate it from the signal. An expert can tell you what to leave out and ignore, but you won’t become an expert until you learn to identify the noise yourself.
4. Test and re-test: Suppose you are talking with a friend and he is unfriendly. You wonder why: perhaps your friendship is changing, he had a bad day, or you said something unpleasant. Instead of settling on one conclusion, test and re-test. Try a similar conversation again, or ask how his day was. Or suppose you conclude that your boss is cranky on Wednesdays. Don’t test that hypothesis only on Wednesdays; observe all days. Maybe your boss is always cranky, or the crankiness was random work stress. This one is tricky: done wrong, you walk straight into the confirmation bias.
5. Make educated guesses: Look for anchors. People ask leading questions that contain information that primes your answer. Identify the anchor, set it aside, and reason from the evidence rather than from the number or idea you were handed.
6. Avoid misattributions: We are sometimes drawn to advertisements by things unrelated to the product – images that evoke an emotional response. Try to isolate that emotion; its purpose may be to compensate for a lack of useful content or to amplify a desirable feature. We often misattribute emotions to a false cause even when the true cause is in plain sight. Consider mobile apps: great restaurants sometimes make terrible apps, and reviewers rate the app for what it is – an app. When you see a low rating, ask whether it is for the food or for the app. The food can be brilliant AND the app can suck; a bad app shouldn’t change your perception of the food’s quality.
7. Have multiple perspectives: Look at a situation from another person’s point of view (empathy), or even literally look at something from a different angle. In both cases you gain new information, and your opinions could change. It’s easier to adopt someone else’s perspective than a purely imaginary one. Flying, for example, can be considered from a pilot’s point of view, a passenger’s, or a technical assistant’s – but there are many more vantage points: a customer-care agent’s, a flying bird’s, even an alien’s. You don’t have to know how someone else actually thinks; your brain will conjure approximations and assumptions that shift your perspective, and that shift is what matters, regardless of whether the guess is accurate.
8. Assume you don’t know what you don’t know: In many situations, it is impossible to understand the full clockwork behind a phenomenon. Let go of assumptions and accept that there are factors at play that may be beyond your comprehension.
The benefits of overcoming cognitive biases and thinking errors
Short answer – better thinking, decision making, and perception.
Longer answer – Humans have advanced, technologically and socially, largely thanks to the pre-frontal cortex and the frontal lobe, which are implicated in executive functions: decision-making, planning, problem-solving, and complex analysis of situations. Cognitive biases interfere with these functions – experts can fail to see good, novel solutions, and people can wrongly feel that their superiors at work perform poorly. By bringing these errors into awareness and mitigating them, you will process and understand the information around you better. Simply by acknowledging the survivorship bias, you can avoid bad productivity advice and shield yourself from misleading success stories. You will make better decisions in stressful and relaxed situations alike, shop better, manage your resources better, and have healthier conversations with fewer misunderstandings. Your personal, professional, and social interactions will improve accordingly.
Did you like this article? If yes, you will love The Art of Thinking Clearly by Rolf Dobelli.
Now, I suppose, you’ll have a few strategies in your quiver to make good decisions by overcoming cognitive biases. Have fun thinking objectively!
Read more: 4 cognitive biases that you should be aware of
Kahneman, Daniel, and Amos Tversky. 1974. “Judgment under Uncertainty: Heuristics and Biases.” Science 185 (4157): 1124–1131.
Hey! Thank you for reading; hope you enjoyed the article. I run Cognition Today to paint a holistic picture of psychology. My content here is referenced in Forbes, CNET, Entrepreneur, Lifehacker, a few books, academic courses, and research papers.
I’m an applied psychologist from Bangalore, India. Love sci-fi, horror media; Love rock, metal, synthwave, and pop music; can’t whistle; can play the guitar.