carmenkolcsar

Intelligence traps, delusions and a few cognitive biases

Updated: Jan 13, 2023

In a classic 1977 study, 94 percent of professors rated themselves above average relative to their peers.

According to a more recent study performed in 2018, 65% of Americans believe they are above average in intelligence. A superficial search on the internet will reveal countless studies with similar results across the world.

The human tendency to overestimate one's capabilities relative to others is known as the Dunning-Kruger effect and is strongly correlated with a cognitive bias called illusory superiority.

In 1966, Peter Cathcart Wason created a logic puzzle now known as the Wason selection task.

He presented four cards, each with a number on one face and a colour on the other. The visible faces showed 3, 8, red, and brown.

The participants were asked which cards they would turn over to test the hypothesis that if a card shows an even number on one face, then its opposite face is red. Surprisingly for the time (though no longer surprising when the test was re-run in 1993), fewer than 10% of participants gave the correct answer.

Take a bit of time to answer this question.

As David McRaney puts it in his book "You Are Not So Smart", the misconception is that we consider ourselves rational, logical beings who see the world as it really is, but the truth is that we are as deluded as everybody else.

So, even if you got this one right, there are many other tests and situations where your logic can, and will, fail.

As human beings, we often fail to truly understand why we do the things we do, yet we are very good at constructing elaborate, false explanations for our behaviour.

If you answered “the 8 card and the brown card” to the Wason selection task earlier, then you are among the fewer than 10% who got it right. If you gave a different answer, don’t feel too bad; you can still be above average.
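The logic of the correct answer can be made explicit in a few lines of code. This is an illustrative sketch (not from the original study): it simply asks, for each visible face, whether turning that card over could possibly reveal a violation of the rule “even number implies red”.

```python
# Sketch (illustrative): which cards in the Wason selection task can
# falsify the rule "if a card shows an even number, its other face is red"?
# Only a card that might hide an even number paired with a non-red face
# is worth turning over.

def could_falsify(visible):
    """True if turning this card over could reveal a rule violation."""
    if isinstance(visible, int):
        # An even number might hide a non-red face; an odd number is irrelevant.
        return visible % 2 == 0
    # A non-red colour might hide an even number; a red face is irrelevant.
    return visible != "red"

cards = [3, 8, "red", "brown"]
print([c for c in cards if could_falsify(c)])  # → [8, 'brown']
```

Note that the red card tells you nothing: the rule does not claim that only even numbers have red backs, so finding an odd number behind it would not be a violation.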

Priming is a phenomenon whereby exposure to one stimulus influences the response to a subsequent stimulus, without conscious guidance or intention. Since psychological studies of priming became mainstream in 1996, a huge body of evidence has accumulated. A famous example of priming is the use of scents to trigger a buying decision.

For example, if you can smell freshly baked bread in a grocery store or supermarket, the chances are you’ll buy more food, not only bread. If a house you are viewing smells like a pie fresh out of the oven, chances are you’ll shortlist that house.

Of course, any of the senses can be used: if the walls of your office hallways are decorated with natural landscapes, you’ll walk more slowly than if they are decorated with urban scenes.

We struggle to admit that unnoticed external factors influence how we behave and feel. In fact, we are too proud to admit that we can easily be influenced and manipulated, and too busy to spend time retroactively analyzing our behaviours and feelings and acknowledging the causal relationships between events. But the truth is that only by acknowledging and understanding these influences can we become more skilled at handling them.

Knowing what the logical fallacies and cognitive biases are is the first step toward a life and a career with more conscious and correct decisions. One of the best resources in this area is "The Art of Thinking Clearly" by Rolf Dobelli, but if time is short, the Wikipedia Cognitive Biases Codex will do.

Cognitive biases range from obvious to subtle to unnoticeable, but their categorization alone does not make us, by default, less prone to error.

Let’s take a few examples:

Thinking that omelets should be avoided because Einstein did not eat them is a logical mistake known as authority bias: we have simply transferred Einstein’s authority in physics to nutrition. Similarly, an executive giving detailed recommendations outside their area of competence and being taken seriously is an example both of somebody misusing the authority they have and of us failing to spot the authority bias.

Now imagine you are in a classroom and your teacher gives you a number sequence; for simplicity, let that sequence be 2-4-6. You are asked to find the underlying rule of the sequence. You propose the next number and the teacher tells you whether or not it fits the rule. You can have as many tries as you want for the numbers, but you can guess the rule only once. Take a few moments to think about what the rule is.

This was a real experiment. Most students asked whether 8 was the next number. The teacher replied, “fits the rule”. Then they asked about 10, 12 and 14; the answer was always “fits the rule”. So the students concluded that the rule was “each number is the previous number plus 2”. But this was not the rule.

Another student asked if -3 was next, and the answer was “does not fit the rule”. Then he asked about 67 and got “fits the rule”. Then various other unlikely numbers: 234 (“fits the rule”), 1008 (“fits the rule”), 7 (“does not fit the rule”). After more iterations, apparently having a rule in mind and being unable to find counterexamples to it, the student formulated the rule: “the numbers must be in ascending order”. The professor said, “correct, that’s the rule”.
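The trap in this task becomes obvious if you write both rules down as code. This is an illustrative sketch (my own, not from the experiment): a sequence that keeps adding 2 satisfies both the hidden rule and the students' hypothesis, so confirming guesses can never tell them apart; only a number that breaks the hypothesis, like 67, carries information.

```python
# Sketch (illustrative): the 2-4-6 task. The teacher's hidden rule is
# "ascending order"; the first students' hypothesis is "previous number + 2".

def teacher_rule(seq):
    """Hidden rule: the numbers must be strictly ascending."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def student_hypothesis(seq):
    """Guessed rule: each number is the previous one plus 2."""
    return all(b == a + 2 for a, b in zip(seq, seq[1:]))

confirming = [2, 4, 6, 8, 10]    # fits BOTH rules: teaches you nothing new
disconfirming = [2, 4, 6, 67]    # fits the hidden rule, breaks the hypothesis

print(teacher_rule(confirming), student_hypothesis(confirming))        # True True
print(teacher_rule(disconfirming), student_hypothesis(disconfirming))  # True False
```

Every “plus 2” guess lands in the region where the two rules agree, which is exactly why the first group of students felt increasingly confident while learning nothing.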

What this last student did was avoid the confirmation bias trap, also called “the mother of all misconceptions”.

Pay close attention to it, as it appears every day, in all aspects of life. It is our natural tendency to dismiss information that does not fit what we already believe to be true or correct. You may have noticed it when finding so much supporting evidence for your chosen political party, your religious beliefs, the new car you set your mind on, or the part of the city where you bought an apartment.

Last but not least, if you now tend to see yourself as less biased than other people, that too has a name: the bias blind spot.
