MIND Guest Blog

Commentary invited by editors of Scientific American Mind

The Bias within the Bias

The views expressed are those of the author and are not necessarily those of Scientific American.


Recall this pivotal scene from the 1997 movie Men in Black. James Edwards (Will Smith, or Agent J) arrives at the headquarters of MiB – a secret agency that protects Earth from extraterrestrial threats – to compete with “the best of the best” for a position. Edwards, a confident and cocky NYPD officer, completes various tests, including a simulation in which he shoots an ostensibly innocent schoolgirl. When asked why, Edwards explains that compared to the freakish aliens, the girl posed the biggest threat. He passes the test: potentially dangerous aliens are always disguised as real humans. Agent K (Tommy Lee Jones) offers him a position at MiB and the remaining candidates’ memories are erased. They return to normal life without ever realizing that the aliens were a ruse – a device for Agent K to detect how sagacious the candidates really were.

This wily test of intelligence and mindfulness is defined by two characteristics. The first is that most people fail it; the second is that it contains a subtle trick intentionally implemented to catch careless thinking (the schoolgirl, for example). Narratives in literature and film that incorporate this test go something like this: scores have tried and failed because they overlooked the trick – even though they confidently believed they did not – until one day a hero catches it and passes the test (Edwards). Game of Thrones readers may recall the moment Syrio became the first sword of Braavos. Unlike others before him, when the Sealord asked Syrio about an ordinary cat, Syrio answered truthfully instead of sucking up. (The ending of Indiana Jones and the Last Crusade also comes to mind, but this does not fit the narrative for a critical reason. Those who failed did not live under the mistaken belief that they succeeded – they were beheaded.)

Here’s my worry. The same thing occurs when lay audiences read books about thinking errors. They understand the errors, but don’t notice the trick – that simply learning about them is not enough. Too often, readers finish popular books on decision making with the false conviction that they will decide better. They are the equivalent of Edwards’ competition – the so-called best of the best who miss the ruse.

The overlooked reason is that there are two components to each bias. The first is the phenomenon itself. Confirmation bias, for example, is your tendency to seek out information that confirms your beliefs while ignoring everything else. The second is the belief that everyone else is susceptible to thinking errors, but not you. This itself is a bias – the bias blind spot – a “meta bias” inherent in all biases that blinds you to your errors.

Popular literature on judgment and decision-making does not emphasize the second component enough, potentially inhibiting readers from fully understanding the source of their irrationalities. Although we intuitively believe that we correct for biases after being exposed to them, it is impossible to truly accomplish this until we consider how the bias blind spot – the bias within the bias – distorts thinking. The ironic implication is that these books are perhaps part of the problem. The common sendoff, “now that you know about these biases, perhaps you’ll decide better,” instills a false confidence – it’s the trick we’re all failing to notice.

I first noticed this after learning about confirmation bias, overconfidence bias and above-average effects and concluding, dubiously, that I was a genius living in a world of idiots. Of course, the joke was on me, and it took many years to figure that out.

This effect appears all over the place when you stop to look around. Construction projects often finish late and over budget because planners, after researching previous late and over budget projects, confidently estimate that their undertaking will never suffer the same fate. Wars are the same. Iraq, some believed, would never turn out like Vietnam. Yet that attitude may have caused our prolonged stay. When we learn about thinking errors, we falsely conclude that they do not apply. That’s when we mess up.

The problem is rooted in introspection. Biases are largely unconscious, so when we reflect on thinking we inevitably miss the processes that give rise to our errors. Worse, because we’re self-affirming spin doctors, when we introspect we only identify reasons for our infallibility. In this light, we see why mere exposure to biases compounds the problem: it actually makes us more confident about how we decide.

I admit that I’ve painted a rather pessimistic picture of human rationality. We are plagued by systematic biases, and reflecting on those biases only exacerbates the problem. Like slicing off one of the Hydra’s heads, every time we think about thinking errors we commit even more errors. It’s an epistemic Chinese finger trap. Is there any way out?

System 2 thinking – the ability to reflect and think deliberately – is capable of critical self-analysis. So I am ultimately optimistic about our Promethean gift. It’s impossible not to notice the power of reason, especially in the 21st century. It is one of our “better angels,” as Steven Pinker notes, and it has nudged us towards cooperation and the mutual benefits of pursuing self-interest and away from violence.

A note of caution, however. It’s crucial that we use our ability to reflect and think deliberately not to introspect, but to become more mindful. This is an important distinction. Introspection involves asking questions, yet we’ve seen that we’ll tend to answer those questions in a self-serving way. As Nietzsche hinted in Twilight of the Idols, “We want to have a reason for feeling as we do… it never suffices us simply to establish the mere fact that we feel as we do.”

Mindfulness, in contrast, involves observing without questioning. If the takeaway from research on cognitive biases is not simply that thinking errors exist but that we believe ourselves immune to them, then the virtue of mindfulness is pausing to observe this convoluted process in a non-evaluative way. We spend a lot of energy protecting our egos instead of considering our faults. Mindfulness may help reverse this.

Critically, this does not mean mindfulness is in the business of “correcting” or “eliminating” errors. That’s not the point. Rather, mindfulness means pausing to observe that thinking errors exist – recognizing the bias within the bias. The implication is we should read popular books on decision making not to bash rationality (that will backfire) but to simply reconsider it with an open mind. Instead of pulling harder to escape the finger trap, try relaxing. Maybe then you’ll notice the trick.

Image: CarolSpears

Samuel McNerney About the Author: Sam McNerney graduated from the greatest school on Earth, Hamilton College, where he earned a bachelor’s degree in philosophy. After reading too much Descartes and Nietzsche, he realized that his true passion is reading and writing about cognitive science. Now he works as a science journalist covering philosophy, psychology, and neuroscience. He has a column and a blog called “Moments of Genius”. He spends his free time listening to Lady Gaga, dreaming about writing bestsellers, and tweeting @SamMcNerney.


Comments (7)
  1. rshoff 8:14 pm 05/15/2013

    Brilliant, yes. Thanks for introducing these ideas to me as a reader. Now I wonder if the bias within the bias is only as good as dividing a number by two. In other words we can never attain an unbiased state just as we can never reach zero regardless of the number of times we divide each quotient again by two.

    I love it when philosophy intersects with science. Magic happens!

  2. brian1625 8:22 pm 05/15/2013

    “It’s not that we don’t think before we act, it’s that we act like we’re thinking.” – Me. Being an INTP myself, this is thinking 101. You become a bitter skeptic after a while. Then over time you realize that the bias has purpose. Without confirmation bias, how else do you store information? You just throw data in your brain and hope for the best? But you can’t simply not have confirmation bias. It has value. Again, how are you storing data? Is it just a list that’s not even indexed? Good luck sorting through that. The plan is to build a biased system, destroy it, build it again.

    A bias I have is that I actually just assume people are smarter than me. And it doesn’t bother me. They can come at problems from different angles than I can, so why not assume this is so? They can add to my system or, better yet, destroy it. Though I often make sure they make arguments and not statements. (People call me a contrarian, but I just want to know what they – really – think, not simply knowledge they have cut and pasted into their data banks.) Could I be wrong in that I’m actually the smartest person in the room? Yes, but that’s not likely, and it puts me in a different, more relaxed mindset when I’m not jockeying for the alpha spot in a social group.

  3. TTLG 8:57 pm 05/15/2013

    I think a big part of the problem is that people treat these mental biases like a curable illness: you fix it and then forget about it. In reality, I think it is more like a missing limb or untreatable illness: it is something you have to learn to deal with on an everyday basis. Something like the way AA teaches people to deal with alcoholism: admit that you have a problem and do your best to minimize the effect on your life, or in this case your perception of the world.

  4. jeffpc_sciam 6:34 pm 05/16/2013

    Excellent blog, Sam. Although not always successful, I find that constantly reminding myself of the errors that my typical human mind makes helps me see through those biases more often than otherwise. I re-read a cognitive bias book regularly and have a list of biases up on my wall to see and review every day. Certainly it’s still easier to see the biases in others, but I do catch myself sometimes as well. I’m sure I still miss most of my errors. On the downside, being rational and logical more often than average (if not often) does make me a radical on many issues and, simultaneously, less confident of my own decisions than others are about theirs.

  5. smcnerne 10:32 am 05/19/2013

    @brian1625 – Yes, a good heuristic: assume everyone else is smarter than you until you know otherwise.

    @TTLG – “I think a big part of the problem is that people treat these mental biases like a curable illness: you fix it and then forget about it.” I agree; nice comparison.

    @jeffpc_sciam – Thanks, you sound like a modern-day Montaigne!

  6. dvognar 12:25 pm 05/21/2013

    The bias within the bias of this article is that the author believes rational self-interest can help us overcome our biases, yet rational self-interest also reinforces our biases. Granted, it’s much easier to point out other people’s biases. Nevertheless:

    “…and it has nudged us towards cooperation and the mutual benefits of pursuing self-interest…”

    Followed by:

    “Introspection involves asking questions, yet we’ve seen that we’ll tend to answer those questions in a self-serving way.”

    How to overcome this?

  7. smcnerne 3:58 pm 05/30/2013

    “The bias within the bias of this article is that the author believes rational self-interest can help us overcome our biases.”

    No, I believe rational self-interest has contributed to the global decline of violence. Read carefully next time.


