Guest Blog

Commentary invited by editors of Scientific American
The Irrationality of Irrationality: The Paradox of Popular Psychology

The views expressed are those of the author and are not necessarily those of Scientific American.


In 1996, Lyle Brenner, Derek Koehler and Amos Tversky conducted a study involving students from San Jose State University and Stanford University. The researchers were interested in how people jump to conclusions based on limited information. Previous work by Tversky, Daniel Kahneman and other psychologists found that people are “radically insensitive to both the quantity and quality of information that gives rise to impressions and intuitions,” so the researchers knew, of course, that we humans don’t do a particularly good job of weighing the pros and cons. But to what degree? Just how bad are we at assessing all the facts?

To find out, Brenner and his team exposed the students to legal scenarios. In one, a man named Mr. Thompson visits a drug store on a routine union visit. The store manager informs him that, according to the union's contract with the drug store, union representatives cannot speak with employees on the floor. After a brief deliberation, the manager calls the police and Mr. Thompson is handcuffed for trespassing. The charges are later dropped, but Mr. Thompson sues the store for false arrest.

All participants got this background information. Then, they heard from one of the two sides’ lawyers; the lawyer for the union organizer framed the arrest as an attempt to intimidate, while the lawyer for the store argued that the conversation that took place in the store was disruptive. Another group of participants – essentially a mock jury – heard both sides.

The key part of the experiment was that the participants were fully aware of the setup; they knew that they were only hearing one side or the entire story. But this didn’t stop the subjects who heard one-sided evidence from being more confident and biased with their judgments than those who saw both sides. That is, even when people had all the underlying facts, they jumped to conclusions after hearing only one side of the story.

The good news is that Brenner, Koehler and Tversky found that simply prompting participants to consider the other side's story reduced their bias (instructions to consider the missing information were a manipulation in a later study), but it certainly did not eliminate it. Their study shows us that people are not only willing to jump to conclusions after hearing only one side's story, but that even when they have additional information at their disposal that would suggest a different conclusion, they are still surprisingly likely to do so. The scientists conclude on a somewhat pessimistic note: “People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.”

In Brenner’s study, participants were dealing with a limited universe of information – the facts of the case and of the two sides’ arguments. But in reality – especially in the Internet era – people have access to a limitless amount of information that they could consider. As a result, we rely on rules of thumb, or heuristics, to take in information and make decisions. These mental shortcuts are necessary because they lessen the cognitive load and help us organize the world – we would be overwhelmed if we were truly rational.

This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or as any one of the seven basic story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication here is that it’s how good the story is, not necessarily its accuracy, that matters.

But narratives are also irrational because they sacrifice the whole story for one side of the story that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study illustrate: people who take in narratives are often blinded to the whole story – rarely do we ask, “What more would I need to know before I can form a more informed and complete opinion?”

The last several years have seen many popular psychology books that touch on this line of research. There’s Ori and Rom Brafman’s Sway, Dan Ariely’s Predictably Irrational and, naturally, Daniel Kahneman’s Thinking, Fast and Slow. If you could sum up the popular literature on cognitive biases and our so-called irrationalities, it would go something like this: we require only a small amount of information, oftentimes a single factoid, to confidently form conclusions and generate narratives that adopt seemingly objective, but almost entirely subjective and inaccurate, worldviews.

The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there’s a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn about how we irrationally jump to conclusions they form new opinions about how the brain works from the little information they recently acquired. They jump to conclusions about how the brain jumps to conclusions and fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.

Tyler Cowen made a similar point in a TED lecture a few months ago. He explained it this way:

There’s the Nudge book, the Sway book, the Blink book… [they are] all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories. And why don’t these books tell us that? It’s because the books themselves are all about stories. The more of these books you read, you’re learning about some of your biases, but you’re making some of your other biases essentially worse. So the books themselves are part of your cognitive bias.

The crux of the problem, as Cowen points out, is that it’s nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful.

To be sure, there’s an important difference between the bias that comes from hearing one side of an argument and (most) narratives. A corrective like “consider the other side” is unlikely to work for narratives because it’s not always clear what the opposite would even be. So it’s useful to avoid jumping to conclusions not only by questioning narratives (after all, just about everything is plausibly a narrative, so avoiding them can be pretty overwhelming), but by exposing yourself to multiple narratives and trying to integrate them as well as you can.

At the beginning of the recently released book The Righteous Mind, social psychologist Jonathan Haidt explains how some books (his included) make a case for one particular thing (in Haidt’s case, morality) being the key to understanding everything. Haidt’s point is that you shouldn’t read his book and jump to overarching conclusions about human nature. Instead, he encourages readers to integrate his perspective (that morality is central) with other points of view. I think this is a good strategy for overcoming a narrow-minded view of human cognition.

It’s natural for us to reduce the complexity of our rationality into convenient bite-sized ideas. As the trader turned epistemologist Nassim Taleb says: “We humans, facing limits of knowledge, and things we do not observe, the unseen and the unknown, resolve the tension by squeezing life and the world into crisp commoditized ideas.” But readers of popular psychology books on rationality must recognize that there’s a lot they don’t know, and they must beware of how seductive stories are. The popular literature on cognitive biases is enlightening, but let’s not be irrational about irrationality; exposure to X is not knowledge and control of X. Reading about cognitive biases, after all, does not free anybody from their nasty epistemological pitfalls.

Moving forward, my suggestion is to remember the lesson from Brenner, Koehler and Tversky: they reduced conclusion jumping by getting people to consider the other information at their disposal. So let’s remember that the next book on rationality isn’t a tell-all – it’s merely another piece of the puzzle. This same approach could also help correct the problem of being too swayed by narratives – there are always multiple sides to a story.

Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don’t rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.


The author personally thanks Dave Nussbaum for his helpful editorial comments and criticisms. Dave is a social psychologist who teaches at the University of Chicago. Follow him on Twitter and check out his homepage.

Image: by Wyglif on Wikimedia Commons.

Samuel McNerney About the Author: Sam McNerney graduated from the greatest school on Earth, Hamilton College, where he earned a bachelor's in Philosophy. After reading too much Descartes and Nietzsche, he realized that his true passion is reading and writing about cognitive science. Now he is working as a science journalist writing about philosophy, psychology, and neuroscience. He has a column at and a blog at called "Moments of Genius". He spends his free time listening to Lady Gaga, dreaming about writing bestsellers, and tweeting @SamMcNerney.


Comments (10)
  1. Jim Lacey 12:26 pm 04/27/2012

    As any poker player knows, acting on incomplete information is a form of high intelligence.

  2. HowardB 7:33 pm 04/27/2012

    The truth is that we human beings are not built as machines or computers. We are not built to naturally think rationally and cleanly and objectively. Almost the opposite. In bumbling through our survival story we have mostly benefited from our organic humanity in an environment that rewarded our other abilities.
    However, in a new world where knowledge is what matters and where we aspire to more than just surviving, we need to be taught to think logically and rationally. We need to be taught how to resist the superstitions and intuitions that lead to unfounded beliefs. We need to be taught how to think, period.
    America is being destroyed by religious zealotry and bigotry based on a complete failure to learn how to think rationally. We, as a country, are reaping the rewards of rewarding ‘faith’ as something to aspire to; of promoting the teaching of ignorant creationism.

  3. jtdwyer 8:11 am 04/28/2012

    I must once again point out a study that, using a select group of students as their subject population, jumps to conclusions based on incomplete information. Results obtained by testing students cannot be reliably generalized to represent all of humanity.

    The results do not necessarily apply to ‘people’ in general but specifically to students. Imagine that the statements made about the results obtained by testing students were applied to other specific groups: in the following quotation by Daniel Kahneman from the article, replace the general “we” with various specific groups:

    “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.”

    Now replace “we” with “students” to produce a more accurate statement of test results. Then try replacing the general “we” with “doctors”, “engineers”, “scientists”, “lefties”, “firemen”, “bicyclists”, etc. The only group to which the general statement correctly applies is the one tested.

    It can be argued that the reason this specific result applies to students is that they spend many years being taught in classrooms by teachers using narratives that only describe the ‘correct’ interpretation of events.

    Students are a special population with unique characteristics that produce specific results when tested under specific conditions. Those results may or may not apply to or represent other specific populations or especially humans in general. I find this failing to be ubiquitous in the ‘scientific’ study of human characteristics.

  4. Dredd 3:14 pm 04/28/2012

    “People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.”

    Take for example genetic scientists who said 98% of the human genome was “junk DNA”, then based critical science on the sparse 2% that was “treasure DNA.”

    Now, imagine the truckloads of textbooks being taken to the shredders.

  5. Rozzer 5:12 pm 04/28/2012

    I agree with Mr. McNerney’s main points but would go farther in a darker direction. Not only do I believe that the current sad state of rationality as depicted here will continue in the future, I’m convinced that humans may well be hard-wired to let themselves be hijacked by appealing narrative. Ever since Descartes, many of us have been convinced of the possibility of mentally placing ourselves outside not only our social and cultural but our physical limitations, to be able from time to time to create and exist in a purely rational mental universe, at least for some of our endeavors. And there’s no doubt at all that the human race has gotten quite a lot of good mileage out of the Cartesian project. We may now, however, be running into a roadblock in that respect. It’s all too possible, as far as I’m concerned, that we’re so dominated by appealing narrative that efforts to become more rational will all ultimately fail.

    And that conclusion is based on long and relevant experience. I earned my living for twenty years creating bulletproof, dominating narratives for one of the most sophisticated audiences in the world: trial and appellate judges. None of the narratives were in any way “false,” none were in any manner morally or ethically objectionable, but they were all intended to make the opponent’s narrative simply incredible. And in the large majority of cases they succeeded in doing so. Believe me, the experimental subjects in the studies cited by Mr. McNerney are in every way similar to the gray-haired, experienced, sophisticated male and female judges presiding over legal matters every day of the week.

    We do of course need much more serious research into this topic, to try to determine whether or not people can really be trained to “be rational,” to not have their minds captured by particularly appealing narratives that may or may not reflect the ultimate truth. In my experience, Reader jtdwyer is unfortunately entirely wrong: the siren song of narrative is very appealing to all human beings regardless of category (would there be best-selling novels if that weren’t so?) The quirks we’ve inherited from our evolutionary past are only too often unhelpful in our “civilized,” technological present. I’d very much like to believe it could happen, though I’m not sure it will.

  6. jtdwyer 5:56 pm 04/28/2012

    Rozzer – Sorry to disagree, but lawyers and judges are most certainly a specific population with unique characteristics. Even juries do not accurately represent the general population, as I suspect you know.

    I also assure you that your statement that “narrative is very appealing to all human beings regardless of category (would there be best-selling novels if that weren’t so?)” is false, since I for one do not enjoy narratives or read novels. I’m sure most judges would find your case quite compelling, though…

  7. tetrahedral 11:58 am 04/29/2012

    I find it useful to make a distinction between logic and reason; reason being the more inclusive. If one considers only the logic within a story, it is usually compelling. But if one looks for unstated assumptions, questions the probability of some event, or the meaning of key words, the story may become less compelling.
    Words dominate our educational and religious teachings; and I assume the consciousness of most of us in the “modern” world. It is difficult to use words to teach someone how to think analytically, and how to look beyond the meaning of abstract words; but I think it is critically important to our future well-being.

  8. engineer.sci 1:17 pm 04/30/2012

    Per the classic mid-20th-century work by Polya, “Patterns of Plausible Inference,” there is the possibility of working with a very small portion of weakly quantitative information — what would seem not statistically sound — and coming to consistently and surprisingly accurate results. This could be a factor in the instinctive idea that a hunch based upon limited information can prove accurate. However, there are two requirements:

    (1) The information encountered is not intentionally skewed (even if based upon opinion or taste, it becomes available in a statistically random manner — not via pre-screening).

    (2) The information must be qualitatively rich — in other words, there is actually more information than meets the eye, and there might be hidden correlations that tighten the overall accuracy.

    So perhaps the most important realization from Brenner et al.’s results is an additional dimension of the power of psychological manipulation of people just in what information is presented to them. Combine this with video/sound “info.” bytes, the general hypnotic power of the video screen (now replacing classic billboards as well, and present in checkout lines and ever new “in your face” venues), and one has a truly frightening understanding of the population-wide mind-control capabilities of our very mercenary mass media. And the highest bidders here are special interests that are particularly addicted to an “eat & drink to gluttony, and be merry to fantastic excess, for only tomorrow after we die, will the world go to hell” outlook. Only the decay is happening faster than they think, and if not them, surely their children will be caught up in the catastrophe to follow.

    This makes for a dangerous minefield, masquerading as “free speech,” that we must reckon with in re-establishing true free speech — a universal round-table approach — to save our global civilization, for everyone’s sake.

  9. statstrade 6:53 am 05/1/2012

    Great read. Thanks!

    It’s amazing how this applies in particular to the stock markets. Indeed, “we would be overwhelmed if we were truly rational,” as the article said, and therefore a lot of investors base their trading decisions on partial information that they deem sufficient, e.g. technical data only, with specific strategies.

  10. bucketofsquid 5:36 pm 05/3/2012

    @HowwieB – America exists due to religious zealotry and bigotry. If you are going to expound your atheist zealotry in a science forum, please indicate sources. Life in America is easier than ever and crime rates are way down. What exactly is this destruction that you mention? The only destruction I’ve seen is pure greed, which is condemned by the vast majority of religions.

    Your irrational focus on the bad side of religion simply validates the article. I’m tempted to make a joke about the Pink Footed Snogglewalloper being the source of all of America’s woes but, as has been pointed out many times, humor often fails without eye contact.

