Observations: Opinion, arguments & analyses from the editors of Scientific American

Refuse to learn from experience? Thank your genes

The views expressed are those of the author and are not necessarily those of Scientific American.


Some people are incurable contrarians or imperturbable logicians. But most of us, whether we like it or not, allow other people’s opinions and advice to color our own experiences and judgments. Have you ever found that restaurant to really be as good as people say it is?

New findings suggest that a person’s willingness to coolly consider the facts gleaned from their own experience—apart from others’ previous verbal suggestions—might be based in large part on genetics.

It has been known and frequently demonstrated that "people will distort what they experience to be perceived as more consistent with what they thought already," Michael Frank of the Brown Institute for Brain Science at Brown University, a collaborator in the new research, said in a prepared statement. Even researchers can fall prey to confirmation bias, thinking they have discovered in noisy data what they had actually expected to find.

So, why do we often struggle to accept our own impressions if they contradict what we’ve been told to expect? The disconnect occurs in part because these two types of information, the abstract and the experiential, are processed in different parts of the brain. Advice ("go to that Italian restaurant") is filtered, along with other higher-level cognition, in the prefrontal cortex. Experience ("that Italian restaurant is usually mediocre"), on the other hand, is lodged in a more primitive region of the brain, the striatum.

Although perhaps we should be more inclined to stick with what our gut (or taste buds) has learned from personal experience, most people lean on what their prefrontal cortex—i.e., outside instruction—has to say for longer than they rationally should.

"Maintaining instructions in the prefrontal cortex changes the way that the striatum works," Bradley Doll, a researcher at Brown, said in a prepared statement. "It biases what people learn about the contingencies they are actually experiencing," noted Doll, who coauthored the new paper detailing the results, published online April 19 in The Journal of Neuroscience.

People’s willingness to let advice color their experience hinges at least in part on the neurotransmitter dopamine, which is associated with pleasure, reward and learning. The researchers pinpointed one gene in particular, COMT, that seems to play a role in a person’s inclination to learn from his or her own experiences. Individuals in the study with different alleles of this gene had differing propensities to be biased by outside advice in interpreting their own experiences.

Frank, Doll and colleague Kent Hutchinson tested more than 70 adults on a computer-based learning task. Subjects had to learn which symbols were most likely to be classified as the "correct" answer. The relationship between symbol and answer was probabilistic rather than fixed, creating a gray area in which subjects had to weigh their past experience with each symbol. In some tests, people were given advice about which symbols were correct most often—but this advice sometimes proved to be wrong.
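The interplay the study probes—prior advice versus experienced feedback—can be illustrated with a toy simulation. The sketch below is not the researchers’ actual task or model; the symbol names, probabilities and learning rate are all illustrative assumptions. A simple learner updates its estimate of each symbol from trial-by-trial feedback, and "advice" is modeled as an initial bump in favor of the worse symbol:

```python
import random

def run_task(advice_bias=0.0, trials=200, p_correct=0.8, lr=0.1, seed=0):
    """Simulate a two-symbol probabilistic learning task.

    Symbol A is rewarded with probability p_correct, symbol B with
    1 - p_correct. advice_bias is an initial value bump for symbol B,
    mimicking (possibly misleading) advice that B is the better choice.
    Returns the fraction of trials on which the learner chose A.
    """
    rng = random.Random(seed)
    values = {"A": 0.5, "B": 0.5 + advice_bias}  # prior beliefs
    picks_a = 0
    for _ in range(trials):
        # Greedy choice between the two symbols
        choice = "A" if values["A"] >= values["B"] else "B"
        p = p_correct if choice == "A" else 1 - p_correct
        reward = 1.0 if rng.random() < p else 0.0
        # Delta-rule update from the experienced outcome
        values[choice] += lr * (reward - values[choice])
        picks_a += choice == "A"
    return picks_a / trials

# Without misleading advice the learner quickly settles on the better
# symbol; a strong prior toward B delays that discovery.
unbiased = run_task(advice_bias=0.0)
biased = run_task(advice_bias=0.4)
```

Because the learner only updates the value of the symbol it actually chooses, a strong enough prior can keep it sampling the bad option for many trials—a crude analogue of advice having to be "unlearned" through repeated disappointing experience.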

People with an exceptional ability to spot inaccurate instructions and start making decisions using their own experience tended to have the Val/Val version of the gene, whereas those who needed "greater confidence" that their experience was telling them to jettison earlier advice were more likely to have the Met allele.

Overall, the researchers concluded, "these findings suggest that the striatal learning process is modulated by prior expectations, and that the resulting associative weights cannot be easily ‘undone’ after the prior is rejected." So that might mean you have to order many bowls of substandard pasta before you finally admit to yourself that a much-lauded Italian restaurant isn’t actually all that great.

Of course, it’s certainly easier—and less painful—to learn to avoid a hot plate by being told to do so, and we’ve likely evolved to take this into account, prizing the prefrontal cortex’s retained instructions. "This phenomenon of confirmation bias might actually just be a byproduct of a system that tries to be more efficient with the learning process," Frank said.

But the human mind is rarely satisfied with simple instruction, as instruction—and advice—often turn out to be wrong. And what’s a few burnt fingertips in the grand scheme of independent thought?

Image courtesy of iStockphoto/Yuri_Arcurs


Comments
  1. jtdwyer 8:22 pm 04/19/2011

    The preferential bias to interpret experiences in accordance with previous suggestions is also beneficial in socializing and cooperating with others.

    Most people have little interest in explanations of how they were wrong, no matter how compelling the argument is. Discussions of how you agree go much better…

  2. 11:48 pm 04/19/2011

    Another example of this is that people sometimes make statements of fact that they believe because someone had told them it was true, yet at the same time they knew from their own experience that it was completely false. I am not talking about differences in interpretation (as in "is the food good or bad") but differences in fact (as in they say "we never had lunch in that restaurant" yet when prompted they also remember the time "we went there for lunch five days in a row".) That always seemed paradoxical and contradictory to me, but perhaps this phenomenon described in the article explains it.

    When I have challenged people who said such falsehoods by prompting them to remember a contradictory memory, they first tried to confabulate a rationale for how the false statement could be true, even though they had also acknowledged knowing it was false. But if one version of the fact is stored as a fact in the prefrontal cortex, and the opposite version is stored as an unrelated memory in the striatum, then apparently the two opposite facts can coexist in the same mind…until some person or event makes the person aware of the contradiction.

    Beliefs and memories are often stored or recovered with a motivation; people often have a motivation for remembering events a certain way. So perhaps differences in storage location also reflect differences in the motivation for believing.

    Bernard Schuster

  3. halneufmille 1:52 am 04/20/2011

    They should test tea party supporters and religious people.

  4. jbairddo 6:57 am 04/20/2011

    halneufmille, Do you mean Dems and Reps have been doing such a bang up job they remain immune from this (gotta agree on the religulous thing though)? This does explain how urban myths stay alive no matter how stupid the premise.

  5. xponen 7:18 am 04/20/2011

    This is strange.
    The article promote the use of gut feeling rather than logic.
    It is a plain fact that logic and ideas are embedded in language, and people learn language to grasp the logical processes it describes; thus ignoring this language means ignoring logic (and not just the biased instruction, but the benefit the instruction confers on its listener).

  7. Cramer 5:23 pm 04/20/2011

    xponen said, "The article promote the use of gut feeling rather than logic."

    I don’t believe the article promoted using gut over logic, but maybe was just poorly worded. The article started with a statement that claimed some people are "imperturbable logicians," but most simply follow "other people’s opinions and advice."

    The author then says that people tend to stick with advice from others (prefrontal cortex) over our own experiences (striatum). I believe what confused you is the author’s poor choice of the word gut (or tastebuds) to mean knowledge that comes from experience. I don’t think she meant it to be mutually exclusive with logic. Maybe she should have used the word intuition or eliminated the word altogether.

    The author said, "Although perhaps we should be more inclined to stick with what our gut (or tastebuds) has learned from personal experience, most people…"

    The author probably should have said, "Although perhaps we should be more inclined to stick with what we (or our tastebuds) have learned from personal experience, most people…"

  8. jgrosay 9:03 am 04/21/2011

    Huh! The issue is that failure to learn from experience is a schizophrenic’s feature. Do the involved genes coincide?

  9. hoamingin 9:07 am 04/21/2011


    This article describes what the brain evolved to do. The brain did not evolve to think. For millions of years it evolved among ancestors who did not have speech. It evolved to observe how others handled situations and absorb those behaviours as scenarios that produced expert, automatic responses, without having to think. That is how our ancestors passed on from generation to generation the practices that made them expert hunter gatherers.

    The reason there was so little innovation for the first 6 million years was that the mechanism evolved to do the opposite of innovation. It evolved to preserve behaviours that already existed and had proved they worked, because the people being observed used them and survived. People who used behaviours that did not work did not survive, and their behaviours were not there to be observed.

    Language did not change things much. Modern humans with speech evolved about 200,000 years ago. For the next 190,000 years their main innovations were better stone tools.

    Sure, some began to use body paint and bury the dead with rituals, but you would expect some signs of consciousness when, for the first time in millions of years, humans had names by which to individuate objects and themselves. But they continued to live as hunter gatherers in the Stone Age until a sudden dip back into a mini Ice Age about 11,000 years ago forced humans to change their behaviours to become herders and farmers because the traditional practices that had been passed down for millions of years could no longer keep them alive.

    So the brain we use is the same brain our hunter-gatherer ancestors used, but we use it differently. When we use the brain to develop new ideas, it follows that we are using it in a way it did not evolve to be used. But the basic mechanism has not changed. It shows up in the distinctive cultures of different societies, which are preserved by children who observe and absorb attitudes and behaviours in the formative years, before they have developed logic and rational thinking.

    So there is nothing strange about what the article describes. It simply describes the basic mechanism by which the brain evolved to do things: to observe behaviours used by others, memorise them as scenarios and, when similar circumstances happen again, respond automatically, without having to think consciously. "Gut" responses are those automatic, not-thinking responses from stored scenarios. More detail on my website.

  10. okwhen 10:52 am 04/24/2011

    As an independent atheist may I second that suggestion.

