
What Does It Take to Change a Mind? A Phase Transition




This week's Virtually Speaking Science episode featured yours truly in conversation with Laurie Paul, a philosopher at the University of North Carolina, Chapel Hill, 2014 Guggenheim Fellow, and author of a new book, Transformative Experience. We chatted about so-called transformative experiences, empathy, identity and the fluid nature of the self, and whether having a child can be considered a rational decision. You can listen to the entire conversation at Blog Talk Radio, but I wanted to highlight one aspect of our discussion, since it pertains to a topic of great interest among those interested in science education and outreach -- notably, how best to deal with staunch denialism.

Recently, an excellent May 2014 article at The New Yorker by Maria Konnikova has been recirculating among my social networks. It describes a 2014 study by political scientist Brendan Nyhan examining what it might take to get people with strongly held beliefs on certain hot-button issues -- in this study's case, attitudes about vaccines -- to change their minds. Nearly 2,000 parents were randomly assigned to one of four pro-vaccination messages, each adopting a different persuasive strategy (facts, science, emotions, or stories), or to a control group, to see which approach was most effective in changing minds. The punchline: none of the above. Nothing changed people's minds, and in fact, the strategies often backfired. "It's depressing," Nyhan admitted to Konnikova, adding after a pause, "We were definitely depressed."

It's just one more piece of mounting evidence that those most "dug in" when it comes to rejecting well-established science are pretty much immune to the traditional strategy of presenting them with "Just the facts, ma'am." Surely, rationalists have thought for decades, those in denial are merely ignorant, and if we just educate them and show them the error of their ways, they will change their minds and embrace the truth. Maybe they'll even thank us profusely for our trouble. (Dare to dream, Jen-Luc Piquant murmurs indulgently.)




Now we know this can backfire: presenting hardline denialists with the facts just makes them dig their heels in deeper. As Konnikova concludes, "Facts and evidence may not be the answer everyone thinks they are: they simply aren't that effective, given how selectively they are processed and interpreted."

Yet people do change their minds; it depends on how strongly they connect a particular opinion or belief with their personal identity -- and it's not always about how someone identifies politically. "When there's no immediate threat to our understanding of the world, we change our beliefs," Konnikova writes. "It's when that change contradicts something we've long held as important that problems occur." As she observes about the results of a climate change study from 2012, "If information doesn't square with someone's prior beliefs, he discards the beliefs if they're weak and discards the information if the beliefs are strong."

So what's a science writer/communicator to do? Give up in despair at ever making a difference in swaying public opinion? Au contraire! I do think it's possible to have an impact. I'm just not sure that impact can be quantitatively measured over the short time-frames associated with most psychological studies. The key here is how beliefs are tied to personal identity, and as Paul and I discussed, identity is fluid: it shifts and evolves continually over time in response to personal experiences (some transformative per Paul's definition) and other variables.

That means the question, "Who are you?" will evoke a different answer at different points in a person's life. I readily acknowledge that while my core temperament has remained the same, I'm a very different person today than I was at 15, or 25, or 35, and my life took many unexpected twists and turns, all of which shaped my personal identity -- for the better, I think. Nobody would have predicted, had they met me at 15, that I would become a science writer, an atheist, or a wife, thriving in sunny Southern California. My 15-year-old self wouldn't have believed such a prediction. Yet I couldn't imagine a better life.

I posed the following premise to Paul for discussion: change a person's self-identity, and maybe you've got a better shot at swaying their opinion. Maybe there was a time when someone couldn't imagine being married, being a parent, or ever accepting the reality of climate change, but that doesn't mean they will always feel that way. A transformative experience, or series of such experiences, can make all the difference.

I like to think of the process as something akin to a phase transition. (Regular readers know of my great fondness for this ubiquitous concept.) As I wrote in 2011:

Any substance has a specific moment when the pressure or temperature is just right to cause it to shift from one state to another. Water is the most common example: lower the temperature sufficiently and it will turn into ice; raise the temperature to a boil and it will evaporate into steam. That’s a phase transition. The precise moment when this happens is called the critical point, when the substance is perfectly poised halfway between one phase and the other. That critical point can vary even for the same substance. At sea level, water boils at 212 degrees F (100 degrees C) and freezes at 32 degrees F (0 degrees C). But try to boil water in Denver — the Mile-High City — and you'll need a slightly lower temperature: about 203 degrees F (95 degrees C).

When you see this laid out neatly on a graph, you'll note that it doesn't produce a continuous curved slope as the substance moves between phases. Rather, it resembles a staircase of sorts: the temperature drops (or rises) a bit, then there's a long stretch where nothing appears to be happening at all. Things seem to flatline, and the substance looks like it will be stuck in that phase permanently. (Will this water never freeze?!?) But then there's a sudden drop again as it hits the critical point and it moves into a new phase.

You wouldn't be able to see the shift coming if all you were measuring was the temperature, because that's a higher-level emergent property arising out of interactions of all the atoms making up the material. But change was still burbling at the atomic level, below what you could see on the surface.
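For the code-curious, here's a minimal sketch of that staircase in Python -- my own toy illustration, not anything from the episode, using textbook values for water's specific and latent heats and an assumed constant heating power. The temperature climbs, flatlines through each phase change while the incoming energy is absorbed as latent heat, then climbs again.

# Toy model of the "staircase": heat 1 kg of ice at constant power and
# track its temperature. During melting and boiling, all the added energy
# goes into the latent heat of the phase change, so the temperature
# flatlines even though change is still happening at the molecular level.
# (Constants are textbook values; the 500 W heater is an assumption.)

C_ICE, C_WATER = 2100.0, 4186.0    # specific heats, J/(kg*K)
L_FUSION, L_VAPOR = 334e3, 2256e3  # latent heats, J/kg
MASS, POWER, DT = 1.0, 500.0, 1.0  # kg, watts, seconds per time step

temp, melted, boiled = -20.0, 0.0, 0.0
for second in range(12000):
    energy = POWER * DT
    if temp < 0.0:                      # solid: warm the ice
        temp += energy / (MASS * C_ICE)
    elif melted < MASS * L_FUSION:      # plateau at 0 C: melting
        melted += energy
    elif temp < 100.0:                  # liquid: warm the water
        temp += energy / (MASS * C_WATER)
    elif boiled < MASS * L_VAPOR:       # plateau at 100 C: boiling
        boiled += energy
    else:
        break                           # everything has boiled away
    if second % 600 == 0:
        print(f"t = {second:5d} s   T = {temp:6.1f} C")

Plot temperature against time and you get exactly the staircase described above: two long flat treads at 0 and 100 degrees C.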

That sudden shift is the physics equivalent of a Saul on the road to Damascus moment, and frankly, such clear-cut transformative moments are rare. But there’s more than one kind of phase transition. What I described above is a first-order transition, which occurs abruptly, such as boiling water or melting ice. A second-order phase transition occurs more smoothly and continuously, such as when ferromagnetism switches to paramagnetism in metals like iron, nickel and cobalt, or when a substance becomes superconducting. That strikes me as a better analogy for how we change our minds.
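To make that contrast concrete, here's another small Python sketch -- again my own illustration, with the temperature measured in units of the Curie point Tc. In the textbook mean-field model of a ferromagnet, the magnetization m satisfies the self-consistency equation m = tanh(Tc * m / T). Solving it by fixed-point iteration shows m shrinking smoothly to zero as the temperature rises toward Tc: no sudden jump, which is the hallmark of a second-order transition.

import math

def magnetization(t: float) -> float:
    """Solve m = tanh(m / t) by fixed-point iteration, with t = T / Tc."""
    m = 1.0  # start fully magnetized
    for _ in range(2000):
        m = math.tanh(m / t)
    return m

# Below Tc the magnetization is nonzero; it falls continuously toward zero
# as t approaches 1 and vanishes above it -- smooth, not staircase-like.
# (The iteration converges sluggishly right at t = 1.0, a toy echo of the
# real phenomenon of critical slowing down.)
for t in (0.2, 0.5, 0.8, 0.9, 0.95, 0.99, 1.0, 1.1, 1.5):
    print(f"T/Tc = {t:4.2f}   m = {magnetization(t):.4f}")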

Last week the Washington Post ran a wrenching op-ed by Gal Adam Spinrad, detailing how she moved -- over a period of 12 years -- from being fearful of vaccinating her young daughter to insisting the entire family stay up to date on all immunizations, including flu shots. There was no one moment when she changed her mind completely; rather, many different incidents over the years reshaped her thinking by degrees. Initially, as a new young mother in San Francisco, she identified strongly with the local home birth collective's view that vaccines could be harmful to newborns, delaying her daughter's first immunizations until the child was one year old.

Then Spinrad developed shingles while abroad, a result of having had chickenpox as a child, and her views began to shift; she started to see the value in such protections. A second daughter, born with a serious congenital defect, lived just 58 days, and a broken-hearted Spinrad realized she couldn't continue to take her children's health (and her ability to protect them) for granted. Moving to the Midwest and finding a staunchly pro-vaccine doctor sealed the deal. By 2013 she finally understood the concept of herd immunity and why it wasn't just about protecting her children: it was also about protecting other vulnerable members of society who for various reasons couldn't be vaccinated.

Those are just the episodes Spinrad recalled as she shaped her narrative with the advantage of hindsight. There were likely countless other tiny things, adding up over the years with seemingly imperceptible effects, until that critical threshold was reached. Spinrad's views about vaccines gradually underwent a phase transition -- a second-order one, rather than the abrupt first-order shift of ice melting into water.

Of course, as Paul pointed out, you can't precisely control how people respond and evolve over time. It's a complex system, and your input is just one variable among many working to shape said system. All you can do is sow the seeds and hope some find fertile ground. And since most of us can't see into a person's innermost thoughts, there's no way of knowing where that fertile ground might lie. Those seeds might not flourish for months, or years. You might not see any outward change at all for a good long while. That doesn't mean your efforts are useless. I find some small comfort in remembering that.

UPDATE, 2/15/15: Brendan Nyhan kindly emailed me with a link to an interesting 2010 paper in Political Psychology that supports my phase transition analogy: "The Affective Tipping Point: Do Motivated Reasoners Ever 'Get It'?" (A tipping point is basically the critical point in a phase transition.) Per the abstract:

In order to update candidate evaluations voters must acquire information and determine whether that new information supports or opposes their candidate expectations. Normatively, new negative information about a preferred candidate should result in a downward adjustment of an existing evaluation. However, recent studies show exactly the opposite; voters become more supportive of a preferred candidate in the face of negatively valenced information. Motivated reasoning is advanced as the explanation, arguing that people are psychologically motivated to maintain and support existing evaluations. Yet it seems unlikely that voters do this ad infinitum. To do so would suggest continued motivated reasoning even in the face of extensive disconfirming information. In this study we consider whether motivated reasoning processes can be overcome simply by continuing to encounter information incongruent with expectations. If so, voters must reach a tipping point after which they begin more accurately updating their evaluations. We show experimental evidence that such an affective tipping point does in fact exist. We also show that as this tipping point is reached, anxiety increases, suggesting that the mechanism that generates the tipping point and leads to more accurate updating may be related to the theory of affective intelligence. The existence of a tipping point suggests that voters are not immune to disconfirming information after all, even when initially acting as motivated reasoners.

I'll just bet "anxiety increases" the closer one gets to the proverbial tipping point: the cognitive dissonance must be severe. So perhaps that person loudly and passionately ignoring any and all evidence to the contrary is so forceful precisely because they are close to the tipping point and fear the phase change. Per Nyhan, "It’s hard to get people to that stage -- it may take overwhelming evidence to the contrary if the belief is deep-seated or psychologically meaningful enough -- but it can and does happen. I think strong social cues and elite consensus play an important role, as with (say) negative stereotypes about race and more recently gay people. Those beliefs haven’t gone away, of course, but a lot of people have revised their beliefs."
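Just for fun, the tipping point is easy to caricature in code. Here's a deliberately crude toy model -- my own invention, emphatically not the model from the Political Psychology paper, with every number made up for illustration -- of a motivated reasoner who shrugs off each new piece of disconfirming evidence until an assumed threshold is crossed, after which updating suddenly becomes rapid. The evaluation flatlines, then collapses: the same staircase shape as the phase transitions above.

# Toy "affective tipping point" (all numbers are invented for illustration):
# support for a preferred candidate barely moves under motivated reasoning,
# then drops rapidly once accumulated contrary evidence crosses a threshold.

evaluation = 0.90        # support for the preferred candidate, 0 to 1
contrary = 0.0           # accumulated disconfirming information
THRESHOLD = 10.0         # hypothetical tipping point

for week in range(1, 17):
    contrary += 1.0      # one unit of bad news per week
    if contrary < THRESHOLD:
        evaluation -= 0.005   # motivated reasoning: discount the evidence
    else:
        evaluation -= 0.15    # past the tipping point: accurate updating
    evaluation = max(evaluation, 0.0)
    print(f"week {week:2d}: evaluation = {evaluation:.2f}")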

See? People can change their minds and progress can be made on a broad social scale. Don't despair just yet.