
Felt up or blown up? The psychology of the TSA, body scans and risk perception



The choice between felt up and blown up seems like a no-brainer. So does the choice between the low-dose radiation exposure of a backscatter x-ray exam at the airport and getting on the plane to spend a couple of hours at an altitude where the thinner atmosphere exposes you to far more cosmic radiation. So why all the fuss about the new TSA security screening procedures? Why can’t people just get risk right?

Wrong question…because it presumes there is a "right" to get. Rationalists and scientists and worshippers of Enlightenment reason (and lots of self-professed Wise Men in the comment-o-sphere) may call the way we’re behaving irrational, but risk is not just a matter of the facts. It’s a matter of how those facts feel, how we interpret them. Facts are just ones and zeroes that mean nothing until we run them through the software of our interpretations. The TSA muddle is just the latest clarion example of how arrogantly irrational it is to presume that, when it comes to risk, rationality can rule the day. Let’s deconstruct the inner psychology of this example. There are lessons here.

We know, from the research of Paul Slovic and Baruch Fischhoff and others, that the less personalized a risk is (the more abstract it is), the less scary it is. It’s been more than nine years since we saw the faces and learned the names and family details of the nearly 3,000 victims of 9/11, victims we could relate to. People, like us. The risk of terrorism to air travel, while it remains very real, is now more of a possibility, an idea, not as concrete or human as it felt in 2001. And the shoe bomber and liquid bombers and underwear bomber and toner cartridge bomber all failed, in part because of our security efforts and in part because of their ineptness. Whatever the reason, the attempts didn’t scare us as much as the failures (no fresh images of actual human victims) reassured us.




We also know from the heuristics and biases research of Daniel Kahneman and Amos Tversky and others that the more readily "available" a risk is to our brains (the quicker and more powerfully something comes to mind), the more influence it has on how we feel about it. There is plenty in the news about the possibility of bad guys bombing planes, but the image of it actually happening is fortunately faded and dim (the 9/11 attackers, of course, crashed the planes rather than bombing them), so the availability heuristic triggers less concern.

These two psychological factors help explain the finding of a Washington Post-ABC poll that only 30 percent of Americans are worried about terrorism against air travel, the lowest level since 9/11.

Now let’s consider the psychology on the other side of this risk-risk tradeoff: the risk of being exposed to radiation, or of being groped. These are minimal risks compared with being blown out of the sky at 30,000 feet, but they are fraught with their own emotional/affective baggage. For many people radiation instantly evokes subconscious negative associations with bombs and nuclear power plant accidents and cancer. Cancer prompts more fear than many other risks, even greater risks like heart disease, in part because the greater the pain and suffering (dread) a risk can cause, the scarier it is likely to be. Radiation also feels scary because we can’t detect it, so we can’t protect ourselves, and "uncontrollability" makes any risk scarier.

In this case there is another psychological characteristic that makes the radiation feel more frightening, the facts about wavelength and sieverts notwithstanding. The scanner dose is imposed. Compare how some fliers feel about the x-ray dose to how they feel about the much greater dose of cosmic radiation they’ll get during the flight. The in-flight exposure is acceptable, because it’s voluntary. The psychological study of risk perception has found that an imposed risk almost always prompts more worry. (The x-ray and cosmic doses are both infinitesimal and as close to zero actual risk as you could ever hope to get. But they sure feel different!)

Wait, you say. Radiation-phobes can opt out and go for what’s behind the blue curtain: a physical inspection so intimate that after it’s over the inspector doesn’t have to ask whether that’s a pistol in your pocket or you’re just glad to see him. Submit or don’t fly. Yeah, that feels like choice!

So let’s put this in the form of a ratio. From the perspective of the security experts and the government officials who have to think about risk assessment dispassionately at the societal level, it would be framed this way:

Risk A - fear of planes being blown up vs. Risk B - fear of tiny doses of radiation or a bit of personal discomfort. A is clearly greater.

But to you and me as individuals, thinking more about ourselves than society at large, the risk perception ratio looks like…feels like…this:

Risk A - complacency vs. Risk B - undetectable/uncontrollable, high dread, imposed. Framed this way, for some, B is the bigger bogeyman.

What’s really remarkable, and reassuring, is the Washington Post-ABC poll’s finding that a solid majority of the flying public sees things the way the dispassionate experts do. Grumbling, perhaps, but two-thirds of us accept the need for the scans. Half accept the need for physical pat-downs of those who decline the scan. Even without recent victims, we know that the risk of bad guys with bombs on planes remains clear and present.

In fact, as further proof of the whole argument that risk perception is not a purely fact-based, coldly rational process, the survey respondents who were more afraid of the terrorist threat were more supportive of the tighter security than their complacent fellow travelers, who were readier to tell the TSA people "Don’t Touch My Junk."

There are several important lessons here. First, government risk management policy makers need to understand and account for how people feel about a risk, and about potential policy responses, as those policies are designed. After all, how well those policies work depends in part on how people feel about them, as the TSA mess demonstrates. There is plenty of robust scientific evidence, from various fields, that can guide this thinking, and it should be part of a more holistic approach to risk analysis.

Second, we need to move past the false dichotomy of the supposed battle between System One and System Two in risk perception…reason VERSUS emotion. It’s not one or the other. Our single system uses both, though the wiring and chemistry of the brain give the edge to feelings and emotions. We don’t function as perfect reasoning machines. Nor can we trust ourselves always to make the right calls when our reason is mixed with emotions and instincts that have not yet evolved to handle the more complex threats we face in the modern world. What we can do is accept that reason and gut reaction, cognition and intuition, are both part of how we perceive and respond to danger, and factor all of that into figuring out which policies make the most sense. That approach might have helped the TSA avoid some of the mess we’re now in.

Sure glad I’m not flying Wednesday!

About The Author: David Ropeik is an Instructor at the Harvard Extension School, a consultant in risk perception and risk management, and author of How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts (McGraw-Hill, 2010).

 

The views expressed are those of the author and are not necessarily those of Scientific American.
