Doing Good Science

Building knowledge, training new scientists, sharing a world.

The line between persuasion and manipulation.

The views expressed are those of the author and are not necessarily those of Scientific American.


As this year’s ScienceOnline Together conference approaches, I’ve been thinking about the ethical dimensions of using empirical findings from psychological research to inform effective science communication (or really any communication). Melanie Tannenbaum will be co-facilitating a session about using such research findings to guide communication strategies, and this year’s session is nicely connected to a session Melanie led with Cara Santa Maria at last year’s conference called “Persuading the Unpersuadable: Communicating Science to Deniers, Cynics, and Trolls.”

In that session last year, the strategy of using empirical results from psychology to help achieve success in a communicative goal was fancifully described as deploying “Jedi mind tricks”. Achieving success in communication was cast in terms of getting your audience to accept your claims (or at least getting them not to reject your claims out of hand because they don’t trust you, or don’t trust the way you’re engaging with them, or whatever). But if you have the cognitive launch codes, as it were, you can short-circuit distrust, cultivate trust, and help your audience end up where you want them by the time you’re done communicating.

Jason Goldman pointed out to me that these “tricks” aren’t really that tricky — it’s not like you flash the Queen of Diamonds and suddenly the person you’re talking to votes for your ballot initiative or buys your product. As Jason put it to me via email, “From a practical perspective, we know that presenting reasons is usually ineffective, and so we wrap our reasons in narrative – because we know, from psychology research, that storytelling is an effective device for communication and behavior change.”

Still, using a “trick” to get your audience to end up where you want them to end up — even if that “trick” is simply empirical knowledge that you have and your audience doesn’t — sounds less like persuasion than manipulation. People aren’t generally happy about the prospect of being manipulated. Intuitively, manipulating someone else gets us into ethically dicey territory.

As a philosopher, I’m in a discipline whose ideal is that you persuade by presenting reasons for your interlocutor to examine, arguments whose logical structure can be assessed, premises whose truth (or at least likelihood) can be evaluated. I daresay scientists have something like the same ideal in mind when they present their findings or try to evaluate the scientific claims of others. In both cases, there’s the idea that we should be making a concerted effort not to let tempting cognitive shortcuts get in the way of reasoning well. We want to know about the tempting shortcuts (some of which are often catalogued as “informal fallacies”) so we can avoid falling into them. Generally, it’s considered sloppy argumentation (or worse) to try to tempt our audience with those shortcuts.

How much space is there between the tempting cognitive shortcuts we try to avoid in our own reasoning and the “Jedi mind tricks” offered to us to help us communicate, or persuade, or manipulate more effectively? If we’re taking advantage of cognitive shortcuts (or switches, or whatever the more accurate metaphor would be) to increase the chances that people will accept our factual claims, our recommendations, our credibility, etc., can we tell when we’ve crossed the line between persuasion and manipulation? Can we tell when it’s the cognitive switch that’s doing the work rather than the sharing of reasons?

It strikes me as even more ethically problematic if we’re using these Jedi mind tricks while concealing from our audience the fact that we’re using them. There’s a clear element of deception in doing that.

Now, possibly the Jedi mind tricks work equally well if we disclose to our audience that we’re using them and how they work. In that case, we might be able to use them to persuade without being deceptive — and it would be clear to our audience that we were availing ourselves of these tricks, and that our goal was to get them to end up in a particular place. It would be kind of weird, though, perhaps akin to going to see a magician knowing full well that she would be performing illusions and that your being fooled by those illusions is a likely outcome. (Wouldn’t this make us more distrustful in our communicative interactions, though? If you know about the switches and it’s still the case that they can be used against you, isn’t that the kind of thing that might make you want to block lots of communication before it can even happen?)

As a side note, I acknowledge that there might be some compelling extreme cases in which the goal of getting the audience to end up in a particular place — e.g., revealing to you the location of the ticking bomb — is so urgent that we’re prepared to swallow our qualms about manipulating the audience to get the job done. I don’t think that the normal stakes of our communications are like this, though. But there may be some cases where how high the stakes really are is one of the places we disagree. Jason suggests vaccine acceptance or refusal might be important enough that the Jedi mind tricks shouldn’t set off any ethical alarms. I’ll note that vaccine advocates using a just-the-empirical-facts approach to communication are often accused or suspected of having some undisclosed financial conflict of interest that is motivating them to try to get everyone vaccinated — that is, they’re not using the Jedi mind trick social psychologists think could help them persuade their target audience and yet that audience thinks they’re up to something sneaky. That’s a pretty weird situation.

Does our cognitive make-up as humans make it possible to get closer to exchanging and evaluating reasons rather than just pushing each other’s cognitive buttons? If so, can we achieve better communication without the Jedi mind tricks?

Maybe it would require some work to change the features of our communicative environment (or of the environment in which we learn how to reason about the world and how to communicate and otherwise interact with others) to help our minds more reliably work this way. Is there any empirical data on that? (If not, is this a research question psychologists are asking?)

Some of these questions tread dangerously close to the question of whether we humans can actually have free will — and that’s a big bucket of metaphysical worms that I’m not sure I want to dig into right now. I just want to know how to engage my fellow human beings as ethically as possible when we communicate.

These are some of the questions swirling around my head. Maybe next week at ScienceOnline some of them will be answered — although there’s a good chance some more questions will be added to the pile!

About the Author: Janet D. Stemwedel is an Associate Professor of Philosophy at San José State University. Her explorations of ethics, scientific knowledge-building, and how they are intertwined are informed by her misspent scientific youth as a physical chemist. Follow on Twitter @docfreeride.



Comments (11)

  1. tuned 9:15 am 02/22/2014

    All persuasion is a form of manipulation for a desired purpose.

  2. tuned 9:18 am 02/22/2014

    Also, if you say “Jedi” one more time I am persuaded you should be laffed off the block.

  3. Marc Levesque 9:39 am 02/22/2014

    Another excellent post; as usual I find it interesting, relevant and informative.

  4. rkipling 2:57 pm 02/22/2014

    These are not the articles you are looking for.

  5. evelyn haskins 5:47 pm 02/22/2014

    A very thought provoking article.

    I think that JB Watson’s work is relevant here. As well as the books on human body language — specifically probably Allan Pease’s book because it was directed at Sales people.

    If you haven’t read it, I think you’ll find Dean Buonomano’s “Brain Bugs: How the brain’s flaws shape our lives” interesting. And it might answer some of your questions :-)

  6. oldfarmermac 8:59 pm 02/22/2014

    I share the author’s concerns to some extent. I came to the conclusion a long time ago that philosophy doesn’t answer the questions that really matter, except as a matter of intellectual interest.

    We are creatures created by evolution, and while we are capable of using our “new and improved” brains to reason, we usually don’t, except within certain narrow limits and under certain conditions.

    Engineers are very good at reasoning within the framework of their profession, for instance, but I know a number of engineers and none of them are particularly interested in using their heads for anything other than a hat rack outside working hours.

    All of them have firm political opinions that have nothing to do with any close examination of obvious facts that are not disputed by anybody with enough brains to come in out of the rain.

    A couple of them do not believe we will ever run out of oil for instance even though they admit that any and every oil field must eventually run dry and that this planet we live on is of a finite size rather than infinite.

    Some of them believe in one sort of policy and others different policies when it comes to the role of government and even though a policy is clearly good or bad based on objective facts their position is almost always determined by their sense of community and culture rather than any relevant facts.

    If I know an engineer’s political affiliations I can predict his positions quite accurately on many policies that are either arguable or easily proven either pro or con, objectively. In some cases the liberals are wrong, in other cases the conservatives are wrong.

    In either case if his political in group, the people he identifies with, are in favor of a policy, he will be too, even though he is easily capable of evaluating it objectively for himself and coming to a contrary conclusion.

    Being a member in good standing in the perceived community trumps reason almost every time.

    Now, insofar as convincing people to accept an argument thru the skillful use of sophisticated psychological techniques is concerned, all I can say insofar as advising scientists is this:

  7. oldfarmermac 9:08 pm 02/22/2014

    I ran out of space and will finish in a second comment.

    The people who are pushing antiscientific arguments are mostly doing so because they have very good reasons to do so, such as making a lot of money. As a matter of fact, they are usually making so much that they can easily afford the very best advertising agencies, and these ad agencies employ the very best psychologists to help the copywriters and artists create narratives that play on the intended victim’s, er, customer’s I mean, emotions like a violin.

    If scientists don’t employ similar tactics they haven’t a chance in this day and time.

    Call it what you please. Personally I call it fighting fire with fire in order to survive in the current communications paradigm.

  8. PascalLapointe 9:20 pm 02/22/2014

    In some way, one could say that every act of communication is some sort of manipulation. We are bending reality to present it in a certain way. And on top of that, what has been developed in the last two centuries as communication for larger audiences —the writing techniques used by journalists, and today by everybody involved in communications at large— inevitably involves some sort of manipulation. But this is very far from Jedi mind tricks. If it were that easy, there would be no climate skeptics and no creationists anymore.

    I do believe that psychology can teach us how to communicate with climate skeptics. But certainly not to the point of “convincing” them to think the way we do. Probably to the point of helping them take a few steps in a direction that answers to their own values.

  9. Jerzy v. 3.0. 7:00 am 02/24/2014

    It may be unavoidable that many scientists will start to manipulate audiences, given how heavy the competition is for a shrinking pool of grants and tenures.

    However, scientists must now watch very carefully for manipulation from their desperately competing colleagues. Publications are no longer to be trusted blindly, pieces of evidence should not be automatically extended into revolutionary new principles, etc.

  10. Jerzy v. 3.0. 7:02 am 02/24/2014

    Overall, it damages the quality of science. Ironically, scientific institutions themselves teach students to “make your data scream,” “make your graphs visceral,” “make an impact,” etc.

  11. DavidRopeik 10:38 am 02/24/2014

    Ms Stemwedel,

    The flaw in your assumption, and the answer to your dilemma, are captured in your statement that persuasion is a matter of “presenting reasons for your interlocutor to examine, arguments whose logical structure can be assessed, premises whose truth (or at least likelihood) can be evaluated. I daresay scientists have something like the same ideal in mind when they present their findings or try to evaluate the scientific claims of others. In both cases, there’s the idea that we should be making a concerted effort not to let tempting cognitive shortcuts get in the way of reasoning well.”

    This presumes the ability to reason, empirically, objectively, and reach a conclusion based solely on the facts. As a philosopher, you must be aware of the massive evidence from cognitive science that pretty comprehensively refutes that naive assumption. We simply cannot reason the way your statement assumes. Human perception is inherently subjective, a tapestry woven of the facts and how we feel about them, that can’t be separated into threads of one and the other.

    With that as a given, there is ONLY persuasion – i.e. see the facts MY way – to reach a goal. The smarter we are about how to do this persuading, the more effective we are. And the better we are at it, the more we can achieve our own ends. All of which accrues to our success and our own survival. (You are probably aware of Hugo Mercier’s theory on the purpose for arguing and persuasion in the first place, not in order to get things empirically right, but to make things go our own way.)

    So while it is intellectually interesting to raise the ethical question of when persuasion becomes manipulation, the discussion ignores truths about human cognition that essentially make the question moot. It presumes an ability to objectively reason that manipulation immorally jerks around, a sort of reasoning that simply does not exist.

    By the way, I make my living as a risk communication consultant and Instructor at Harvard, and I do struggle with just the question you raise, all the time! Striving to behave ethically is a really effective (manipulative) way to persuade more effectively.

