
Put Yourself in the Cheater's Shoes

Could seeing the world through the eyes of the scientist who behaves unethically be a valuable tool for those trying to behave ethically?

Last semester, I asked my "Ethics in Science" students to review an online ethics training module of the sort that many institutions use to address responsible conduct of research with their students and employees. Many of my students elected to review the Office of Research Integrity's interactive movie The Lab, which takes you through a "choose your own adventure" scenario in an academic lab as one of four characters (a graduate student, a postdoc, the principal investigator, or the institution's research integrity officer). The scenario centers on research misconduct by another member of the lab, and your goal is to do what you can to address the problems -- and to avoid being drawn into committing misconduct yourself.

By and large, my students reported that "The Lab" was a worthwhile activity. As part of the assignment, I asked them to suggest changes, and a number of them made what I thought was a striking suggestion: players should have the option to play the character who commits the misconduct.




I can imagine some eminently sensible reasons why the team that produced "The Lab" didn't include the cheater as a playable character. For instance, if the scenario started before the decision to cheat and the player of that character picked the options that amount to not cheating, you would end up with a story that lacks almost all of the drama. Similarly, if you pick up with that character in the immediate aftermath of the cheating and go with the "come clean/don't dig a deeper hole" options, the story ends pretty quickly.

Setting the need for dramatic tension aside, I suspect that another reason that "The Lab" doesn't include the cheater as a playable character is that people who are undergoing research ethics training are supposed to think of themselves as people who would not cheat. Rather, they're supposed to think of themselves as ethical folks who would resist temptation and stand up to cheating when others do it. These training exercises bring out some of the particular challenges that might be associated with making good ethical decisions (many of them connected to seeing a bit further down the causal chain to anticipate the likely consequences of your choices), but they tend to position the cheater as just part of the environment to which the ethical researcher must respond.

I think this is a mistake. I think there may be something valuable in being able to view those who commit misconduct as more than mere antagonists or monsters.

Part of what makes "The Lab" a useful exercise is that it presents situations with a number of choices available to us, some easier and some harder, some likely to lead to interactions that are more honest and fair and others more likely to lead to problems. In real life, though, we don't usually have the option of rewinding time and choosing a different option if our first choice goes badly. Nor do we have assurance that we'll end up being the good guys.

It's important to understand the temptations that the cheaters felt -- the circumstances that made their unethical behaviors seem expedient, or rational, or necessary. Casting cheaters as monsters glosses over our own human vulnerability to these bad choices, which will surely make the temptations harder to handle when we encounter them. Moreover, understanding the cheaters as humans (just like the scientists who haven't cheated) rather than as "other" in some fundamental way lets us examine those temptations and then collectively create working environments with fewer of them. Though it's part of a different discussion, Ashe Dryden describes the dangers of "othering" here quite well:

There is no critical discussion about what leads to these incidents -- what parts of our culture allow these things to go unchecked for so long, how pervasive they are, and how so much of this is rewarded directly or indirectly. …

It's important to notice what is happening here: by declaring that the people doing these things are others, it removes the need to examine our own actions. The logic assumed is that only bad people do these things and we aren't bad people, so we couldn't do something like this. Othering effectively absolves ourselves of any blame.

The dramatic arc of "The Lab" is definitely not centered on the cheater's redemption, nor on cultivating empathy for him, and in the context of the particular training it offers, that's fine. Sometimes one's first priority is protecting or repairing the integrity of the scientific record, or ensuring a well-functioning scientific community by isolating a member who has proven himself untrustworthy.

But that member of the community whom we're isolating, or rehabilitating, is connected to the community -- connected to us -- in complicated ways. Misconduct doesn't just happen, but neither is it the case that, when someone commits it, it's just a matter of the choices and actions of an individual in a vacuum.

The community is participating in creating the environment in which people commit misconduct. Trying to understand the ways in which behaviors, expectations, formal and informal reward systems, and the like can encourage big ethical transgressions or desensitize people to "little" lapses may be a crucial step to creating an environment where fewer people commit misconduct, whether because the cost of doing so is too high or the payoff for doing so (if you get away with it) is too low.

But seeing members of the community as connected in this way requires not seeing the research environment as static and unchangeable -- and not seeing those in the community who commit misconduct as fundamentally different creatures from those who do not.

All of this makes me think that part of the voluntary exclusion deals between people who have committed misconduct and the ORI should be an allocution, in which the wrongdoer spells out the precise circumstances of the misconduct, including the pressures in the foreground when the wrongdoer chose the unethical course. This would not be an excuse but an explanation, a post-mortem of the misconduct available to the community for inspection and instruction. Ideally, others might recognize familiar situations in the allocution and then consider how close their own behavior in such situations has come to crossing ethical lines, as well as what factors seemed to help them avoid crossing those lines. As well, researchers could think together about what gives rise to the situations and the temptations within them and explore whether common practices can be tweaked to remove some of the temptations while supporting knowledge-building and knowledge builders.

Casting cheaters as monsters doesn't do much to help people make good choices in the face of difficult circumstances. Ignoring the ways we contribute to creating those circumstances doesn't help, either -- and may even increase the risk that we'll become like the "monsters" we decry.