
Doing Good Science

Building knowledge, training new scientists, sharing a world.

Getting scientists to take ethics seriously: strategies that are probably doomed to failure.


As part of my day-job as a philosophy professor, I regularly teach a semester-long “Ethics in Science” course at my university. Among other things, the course is intended to help science majors figure out why being ethical might matter to them if they continue on their path to becoming working scientists and devote their careers to the knowledge-building biz.

And, there’s a reasonable chance that my “Ethics in Science” course wouldn’t exist but for strings attached to training grants from federal funding agencies requiring that students funded by these training grants receive ethics training.

The funding agencies demand the ethics training component largely in response to high profile cases of federally funded scientists behaving badly on the public’s dime. The bad behavior suggests some number of working scientists who don’t take ethics seriously. The funders identify this as a problem and want the scientists who receive grants from them to take ethics seriously. But the big question is how to get scientists to take ethics seriously.

Here are some approaches to that problem that strike me as unpromising:

  • Delivering ethical instruction that amounts to “don’t be evil” or “don’t commit this obviously wrong act”. Most scientists are not mustache-twirling villains, and few are so ignorant that they wouldn’t know that the obviously wrong acts are obviously wrong. If ethical training is delivered with the subtext of “you’re evil” or “you’re dumb,” most of the scientists to whom you’re delivering it will tune it out, since you’re clearly talking to someone else.
  • Reducing ethics to a laundry list of “thou shalt not …” Ethics is not simply a matter of avoiding bad acts — and the bad acts are not bad simply because federal regulations or your compliance officer say they are bad. There is a significant component of ethics concerned with positive action — doing good things. Presenting ethics as results instead of a process — as a set of things the ethics algorithm says you shouldn’t do, rather than a set of strategies for evaluating the goodness of various courses of action you might pursue — is not very engaging. Besides, you can’t even count on this approach for good results, since refraining from particular actions that are expressly forbidden is no guarantee you won’t find some not-expressly-forbidden action that’s equally bad.
  • Presenting ethics as something you have to talk about because the funders require that you talk about it. If you treat the ethics-talk as just a string attached to your grant money — something with which you wouldn’t waste your time otherwise — you’re identifying attention to ethics as a thing that gets in the way of research rather than as something that supports research. Once you’ve fulfilled the requirement to have the ethics-talk, would you ever revisit ethics, or would you just get down to the business of research?
  • Segregating attention to ethics in a workshop, class, or training session. Is ethics something the entirety of which you can “do” in a few hours, or even a whole semester? That’s the impression scientific trainees can get from an ethics training requirement that floats unconnected from any discussion with the people training them about how to be a successful scientist. Once you’re done with your training, then, you’re done — why think about ethics again?
  • Pointing trainees to a professional code, the existence of which proves that your scientific discipline takes ethics seriously. The existence of a professional code suggests that someone in your discipline sat down and tried to spell out ethical standards that would support your scientific activities, but the mere existence of a code doesn’t mean the members of your scientific community even know what’s in that code, nor that they behave in ways that reflect the commitments put forward by it. Walking the walk is different from talking the talk — and knowing that there is a code, somewhere on your professional society’s website, that you could find if you Googled it probably doesn’t even rise to the level of talking the talk.
  • Delivering ethical training with the accompanying message that scientists who aren’t willing to cut ethical corners are at a competitive career disadvantage, and that this is just how things are. Essentially, this creates a situation where you tell trainees, “Here’s how you should behave … unless you’re really up against it, at which point you should be smart and drop the ethics to survive in this field.” And, what motivated trainee doesn’t recognize that she’s always up against it? It is important, I think, to recognize that unethical behavior is often motivated at least in part by a perception of extreme career pressures rather than by the inherent evil of the scientist engaging in that behavior. But noting the competitive advantage available for cheaters only to throw up your hands and say, “Eh, what are you going to do?” strikes me as a shrugging off of responsibility. At a minimum, members of a scientific community ought to reflect upon and discuss whether the structures of career rewards and career punishments incentivize bad behavior. If they do, members of the community probably have a responsibility to try to change those structures of career rewards and career punishments.

Laying out approaches to ethics training that won’t help scientists take ethics seriously might help a trainer avoid some pitfalls, but it’s not the same as spelling out approaches that are more likely to work. That’s a topic I’ll take up in a post to come.

Janet D. Stemwedel About the Author: Janet D. Stemwedel is an Associate Professor of Philosophy at San José State University. Her explorations of ethics, scientific knowledge-building, and how they are intertwined are informed by her misspent scientific youth as a physical chemist. Follow on Twitter @docfreeride.

The views expressed are those of the author and are not necessarily those of Scientific American.


Comments (3)

  1. jdjentsch 3:06 pm 08/31/2012

    I think another unpromising aspect of some ethics training approaches is the fact that students are given the impression that they are merely recipients of ethical knowledge and must act according to the ideas given to them… Instead, I feel that it may be useful to make them aware that they are themselves part of a scientific (and broader civil) community that must discuss, create, revise and implement ethical standards and behaviors. Perhaps by seeing themselves as participants, rather than “victims” of a top-down rule-creating process, they can see the value of being ethical agents. Any thoughts on the general nature of the point or how to overcome the issue I propose?

  2. LinzHill 3:34 pm 08/31/2012

    So what IS your favourite strategy? Do you have students read papers or scenarios (real-life if possible) and play “spot the ethical issue” and discuss? “Spot the bad science” worked the best for me when I was a student studying pseudoreplication and poorly controlled variables. It led to lively discussions, especially when some good papers were thrown in to make us question our standards.

  3. ethics 1:52 pm 09/7/2012

    We came across some of the same suggestions when researching Elsevier’s new Ethics in Research & Publication program. We hope the program helps to educate researchers in an approachable way.


