Last week's deadly collapse of an eight-story garment factory building in Dhaka, Bangladesh, has prompted discussions about whether poor countries can afford safe working conditions for workers who make goods that consumers in countries like the U.S. prefer to buy for bargain prices.
Maybe the risk of being crushed to death (or burned to death, or what have you) is just a trade-off poor people are (or should be) willing to accept to draw a salary. At least, that seems to be the take-away message from the crowd arguing that it would cost too much to have safety regulation (and enforcement) with teeth.
It is hard not to consider how this kind of attitude might get extended to other kinds of workplaces -- like, say, academic research labs -- given that last week UCLA chemistry professor Patrick Harran was also scheduled to return to court for a preliminary hearing on the felony charges of labor code violations brought against him in response to the 2008 fire in his laboratory that killed his employee, Sheri Sangji.
Jyllian Kemsley has a detailed look at how Harran's defense team has responded to the charges of specific violations of the California Labor Code, charges involving failure to provide adequate training, failure to have adequate procedures in place to correct unsafe conditions or work practices, and failure to require that workers wear appropriate clothing for the work being done. Since I'm not a lawyer, it's hard for me to assess the likelihood that the defense responses to these charges would be persuasive to a judge, but ethically, they're pretty weak tea.
Sadly, though, it's weak tea of the exact sort that my scientific training has led me to expect from people directing scientific research labs in academic settings.
When safety training is confined to a single safety video that graduate students are shown when they enter a program, that tells graduate students that their safety is not a big deal in the research activities that are part of their training.
When there's not enough space under the hood for all the workers in a lab to conduct all the activities that, for safety's sake, ought to be conducted under the hood -- and when the boss expects all those activities to happen without delay -- that tells them that a sacrifice in safety to produce quick results is acceptable.
When a student-volunteer needs to receive required ionizing radiation safety training to get a film badge that will give her access to the facility where she can irradiate her cells for an experiment, and the PI, upon hearing that the next training session is three weeks away, says to the student-volunteer, "Don't bother; use my film badge," that tells people in the lab that the PI is unwilling to lose three weeks of unpaid labor on one aspect of a research project just to make the personnel involved a little bit safer.
When people running a lab take an attitude of "Eh, young people are going to dress how they're going to dress" -- rather than imposing a clear rule that people whose dress is unsafe for the activities they are to undertake don't get to undertake them -- that tells the personnel in the lab that whatever cost is involved in holding this line (losing a day's worth of work, being viewed by one's underlings as strict rather than cool) has been judged too high relative to the benefit of making personnel in the lab safer.
When university presidents or other administrators proclaim that knowledge-builders "must continue to recalibrate [their] risk tolerance" by examining their "own internal policies and ask[ing] the question—do they meet—or do they exceed—our legal or regulatory requirements," that tells knowledge-builders at those universities that people with significantly more power than them judge efforts to make things safer for knowledge-builders (and for others, like the human subjects of their research) as an unnecessary burden. When institutions need to become leaner, or more agile, shouldn't researchers (and human subjects) do their part by accepting more risk as the price of doing business?
To be sure, safety isn't free. But there are also costs to being less safe in academic research settings.
For example, personnel develop lax attitudes toward risks and trainees take these attitudes with them when they go out in the world as grown-up scientists. Surrounding communities can get hurt by improper disposal of hazardous materials, or by inadequate safety measures taken by researchers working with infectious agents who then go home and cough on their families and friends. Sometimes, personnel are badly injured, or killed.
And, if academic scientists are dragging their feet on making things safer for the researchers on their team because it takes time and effort to investigate risks and make sensible plans for managing them, to develop occupational health plans, and to institute standard operating procedures that everyone on the research team knows and follows, I hope they're noticing that facing felony charges stemming from safety problems in their labs can also take lots of time and effort.
UPDATE: The Los Angeles Times reports that Patrick Harran will stand trial after an LA County Superior Court judge denied a defense motion to dismiss the case.