February 6, 2012
In my classes, I often ask my students to wrestle with what I call damned-if-you-do-or-don’t dilemmas, which offer no easy solutions. Every choice would pose certain risks and violate one valued principle or another. We often must choose what we deem to be the “least bad” option, and hope things work out. Research involving the bird-flu virus H5N1 poses an especially knotty dilemma, in which scientists’ commitment to openness—and to reducing humanity’s vulnerability to potential health threats—collides with broader security concerns.
The H5N1 virus normally infects only people who come into direct contact with infected birds; so far there have been no reported cases of airborne transmission, either from birds to humans or from human to human. Of the 583 people known to have been infected with the virus, 344 have died as a result, a mortality rate of 59 percent. To be sure, many other infected people may have recovered without coming to the attention of medical authorities. But for comparison, the infamous flu pandemic of 1918, which killed at least 50 million people worldwide, had a mortality rate of about 2 percent.
Last year, teams from the Erasmus Medical Center in the Netherlands and from the University of Wisconsin at Madison each announced that they had engineered highly contagious versions of H5N1 that can be transmitted from one ferret to another through the air. The immune systems of ferrets are similar to those of humans, which is why ferrets serve as a model for flu transmission in people. The Dutch team submitted its paper to Science and the Wisconsin team to Nature, but publication of both papers has been held up since December on the advice of a federal committee of experts, the National Science Advisory Board for Biosecurity.
“Our concern is that publishing these experiments in detail would provide information to some person, organization, or government that would help them to develop similar… viruses for harmful purposes,” the board announced in Nature and Science on January 31. Another concern is that the new virus could be accidentally released into the environment.
In a recent post, Christine Gorman of Scientific American revealed a bitter debate among scientists over the H5N1 research. Some insist that the H5N1 research performed by the two teams was dangerous and perhaps should never have been done. Others contend that such work is beneficial, because it can help epidemiologists anticipate—and develop vaccines and other defenses for—outbreaks.
Ron Fouchier, a member of the Dutch team, seems to hold both opinions simultaneously. When Fouchier publicly discussed the work at a scientific meeting last September, he defended its value but also called the ferret experiments “really, really stupid,” according to a report by Scientific American’s Katherine Harmon.
The World Health Organization has scheduled a February 16 meeting to ponder the case of the engineered H5N1 and other research aimed at making pathogens deadlier. Here are some options for resolving the dilemma:
A: Disease researchers could continue experimenting and publishing their results without any restrictions. This option strikes me as far too risky, given the possibility that terrorist organizations such as Aum Shinrikyo or Al Qaeda—or simply a Unabomber-type nut-job with lab skills—might exploit such knowledge for nefarious purposes.
B: Researchers could do only work deemed by authorities to offer much greater benefits than risks, and publications would not disclose details that could be exploited by terrorists or others who might carry out biological attacks. Given the fluidity of information in the Internet era, this option might end up being virtually the same as option A.
C: The research could continue but only under secret, classified conditions in military facilities. This alternative would limit, if not eliminate, the potential for the research to help protect civilian populations; it would foster paranoia about U.S. intentions; and the dangerous information might leak out anyway.
D: Ban all research, open or classified, aimed at making pathogens deadlier. This is my “least-bad” choice, because I believe that the risks of research like the recent H5N1 experiments outweigh potential benefits. In general, I favor unrestricted research and communication, just as I favor free speech. But if scientists keep introducing more lethal pathogens into the world, the odds grow that one of them will be unleashed intentionally or accidentally. Moreover, if the U.S. keeps pursuing research into new strains of infectious disease, other nations and groups are more likely to do so as well.
My fears stem in part from the history of biological-warfare research, as detailed in accounts such as A Higher Form of Killing: The Secret History of Chemical and Biological Warfare, by Robert Harris and Jeremy Paxman (Random House, 2002). Such research, which has been carried out at least since World War II by the U.S., United Kingdom, Soviet Union, Japan and other states, has repeatedly led to releases of pathogens. In 1979, biological-warfare experiments in a Soviet facility in Sverdlovsk triggered an anthrax epidemic that killed 70 people.
From the 1950s through the 1970s, the U.S. military sought to test the vulnerability of Americans to bio-attacks by releasing supposedly harmless bacteria in New York City, Washington, D.C., and other population centers. In 1950, a Navy ship sprayed Serratia marcescens over San Francisco; according to various published reports, the bacterium caused 11 cases of pneumonia, one fatal, in people with weak immune systems.
There have been at least 456 reported incidents, three fatal, in which workers at Fort Detrick, Maryland, the U.S. military’s main biological warfare facility, were infected by dangerous microbes. In 1989, researchers at Fort Detrick and a nearby facility were exposed to an Ebola virus, an event dramatized in the bestseller The Hot Zone by Richard Preston (Anchor, 1994). FBI investigators suspect that the anthrax-laced letters that killed five Americans in 2001 were concocted by a Fort Detrick biologist, Bruce Ivins, who committed suicide in 2008.
In all these ways—and no doubt others that we can’t even imagine—programs that are supposed to protect us from diseases may end up hurting us. Nature already does all too good a job of inventing new microbes to sicken and kill us. Do we really want to bring more into the world?
Photo of Malaysian bird-flu workers courtesy of Wikimedia Commons.