June 21, 2012
After months of contentious debate, the journal Science is publishing a controversial study on Friday about H5N1 avian influenza's ability to spread among mammals. The report faced a tortuous path to publication as some researchers sought to censor the study's findings for fear that they could be replicated and put to nefarious use. In a Science Perspective article accompanying the H5N1 research led by Ron Fouchier at the Erasmus Medical Center in the Netherlands, cryptographer Bruce Schneier draws parallels between cyber security and efforts to control access to scientific data.
Whether a virus affects computers or living things, a head-in-the-sand approach to protecting information about the virus's nature is unlikely to be successful, according to Schneier, chief security technology officer and co-founder of digital security services firm BT Counterpane. He points out that virologists take significant risks when they rely on secrecy to protect their findings. Secrets are difficult to keep, and once the data is exposed, the researchers have little recourse—the genie is out of the bottle.
Likewise, the omission of technical details—initially proposed by the U.S. government's National Science Advisory Board for Biosecurity (NSABB) for Fouchier's research as well as another H5N1 study led by Yoshihiro Kawaoka at the University of Wisconsin–Madison published last month in Nature—provides poor security, according to Schneier. (Scientific American is part of Nature Publishing Group.) Kawaoka's research detailed the mutation of lab-made strains of the H5N1 avian flu virus to the point where it became highly transmissible in ferrets. Sooner or later someone would have filled in any missing pieces, either through trial and error or by working backward from the study's results.
It is also a mistake to think that an experiment is too difficult for others to replicate for lack of access to the necessary equipment. “What is impossible today will be a Ph.D. thesis in 20 years, and what was a Ph.D. thesis 20 years ago is a high-school science fair project today,” Schneier says.
Although much research is now analyzed, stored and disseminated via computer networks, scientists should understand something that most businesses still haven't grasped: "Everything gets hacked," Schneier says. The list of organizations whose data has been compromised by cyber attackers or insiders leaking information includes banks, government agencies, military institutions—and the list goes on. One of the most glaring recent examples is the professional networking Web site LinkedIn. After a massive security breach and a seemingly complacent response to the theft of millions of poorly protected customer passwords, LinkedIn now faces a $5-million class action suit and, perhaps worse, a general sense of mistrust among its users.
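(The passwords in the LinkedIn breach were "poorly protected" because they were stored as unsalted SHA-1 hashes, so identical passwords produced identical, bulk-crackable digests. As a hypothetical illustration, not drawn from Schneier's article, the minimal Python sketch below contrasts that scheme with a salted, deliberately slow key-derivation function.)

```python
import hashlib
import os

def weak_hash(password):
    # Unsalted, fast hash (the scheme LinkedIn used): identical passwords
    # yield identical digests, which precomputed tables can crack in bulk.
    return hashlib.sha1(password.encode()).hexdigest()

def strong_hash(password, salt=None):
    # A unique random salt per user plus a slow key-derivation function
    # (PBKDF2 here) forces attackers to crack each hash individually.
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

# Two users who pick the same password:
print(weak_hash("hunter2") == weak_hash("hunter2"))  # True: the weak scheme leaks the match
salt_a, hash_a = strong_hash("hunter2")
salt_b, hash_b = strong_hash("hunter2")
print(hash_a == hash_b)  # False: distinct salts hide the match
```

At login, the server recomputes `strong_hash(password, stored_salt)` and compares digests; the salt itself need not be secret, only unique per user.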
Trying to keep scientists from publishing the results of sensitive research for fear that an enemy might use this information is problematic, says Schneier. If the scientific community is facing a difficult problem with serious consequences, someone, somewhere will be working on a solution. And even if a prestigious journal like Science or Nature had refused to publish the H5N1 research, it would have found its way to the public online, where there are no international borders.
Labs have some recourse against “opportunistic attackers”—those who exploit weak security for financial gain. Against them, relative security is important. “You are safe if you are more secure than other networks,” Schneier says. Targeted attacks are another matter. “It is almost impossible to secure a network against a sufficiently skilled and tenacious adversary,” he writes. “All we can do is make the attacker’s job harder.”
Schneier’s Perspective article elaborates on a presentation he gave at April’s Royal Society H5N1 research conference in London. “I wasn’t asked to comment on H5N1—I was asked to explain the realities of cyber security,” he says. “But after listening to everyone talk about the issue, I realized that I had a lot of related experience that might be useful to the virology community.”
Although this was the first time Schneier has specifically applied lessons from computer security and cryptography to another academic discipline in this way, the nature of the data being protected is irrelevant, he says. “The names and motivations of the attackers are slightly different, but so what? It’s all on the same computers and networks,” he adds. “The attack tools are all the same.”
Image courtesy of the U.S. Geological Survey