Kaspersky Lab isn't as well known in the U.S. as Symantec's Norton products, but the company is a major international provider of anti-virus tools. Its co-founder, Eugene Kaspersky, is one of the world's most prominent computer security experts. Recently he gave an interview to the German news magazine Der Spiegel about the future of cyber attacks and the potential for full-scale cyber war. The entire interview is interesting and I encourage you to read it in full, but one particular passage jumped out at me:
Kaspersky: ... Everything depends on computers these days: the energy supply, airplanes, trains. I'm worried that the Net will soon become a war zone, a platform for professional attacks on critical infrastructure.
SPIEGEL: When will that happen?
Kaspersky: Yesterday. Such attacks have already occurred.
SPIEGEL: You're referring to Stuxnet, the so-called "super virus" that was allegedly programmed to sabotage Iranian nuclear facilities.
Kaspersky: Israeli intelligence unfortunately doesn't send us any reports. There was a lot of talk—on the Internet and in the media—that Stuxnet was a joint U.S.–Israeli project. I think that's probably the most likely scenario. It was highly professional work, by the way, and one that commands a lot of respect from me. It cost several million dollars and had to be orchestrated by a team of highly trained engineers over several months. These were no amateurs; these were total professionals who have to be taken very seriously. You don't get in a fight with them; they don't mess around.
SPIEGEL: What kind of damage can a super virus like this inflict?
Kaspersky: Do you remember the total power outage in large parts of North America in August 2003? Today, I'm pretty sure that a virus triggered that catastrophe. And that was eight years ago.
What gets lost here is the distinction between a targeted virus such as Stuxnet, which appears to have been designed explicitly to disrupt the specialized industrial control systems built by Siemens (and used in Iranian centrifuges), and a virus that generates enough random havoc to incidentally muck up industrial systems, too. In the latter case, a blackout is just collateral damage. The security expert Bruce Schneier has written that the Blaster worm contributed to the 2003 blackout by disrupting the secondary systems that help keep the grid up and running (later reports that Chinese agents specifically targeted the computers have been thoroughly debunked). Schneier writes:
The computer systems we use on our desktops are not reliable enough for critical applications. Neither is the Internet. The more we rely on them in our critical infrastructure, the more vulnerable we become. The more our systems become interconnected, the more vulnerable we become.
In short: we shouldn't use Windows to run critical industrial applications, as Windows machines are especially vulnerable to targeted viruses and random acts of chaos.
In this month's issue of Scientific American, Professor David Nicol describes in detail the security problems that plague the U.S. power grid, focusing in particular on the weaknesses involved in using Windows operating systems to run critical infrastructure. He highlights the problem with upgrades: Unlike the computer systems you use at work, the computers in a power plant can't be taken down once a week for system maintenance, because power needs to keep flowing 24/7/365. As a result, power plant computers remain vulnerable to known viruses long after security patches become available. Nicol writes:
Grid operators also have a deep-rooted institutional conservatism. Control networks have been in place for a long time, and operators are familiar and comfortable with how they work. They tend to avoid anything that threatens availability or might interfere with ordinary operations.
This is the situation we find ourselves in. Eight years ago a computer worm apparently helped knock out power across much of the Northeast, entirely by accident. The next blackout may be no accident.
Photo of a Toronto street corner during the 2003 blackout by John R. Southern (krunkwerke) on Flickr.