
Can Engineers and Scientists Ever Master "Complexity"?



I'm pondering complexity again. The proximate cause is the December 11 launch at my school, Stevens Institute of Technology, of a Center for Complex Systems & Enterprises. The center's goal is "to enable deep understanding of complexity and create innovative approaches to managing complexity." This rhetoric reminds me of the Santa Fe Institute, a hotbed of research on complex systems, which I criticized in Scientific American in June 1995 in "From Complexity to Perplexity." Speakers at the Stevens event include a mathematician I interviewed for that article, John Casti, who has long been associated with the Santa Fe Institute.

The event's organizers asked a few professors in the College of Arts & Letters, my department, to offer some concluding comments on complexity. I jumped at the chance, because I'm fascinated by the premise of complexity studies, which is this: Common principles underpin diverse complex systems, from immune systems and brains to climates and stock markets. By discovering these principles, we can learn how to build much more potent, predictive models of complex systems.

Here are some points I hope to make on December 11:




*Researchers have never been able to agree on what complexity is. The physicist Seth Lloyd has compiled a "non-exhaustive" list of more than 40 definitions of complexity, based on thermodynamics, information theory, linguistics, computer science and other fields. Can you study something if you're not sure exactly what it is?

*Previous attempts to master complexity have undergone a boom-bust cycle, as I pointed out in a 2010 obituary of mathematician Benoit Mandelbrot. Over the past century, researchers have become temporarily infatuated with various approaches to complex systems, including cybernetics, information theory, catastrophe theory, chaos theory, self-organized criticality and fractals (Mandelbrot's invention). In each case, excitement waned as the limits of the method became apparent.

*A key insight to emerge from chaos theory is that many complex systems are inherently unpredictable, because infinitesimal causes can have enormous consequences. This is the notorious butterfly effect, a term coined by meteorologist Edward Lorenz, which says that the fluttering of a butterfly's wings in Iowa can culminate in a typhoon in India. (A small numerical sketch after this list shows the effect in action.)

*Complex social systems are especially hard to model, as I pointed out in an essay in The Chronicle of Higher Education last year, because humans are so hard to model. Humans are, in a sense, the atoms of social systems, and yet unlike atoms, each individual human is unique, a product of his or her physiology and life history. And whereas atoms are indifferent to what scientists say about them, we humans may alter our behavior when we learn what scientists are saying about us. Think of the impact of eugenics and Marxism on the twentieth century. In other words, scientists' models of societies can change societies in ways that the models cannot anticipate. As the anthropologist Clifford Geertz used to say, social science is chasing a rapidly moving target, and it can never catch up.

*In the heyday of chaos and complexity in the 1980s and 1990s, researchers prophesied that increasingly powerful computers would lead to increasingly precise models of complex systems. Those forecasts were much too optimistic, as the struggles of artificial intelligence and artificial life have demonstrated. Moreover, computer models can alter reality in unpredictable ways. Take finance, which is one focus of the Stevens Center for Complex Systems & Enterprises. Many of the world's leading financiers, armed with the best computer models money can buy, were still caught off guard by the global economic crisis of 2008. And computer-based trading has made markets much more volatile. That leads me to my final and most important point:

*Engineers hope to master complexity through innovation, but new technologies can create more problems than they solve. (Nassim Nicholas Taleb, whom I brought to Stevens a year ago, makes this same point in his new book Antifragile.) In addition to finance, which I discussed above, the Center for Complex Systems & Enterprises also focuses on health care and national security. The U.S. leads the world in medical innovation, and yet our health care system is dysfunctional. Americans spend much more per capita on health care than any other nation in the world, and yet we rank 38th in longevity, just behind Cuba. Similarly, the U.S. has no rival in military spending or technology, but our ceaseless invention of new weapons systems is arguably imperiling our long-term security. Take drones: By showing that unmanned aircraft can carry out attacks with minimal risks to operators, the U.S. has triggered an international arms race. More than 40 nations and sub-national groups—including some, such as Iran and Hezbollah, hostile to the U.S. and its allies—are now developing drones.
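
Lorenz's point is easy to demonstrate for yourself. Here is a minimal sketch, not anything presented at the Stevens event: it integrates Lorenz's 1963 convection equations, with his standard parameter values, for two starting points that differ by one part in a billion, and prints how quickly the gap between them explodes.

```python
# A sketch of the butterfly effect: Edward Lorenz's 1963 convection
# model, integrated twice from starting points that differ by one part
# in a billion. The parameter values (sigma=10, rho=28, beta=8/3) are
# Lorenz's classic choices; the step size is an arbitrary small value.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations: dx/dt, dy/dt, dz/dt."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt):
    """Advance one time step with the fourth-order Runge-Kutta method."""
    k1 = lorenz(state)
    k2 = lorenz(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = lorenz(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = lorenz(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + (dt / 6.0) * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

a = (1.0, 1.0, 1.0)          # original trajectory
b = (1.0 + 1e-9, 1.0, 1.0)   # the "butterfly": a one-in-a-billion nudge

dt = 0.01
for step in range(1, 5001):
    a = rk4_step(a, dt)
    b = rk4_step(b, dt)
    if step % 1000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * dt:4.0f}   separation = {gap:.9f}")
```

By t = 50 the two runs have drifted to entirely different parts of the Lorenz attractor, even though they began closer together than any real-world measurement could distinguish. That is why weather forecasts degrade after a week or two no matter how powerful the computer.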

So what's my take-away message for my colleagues in the Center for Complex Systems & Enterprises? That they should give up trying to understand and master complex systems? And especially complex systems involving humans? Quite the contrary. Engineers and scientists have demonstrated their ability to invent and manage extraordinarily complex systems, which provide us with energy, transportation, food and water, health care, entertainment, communication, shelter and security. We must, and will, find ways to further minimize the downside and maximize the upside of civilization. But given the history of complexity research, our can-do optimism should always be tempered by skepticism and caution.

Image: uxmag.com.