Will Computers Ever Know Everything?

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American.


What was Alan Turing's greatest contribution? Here was a man who invented the idea of the modern computer, a man upon whose insights the information technology revolution firmly rests. He was the first to understand that instructions are themselves data, giving algorithms a capacity for the kind of recursive thinking that makes humans unique. (I think that I think, therefore I am.) He realized that machines would become so smart that they would eventually be mistaken for humans—this in an era when the chairman of IBM was claiming that there would be a worldwide market for perhaps five computers—and devised a test to tell man from automaton. Oh, and he also led the successful Allied effort to break Germany's secret Enigma codes during World War II, having realized that "no one else was doing anything about it and I could have it to myself."
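To make the "instructions are data" point concrete: in any modern programming language, a program can be stored, inspected and executed like any other value. The sketch below is purely illustrative (it uses Python rather than Turing's 1936 formalism, and its tiny instruction set is invented for the example), but it shows an interpreter treating a program as an ordinary list.

```python
# Illustrative only: a toy "program" represented as plain data (a list of
# instructions) and a separate interpreter that executes it. The instruction
# names here are made up for the example.

def run(program, value):
    """Execute a list of (operation, argument) pairs against a starting value."""
    for op, arg in program:
        if op == "add":
            value += arg
        elif op == "mul":
            value *= arg
    return value

double_then_add_one = [("mul", 2), ("add", 1)]   # the program is just data
print(run(double_then_add_one, 5))               # prints 11
```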

An essay in this week's issue of Science magazine by Andrew Hodges, the dean of Wadham College at Oxford University and the author of Alan Turing: The Enigma, argues that, great as these achievements are, Turing's greatest contribution was defining the limits of what computers can "know"—that is, what is computable. By formalizing the computability question in 1936, Turing illuminated the deeper issue of what humans could know: Is our knowledge limited in the same way a computer's is? Or do we have some sort of mental "intuition" (Turing's word) that supersedes the power of mere machinery?
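The best-known example of such a limit is Turing's halting problem: no program can decide, for every program and input, whether that program will eventually stop. The sketch below is a hypothetical, Python-flavored rendering of the self-referential argument, not a quotation from Turing's paper.

```python
# A hypothetical sketch of why a universal halting tester cannot exist.
# Suppose halts(program, data) could always answer "will this program,
# run on this data, eventually stop?" Then we could build a program
# that defeats it.

def halts(program, data):
    # Pretend this returns True if program(data) eventually stops,
    # and False otherwise. Turing showed no such total procedure exists.
    raise NotImplementedError

def contrary(program):
    # Do the opposite of whatever halts() predicts about running
    # the program on its own source.
    if halts(program, program):
        while True:        # predicted to halt, so loop forever
            pass
    else:
        return             # predicted to loop, so halt immediately

# Now ask: does contrary(contrary) halt? Whatever halts() answers,
# contrary does the opposite, so halts() cannot be both correct and total.
```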

Turing wasn't sure (though he suspected that the strange rules of quantum mechanics might give our brains some non-deterministic wiggle room). Three quarters of a century later, we're not much closer to an answer. Even in this age of "big data," where computers churn through gobs of information to come up with cannily human-like responses (consider IBM's Jeopardy-beating Watson computer, named, incidentally, after the CEO with poor computer-demand forecasting skills), humans are far better at everyday tasks like making sense of a scene. Artificial intelligence remains a dream.


Turing's work on computability led to an even deeper question, according to Hodges: "Does computation with discrete symbols give a complete account of the physical world?" In other words, is the world computable? Can a machine, in principle, not just match the intellectual capabilities of human beings but surpass them? Can a computer know everything?

This past December we asked the Harvard researcher David Weinberger to profile the Living Earth Simulator, a project that would ingest vast amounts of data and compute the future of human civilization. It is perhaps the most ambitious effort ever launched to test the computability of the world. (Weinberger came away unconvinced.) Perhaps the Living Earth Simulator will succeed in due time, and computers will be able to predict the future.

More likely it will not, and another 75 years hence we will still be struggling with the rich philosophical loam Turing left us. You don't need a computer to predict that we're going to require a mind as exceptional as Turing's to make sense of it.

Image courtesy Chris Brown (zoonabar) on Flickr

Michael Moyer is the editor in charge of physics and space coverage at Scientific American. Previously he spent eight years at Popular Science magazine, where he was the articles editor. He was awarded the 2005 American Institute of Physics Science Writing Award for his article "Journey to the 10th Dimension," and has appeared on CBS, ABC, CNN, Fox and the Discovery Channel. He studied physics at the University of California at Berkeley and at Columbia University.
