
How Alan Turing Invented the Computer Age

Alan Turing. Credit: Getty Images

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American


In 1936, whilst studying for his Ph.D. at Princeton University, the English mathematician Alan Turing published a paper, “On Computable Numbers, with an Application to the Entscheidungsproblem,” which became the foundation of computer science. In it Turing presented a theoretical machine that could solve any problem that could be described by simple instructions encoded on a paper tape. One Turing Machine could calculate square roots, whilst another might solve Sudoku puzzles. Turing then demonstrated that you could construct a single Universal Machine that could simulate any other Turing Machine. One machine solving any problem, performing any task for which a program could be written—sound familiar? He’d invented the computer.
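
To make the idea concrete, here is a minimal sketch of such a machine in Python (my own toy example, not Turing’s original notation): a table of simple instructions drives a read/write head over a tape, and this particular table adds one to a binary number. Swap in a different table and you get a different machine.

```python
from collections import defaultdict

def run_turing_machine(program, tape, state="start", blank="_", max_steps=10_000):
    """Run a single-tape Turing machine until it halts or runs out of steps.

    `program` maps (state, symbol) -> (new_state, symbol_to_write, head_move),
    where head_move is -1 (left) or +1 (right).
    """
    cells = defaultdict(lambda: blank, enumerate(tape))  # the tape as a sparse dict
    head = 0
    for _ in range(max_steps):
        key = (state, cells[head])
        if state == "halt" or key not in program:
            break
        state, cells[head], move = program[key]  # write the new symbol under the head
        head += move
    return "".join(cells[i] for i in range(min(cells), max(cells) + 1)).strip(blank)

# Instruction table for "add 1 to a binary number":
# scan right past the last digit, then carry back towards the left.
program = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),  # past the last digit: begin carrying
    ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, carry continues
    ("carry", "0"): ("halt",  "1", -1),  # 0 + carry = 1, done
    ("carry", "_"): ("halt",  "1", -1),  # carried past the first digit: new leading 1
}

print(run_turing_machine(program, "1011"))  # prints "1100" (11 + 1 = 12 in binary)
```

A Universal Machine is then just this same loop with the instruction table itself written onto the tape alongside the input, so that one fixed machine can imitate any other.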

Back then, computers were people; they compiled actuarial tables and did engineering calculations. As the Allies prepared for World War II, they faced a critical shortage of human computers for military calculations. When men left for war, the shortage got worse, so the U.S. mechanized the problem by building the Harvard Mark I, an electromechanical monster 50 feet long. It could do calculations in seconds that took people hours.

The British also needed mathematicians to crack the German Navy’s Enigma code. Turing worked at Britain’s top-secret Government Code and Cypher School at Bletchley Park, where code-breaking became an industrial process; 12,000 people worked three shifts, 24/7. Although the Poles had cracked Enigma before the war, the Nazis had since made the Enigma machines more complicated; there were approximately 10¹¹⁴ possible permutations. Turing designed an electromechanical machine, called the Bombe, that searched through the permutations, and by the end of the war the British were able to read all daily German Naval Enigma traffic. It has been reported that Eisenhower said the contribution of Turing and others at Bletchley shortened the war by as much as two years, saving millions of lives.


As the 1950s progressed, businesses were quick to see the benefits of computers, and business computing became a new industry. These computers were all Universal Turing Machines—that’s the point: you could program them to do anything.

“There will positively be no internal alteration [of the computer] to be made even if we wish suddenly to switch from calculating the energy levels of the neon atom to the enumeration of groups of order 720. It may appear somewhat puzzling that this can be done. How can one expect a machine to do all this multitudinous variety of things? The answer is that we should consider the machine to be doing something quite simple, namely carrying out orders given to it in a standard form which it is able to understand.” – Alan Turing

By the 1970s, a generation had grown up with “electronic brains,” and they wanted their own personal computers. The problem was that they had to build them themselves. In 1975 some hobbyists formed the Homebrew Computer Club; they were excited by the potential of the new silicon chips to let them build their own computers.

One Homebrew member was a college dropout called Steve Wozniak, who built a simple computer around the MOS 6502 microprocessor and hooked it up to a keyboard and television. His friend Steve Jobs called it the Apple I and found a Silicon Valley shop willing to buy 50 of them for $500 each. Apple had its first sale, and Silicon Valley’s start-up culture was born. Another college dropout, Bill Gates, realized that PCs needed software and that people were willing to pay for it—his Microsoft would sell the programs.

Turing’s legacy does not end there. In 1950 he published a paper called “Computing Machinery and Intelligence.” He believed that computers would eventually become so powerful that they would think; he envisaged a time when artificial intelligence (AI) would be a reality. But how would you know if a machine was intelligent? He devised the Turing Test: a judge sitting at a computer terminal types questions to two hidden entities, one a person and the other a computer, and then decides which is which. If the judge cannot reliably tell the computer from the person, the computer has passed the Turing Test and can be considered intelligent.
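
As a toy illustration of that protocol (a hypothetical sketch with canned respondents, not a real chatbot), the judge sees only anonymous transcripts and must guess which respondent is the machine:

```python
import random

def machine(question: str) -> str:
    # A trivial stand-in for the program under test: an evasive canned reply.
    return "I'd rather not say; what is your own view?"

def human(question: str) -> str:
    # A stand-in for the person at the other terminal.
    return f"Funny you should ask about '{question}'; it reminds me of my childhood."

def imitation_game(questions, judge):
    # Hide the identities behind anonymous labels, shuffled so the judge can't cheat.
    respondents = {"A": machine, "B": human}
    if random.random() < 0.5:
        respondents = {"A": human, "B": machine}
    transcripts = {label: [f(q) for q in questions] for label, f in respondents.items()}
    guess = judge(transcripts)            # the judge names the label it believes is the machine
    return respondents[guess] is machine  # True: machine identified; False: machine "passed"

# A judge guessing at random catches the machine only about half the time.
questions = ["Do you enjoy poetry?", "What is 7 times 8?"]
caught = imitation_game(questions, judge=lambda t: random.choice(sorted(t)))
print("machine identified" if caught else "machine passed this round")
```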

Although Turing’s vision of AI has not yet been achieved, aspects of AI are increasingly entering our daily lives. Car satellite navigation systems and Google search algorithms use AI. Apple’s Siri on the iPhone can understand your voice and respond intelligently. Car manufacturers are developing cars that drive themselves; some U.S. states are drafting legislation that would allow autonomous vehicles on the roads. Turing’s vision of AI will soon be a reality.

In 1952 Turing was prosecuted for gross indecency, as being gay was then a crime in Britain, and he was sentenced to chemical castration. It’s believed that this caused depression, and in 1954 Turing committed suicide by eating an apple poisoned with cyanide. Outside academia Turing remained virtually unknown because his World War II work was top secret. Slowly word spread about Turing’s genius, his invention of the computer and his pioneering work on artificial intelligence, and after a petition campaign in 2009 the British Prime Minister Gordon Brown issued a public apology that concluded:

“…on behalf of the British government, and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.”

June 23, 2012, marks the centenary of Alan Turing’s birth. I’m happy to say that Turing is finally getting the recognition he deserves, not just for his vital work in the war, but also for inventing the computer—the Universal Machine—that has transformed the modern world and will profoundly influence our future.

Image of Apple I computer via Wikimedia Commons

Ian Watson is an Associate Professor of Artificial Intelligence at the University of Auckland, New Zealand, where he researches machine learning and Game AI. Ian has just written a popular science book called The Universal Machine: From the Dawn of Computing to Digital Consciousness. Ian blogs about the history and future of computing at The Universal Machine. You can follow him on Twitter @driwatson, Google+, or Facebook.
