
Why Language and Thought Resemble Russian Dolls

Michael Corballis is a professor emeritus at the University of Auckland, who has written extensively on the evolution of language and the origins of thought.

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American.


In his 2011 book The Recursive Mind, Corballis wrote about how the structure of human language allows for recursion, in which ideas are nested within each other: “He thinks that I think that he thinks.” Recursion allows the construction of sentences of theoretically unlimited complexity.

The main argument of The Recursive Mind is that recursion applies to thought processes and actions that are not limited to language itself, but characterize other aspects of human thought, such as our ability to imagine ourselves in the past or future. I queried Corballis on some of these ideas. An edited transcript follows:

You wrote a book in the last few years called The Recursive Mind. What is recursion and why is it important?




Recursion can refer to a procedure that references itself or similar procedures, or to a structure that contains an embedded structure of the same type. It is important because it allows one to build entities of theoretically unlimited complexity by progressively embedding structures within structures.
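Both senses of recursion that Corballis describes can be sketched in a few lines of code. The function below is a hypothetical illustration, not anything from the book: it calls itself (a procedure that references itself) to build a sentence containing an embedded sentence of the same type.

```python
def nested_thought(depth):
    """Return a sentence with `depth` levels of embedded clauses."""
    if depth == 0:
        return "it will rain"  # base case: a simple, unembedded clause
    # Recursive case: embed the same structure one level deeper.
    return "he thinks that " + nested_thought(depth - 1)

print(nested_thought(3))
# he thinks that he thinks that he thinks that it will rain
```

Because nothing in the definition limits `depth`, the procedure can in principle generate sentences of unbounded complexity, which is the point of the "theoretically unlimited" claim above.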

Some scholars think that language may be built on the use of recursive building blocks. Isn't that a fundamental tenet of the modern linguistics pioneered by Noam Chomsky?

Yes, Chomsky’s view of language is that it is recursive, and this gives language its potential for infinite complexity—or what he has also called “discrete infinity.” In recent formulations, this is achieved through an “unbounded Merge,” in which elements are merged, and the merged elements can then themselves be merged—a process that can be repeated without bounds. To Chomsky, though, language is essentially internal thought, a mental monologue, known as I-language, and not a means of communication. The structure of language is therefore a by-product of internal thought. This implies a common structure, called “universal grammar,” that underlies all languages. But there is growing doubt as to whether such a structure exists.
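The "unbounded Merge" idea can be pictured with a toy sketch (an illustration of the general principle, not a formal rendering of Chomsky's theory): Merge combines two syntactic objects into a new object, and that result can itself be merged again, without bound.

```python
def merge(a, b):
    """Combine two syntactic objects into a single new object."""
    return (a, b)

# Each output of merge can feed back into merge, so structures
# of arbitrary depth can be built from a finite vocabulary.
np = merge("the", "dog")   # ('the', 'dog')
vp = merge("saw", np)      # ('saw', ('the', 'dog'))
s = merge("John", vp)      # ('John', ('saw', ('the', 'dog')))
print(s)
```

The repeatability of the operation, rather than any single application of it, is what yields the "discrete infinity" mentioned above.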

Wasn't one of the flaws of Chomsky's work that he thought that the recursion as found in internal self-talk was not linked in any way to natural selection and evolution?

Yes, I think so. In Chomsky’s view, the recursive principle emerged in a single step—a “great leap forward”—at some particular point in human evolution, probably within the time span of our own species. This is contrary to Darwinian evolution, which postulates that change occurs in small, incremental steps, implying that something as complex as language could not have occurred in a single step. Chomsky also argues that language could not have evolved through natural selection because the elements of language are fundamentally symbolic, with no direct reference to the external world, and could not have been “selected” by natural events.

I think these are flawed arguments. The idea of a single step is based in part on evidence of a sudden appearance of symbolic thought in the archeological record, but one can just as easily make a case for a gradual rise. Apes and dogs can fairly easily learn to respond meaningfully to spoken words, suggesting that they, like humans, have innate, mental concepts, to which words are easily attached. Even in humans, the apparently abstract nature of language might well have arisen from more iconic or even pantomimed representations used as forms of communication that have become “conventionalized” through the generations, sustained through culture rather than biological endowment.

I would argue instead that recursive thought may have origins in our ape-like forebears over 6 million years ago, but gained increasing complexity during the Pleistocene, dating from over 2 million years ago, largely as an adaptation toward increasing social complexity. Language depended on this broader recursive capacity, but also owed its earlier origins to manual gesture, perhaps developing a more complex structure through pantomime, with the iconic structure eventually replaced by vocal gestures (speech) or the more arbitrary signs of signed languages.

Hasn't more recent work extended the idea of recursion? Isn't it thought that recursive behaviors may actually precede language—and may have led to the emergence of language?

To some extent Chomsky’s view can be seen as consistent with this, since recursion applies to what he calls I-language, which is the language of thought rather than of communication. Communicative language can then be regarded as a means of externalizing thought so that we can share our thoughts with others. It may have emerged after the “great leap forward” that gave us I-language, although Chomsky does not, as far as I know, suggest that thought and communicative language arose in sequence.

My own view is that recursive thoughts preceded language in broader ways. So-called theory of mind (“I know what you’re thinking”) or mental time travel (mentally reliving what I did yesterday, or foreseeing what I will do tomorrow) involve the embedding of thoughts within thoughts, and otherwise seem quite independent of what we understand as language. Navigation may be another example, as we embed maps within maps (my office in my house in my city in my country in the world). As I understand it, I-language is based on the structuring of internal symbols, whereas the extended examples I have given are more spatial or iconic than abstract, and therefore have a more direct relation with the external world. It is entirely plausible that they could have emerged through natural selection, which rescues language evolution from the “miracle” of a sudden great leap forward.

Since writing the book, I have moved further from the Chomskyan notion that language is uniquely human to finding the basis of mental time travel even in the ability of rats to “replay” and perhaps “preplay” trajectories in spatial environments. The aim of researchers should be to develop an account of the evolution of thought and language through natural selection, and not through some miraculous event within the past 100,000 years.

Why are the origins of human language considered to be one of the hardest problems in science?

The reason is that grammatical language, with its recursive structure, is considered unlike any other form of animal communication, and is restricted to our own species. Part of the difficulty is that we have no historical evidence to go on, since we are the only remaining species among the 20 or so hominins that split from the line leading to the great apes, and even the great apes closest to us do not appear to have grammatical language.

My sense is that we now have enough information from fossil evidence, primate communication, ancient DNA, and the structure of human cognition to begin to construct plausible Darwinian scenarios of how language might have evolved over the past 6 million or so years, and perhaps even earlier, without having to postulate a one-off miracle within the past 100,000 years.

Gary Stix, Scientific American's neuroscience and psychology editor, commissions, edits and reports on emerging advances and technologies that have propelled brain science to the forefront of the biological sciences. Developments chronicled in dozens of cover stories, feature articles and news stories, document groundbreaking neuroimaging techniques that reveal what happens in the brain while you are immersed in thought; the arrival of brain implants that alleviate mood disorders like depression; lab-made brains; psychological resilience; meditation; the intricacies of sleep; the new era for psychedelic drugs and artificial intelligence and growing insights leading to an understanding of our conscious selves. Before taking over the neuroscience beat, Stix, as Scientific American's special projects editor, oversaw the magazine's annual single-topic special issues, conceiving of and producing issues on Einstein, Darwin, climate change, nanotechnology and the nature of time. The issue he edited on time won a National Magazine Award. Besides mind and brain coverage, Stix has edited or written cover stories on Wall Street quants, building the world's tallest building, Olympic training methods, molecular electronics, what makes us human and the things you should and should not eat. Stix started a monthly column, Working Knowledge, that gave the reader a peek at the design and function of common technologies, from polygraph machines to Velcro. It eventually became the magazine's Graphic Science column. He also initiated a column on patents and intellectual property and another on the genesis of the ingenious ideas underlying new technologies in fields like electronics and biotechnology. Stix is the author with his wife, Miriam Lacob, of a technology primer called Who Gives a Gigabyte: A Survival Guide to the Technologically Perplexed (John Wiley & Sons, 1999).
