May 22, 2013
“Let’s say you have an axe. Just a cheap one, from Home Depot,” opens the horror-comedy novel John Dies at the End. “On one bitter winter day, you use said axe to behead a man.” This blow splinters the axe’s handle – so the story goes – and you get the hardware store to stick a new handle on the blade.
The repaired axe sits in your garage until one day the next spring, when you damage the blade while fending off “a foot-long slug with a bulging egg sac on its tail,” which requires another trip to the store to replace the axe-head. Unfortunately, when you arrive home, you’re greeted by the enraged reanimated corpse of the man you beheaded last year. He takes a long look at the weapon you’re holding, and he screams, “That’s the same axe that beheaded me!”
The question is, Is he right?
It’s a riddle with a long and distinguished pedigree, dating at least as far back as ancient Greece, where the historian Plutarch posed essentially the same question by telling a story involving an aging ship, an adventurous crew, and a notable absence of vengeful zombies. Philosophers through the ages have regarded the riddle as a sort of dead end, because any given person’s answer hinges not on any actual attribute of the axe or ship in question, but on how the answerer chooses to define the word “same.” In one sense, if we swap the ancient Greek ship for a zombie-slaying axe, we’re not even posing the “same” riddle.
Consider, by way of comparison, a set of identical twins. Although plenty of anecdotes attest that sets of twins often share the “same” tastes in music, pets and mates – even if they’ve spent most of their lives apart – they still remain distinct individuals, and may have converged on those tastes for surprisingly different reasons. Parents and close friends, meanwhile, can tell one twin from his brother or her sister by tiny differences in behavior – one might be a little more adventurous than the other, or more introverted, or pickier about food; and over a lifetime, these small differences tend to add up to more significant ones.
In other words, despite the genes – and the womb – shared by a set of identical twins, they can’t share everything in common. Each twin lives in an inner mental world that’s distinct from every other on earth – and what’s more, researchers are learning that tiny variations in brain development begin to lay the foundation of each twin’s individual personality long before birth.
And as a new study has discovered, those prenatal variations add up to distinct personalities even in animals engineered to be genetically identical – twins that are, in effect, exact clones of one another. These results raise pointed questions not only about our framing of the old “nature versus nurture” debate, but also about what it means, precisely, to be an individual.
Look back far enough in history, and you’ll find that the concept of “individuality” was once synonymous with that of the soul. For millennia, religious leaders taught that God (or the gods, or one’s karma) placed a specific soul into one’s body sometime around the moment of conception or birth, infusing that body not only with breath and animation, but also with a specific self, unique among all others and distinct from the bodily vessel that contained it.
Even in the midst of such dualistic thinking, though, philosophers warred over the questions of how much “self-stuff” descended directly from the heavens, and how much of it was shaped by a person’s life here on earth. Confucius wrote, “By nature, men are nearly alike; by practice, they get to be wide apart” – but in the West, Plato and Aristotle launched the debate in earnest by taking stances at opposite ends of it: Plato with his notion of the individual as a perfect heavenly form marred by earthly struggles; Aristotle with his idea of the newborn mind as a blank slate.
Echoes of this old Greek argument were still ringing when Shakespeare wrote The Tempest, though the debate had grown more nuanced over the intervening centuries. When Shakespeare’s character Prospero describes the monster Caliban as “a born devil, on whose nature nurture can never stick,” he’s alluding to the idea that an individual’s nature must somehow be shaped by social interaction – and coining the phrase “nature/nurture” while he’s at it.
In Shakespeare’s day, official Church doctrine taught that at least some “innate ideas” – those of God, of virtue and of absolute truth – existed in the soul from the moment of birth, and possibly even in the womb. It wasn’t for another century or so that a philosopher like John Locke could freely and openly describe the mind as a “tabula rasa” – a blank slate on which experience inscribes a personality – while the more cynical Thomas Hobbes could argue that the only truly innate idea is probably selfishness. And it wasn’t until the nineteenth century, when scientists like Gregor Mendel and Francis Galton began describing heredity in precise mathematical terms, that the “nature” part of the equation really came under the microscope.
Ever since James Watson and Francis Crick explained the structure and behavior of DNA, most scientists have understood that “genes load the gun, but environment pulls the trigger.” In other words, DNA doesn’t actually encode your individual traits; it encodes long lists of instructions for building various kinds of biological structures if – and only if – the right circumstances arise. In programming terms, much of your DNA could be described as a nested hierarchy of “IF-THEN” statements, each waiting for the signal to execute the code within. And many of those signals come not from the DNA itself, but from the environment in which it unzips.
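The “nested IF-THEN” analogy can be made concrete with a toy sketch. The gene names, signal levels, and thresholds below are entirely made up for illustration – the point is only that the same “code” produces different products depending on which environmental signals arrive:

```python
def express_stress_response(cortisol_level, nutrient_level):
    """Toy model of gene regulation as nested conditionals.

    All names and thresholds here are hypothetical; real regulation
    involves transcription factors, promoters, and far more layers.
    Returns the list of (made-up) gene products that get built.
    """
    built = []
    if cortisol_level > 0.7:                 # IF a stress signal arrives...
        built.append("stress_protein_A")     # ...THEN build this product,
        if nutrient_level < 0.3:             # and IF resources are also scarce...
            built.append("emergency_enzyme_B")  # ...THEN execute deeper code.
    return built

# Same "genome", different environments, different outcomes:
print(express_stress_response(0.9, 0.1))  # both signals fire
print(express_stress_response(0.2, 0.1))  # no stress signal: nothing built
```

Two organisms running identical “code” can thus end up building very different sets of proteins, simply because their environments trip different branches.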
When we speak of a “gene for Parkinson’s disease,” for example, we’re not talking about a gene that literally encodes instructions for producing the disease’s symptoms; we’re talking about certain genetic mutations – tiny alterations in DNA code – that can help create those symptoms under certain unfortunate circumstances. These kinds of correlations become even murkier when we speculate about “psychopath genes” or “genes for depression.” As much as Shakespeare’s character Prospero might disagree, psychologists have found that a nurturing childhood environment can soften even the most “devilish” genetic nature.
And yet, as much as we might agree that an individual’s environment pulls DNA’s triggers, we’re still left with the question of what exactly we mean by “environment” – just the external world, or internal biology as well? And should we start counting at birth, or even earlier?
As anyone who’s raised a litter of puppies or kittens knows, even newborn animals often look and act distinct from their siblings. Some are noisy and adventurous while others stick close to mom; some are born runts while others are bullies from day one. So it’s hard to imagine any newborn as a complete blank slate, because it’s clear that at least some individual differences must start taking shape before birth.
What’s less clear, though, is where exactly these differences come from, and how they develop. Placing the burden on DNA just pushes our questions back one level: We’re still left to wonder how brothers’ and sisters’ very similar genes actually add up to such striking differences, and what factors shape that individuation in the months leading up to birth. And since we’re still a long way from understanding exactly what every gene in the human genome – let alone every gene in many other animals – actually does, many developmental researchers have turned to a comparatively straightforward tactic: Tracking the progress of genetically identical twins – clones – as they grow up and learn about their world.
Since comic-book supervillains (fortunately) don’t fund our scientific research, it’s out of the question to perform these experiments on human clones. Instead, scientists typically work with mice, because their brain anatomy is roughly similar to that of humans, plus they’re small, they’re easy to breed, and they grow from infancy to adulthood in less than six months.
Study after study has validated our intuition that even genetically identical mice can be nurtured to develop distinct personalities within their first few weeks of life. Some spend most of their time exploring their cages, while others hog their water and food, or groom themselves for hours. Some get stressed out easily; others keep their cool in the midst of annoyances. Whatever “nature” DNA provides, then, must be pretty flexible.
But this explanation wasn’t specific enough for Gerd Kempermann, a professor of genomics at the Center for Regenerative Therapies (CRTD) in Dresden, Germany, and Speaker of the German Center for Neurodegenerative Diseases (DZNE). Kempermann and his team started with a large group of newborn genetically identical mice, measured a few attributes of their brains and bodies, then set them loose in a large and complex enclosure to see how each mouse would develop. The study was an unusual interdisciplinary cooperation between the neurobiologists from Dresden, studying mice, and psychologists at the Max Planck Institute for Human Development in Berlin.
“We tracked the movements of the mice and derived a measure we called ‘roaming entropy,’ essentially an estimate of exploration,” Kempermann says. “That measure gave us an idea of how well and how flexibly the mice covered their terrain or range, which seemed like a good proxy for ‘experience.’” The more a given mouse wandered, the higher its roaming entropy grew – and, Kempermann’s team figured, the wider variety of experiences it was getting.
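A measure like this can be sketched as the Shannon entropy of where a mouse spends its time: a mouse that covers its territory evenly scores higher than one that lingers in a single corner. This is an illustrative reconstruction, not the study’s exact formula, and the zone labels are invented:

```python
import math
from collections import Counter

def roaming_entropy(visits):
    """Shannon entropy (in bits) of a mouse's observed locations.

    `visits` is a list of zone labels, one per observation interval.
    Uniform coverage of many zones yields high entropy; staying in
    one spot yields zero. (Illustrative sketch only; the published
    measure may differ in detail.)
    """
    counts = Counter(visits)
    total = len(visits)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

homebody = roaming_entropy(["nest"] * 10)                          # 0.0 bits
explorer = roaming_entropy(["nest", "feeder", "tunnel", "wheel"] * 5)  # 2.0 bits
```

The explorer, visiting four zones equally, scores log2(4) = 2 bits, while the homebody scores zero – a simple proxy for how varied each animal’s experience is.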
Three months later, the researchers reexamined the mice, and found not only that their brains had grown more and more individually distinct over time, but that the brains of the mice with the highest roaming entropy had grown and changed the most of all. Specifically, these mice sprouted far more new nerve cells in their hippocampus – a brain region crucial for forming and retrieving memories in mice and in humans – than less adventurous mice did.
“At the beginning of the experiment,” Kempermann says, “those differences were nonexistent or very subtle. But by the end of the study, it was clear that each mouse’s hippocampus had been shaped according to that mouse’s individual needs, by its own experiences.” In other words, he says, “experience shapes the brain and this adaptation allows altered behavior.”
As striking as these developmental distinctions are, they still raise some of the same questions that got Plato and Aristotle arguing in the first place: Are some individuals predisposed to explore more adventurously than others? If so, how and when might those differences start to take hold?
Contrary to what most of us were taught in school, the DNA you’re born with isn’t the same DNA you die with. Not exactly. Challenges you survive throughout your life lead to measurable changes in the ways certain pieces of your DNA are made accessible for use or prevented from being read – and some of these regulatory changes can even be passed on to your descendants. Scientists have coined the term “epigenetics” (“over genetics”) to describe the study of aspects of genetics that are due not to the genetic code of DNA itself, but to the ways in which it’s read. And many researchers believe that epigenetics may be responsible for the differences that emerge in genetically identical animals – perhaps from their first few months in the womb.
Studies of human identical twins have found that tiny epigenetic differences start increasing in young adulthood. Other researchers have suggested that factors like maternal stress and disease, early nutrition, and even position in the uterus may trigger individual epigenetic variations months before birth. Though no study has yet examined mouse hippocampal cells in the womb, the fact that baby animals can look and behave so differently from their twins seems to argue strongly for the idea that these changes have already shaped some subtle individual differences by the time of birth – even in genetically identical individuals.
After that point, a sort of self-reinforcing loop takes over. “Small perturbations in genes and anatomy,” Kempermann says, “may lead to individual differences in action tendencies – and those differences, in turn, may trigger differences in experience that accumulate over time.” As those individual experiences trigger even more epigenetic changes – not to mention differences in memories, and thus in brain function – an old mouse’s (or human’s) genes, body and brain cells may end up looking and working very differently from those of its identical twins.
In short, Kempermann says, “Experience matters. Genes are clearly critically important – but even given identical genes and an identical environment, different experiences lead to different personalities, and to the individualization of the brain.” It’s a comforting thought, in its way: Even if DNA encodes the basic recipe for your personality; even if epigenetic changes started cooking up the ingredients months before you were born, your personal choices and experiences also define how those ingredients come together – and, to a certain extent, what sorts of recipes you pass on to your children.
It doesn’t seem reasonable, then, to claim that individuality is somehow innate at the moment of conception, or at any single point in anyone’s development. Rather, it’s a process – a series of changes that begin at the molecular level and add up over time, spurred on by the unbroken flow of unique accidents, triumphs, letdowns and challenges each individual faces every day, from pre-conscious prenatal development to the last gasping breaths of old age. Individuality isn’t the same as “personhood” or “selfhood” – it is, in a word, uniqueness.
Like the axe or ship from the riddle, even individuals that share the same mother, the same environment, and the same genes are never exactly the same. Nor are you the same person you were yesterday, or a year ago, or at birth or conception. Your individuality grows and changes every day, and shapes the circumstances that’ll keep making you even more unique, long into the future.
Related: Becoming an individual twin isn’t about genetics or environment, but how you experience them by Scicurious