ANATOLIA, 9,000 BC – The rising sun advanced over the hills, engulfing the arid land in a blaze of warmth. Below the amber sky lay a patchwork of wheat fields, in which a scattering of stooped figures silently harvested their crops.
Later, their harvest would be scrutinised, and only the largest grains selected for planting in the autumn.
A revolution was occurring. For the first time in 3.6 billion years, life had subverted the evolutionary process and begun to steer it not by natural selection, but by artificial selection. Selection pressures became synonymous with the needs of the architects: the farmers. The technique drove a widespread transition from hunter-gathering to agriculture, a shift that would transform human culture and lay the foundations for the first civilisations. Moreover, in their efforts to permanently remodel the characteristics of a species, early farmers were pioneers of genetic modification.
The modification of plants would later be followed by the domestication of animals, and perhaps eventually, human beings.
From the promotion of eugenics to justify genocide in Nazi Germany, to the mass-produced and homogenous population of Aldous Huxley’s dystopian future in the novel ‘Brave New World’, to ‘Frankenfood’, genetic engineering has amassed a reputation as a treacherous pursuit. However, a recent development appears to have slipped under the public radar: human pre-natal diagnosis. Screening foetal genomes to eliminate genetic ‘defects’ may lead to incremental changes in the human genetic reservoir, a permanent shift in our characteristics and eventually, self-domestication.
The technique involves testing for diseases in a human embryo or foetus, and may be performed to determine if it will be aborted, or in high-risk pregnancies, to enable the provision of immediate medical treatment on delivery. Until recently, pre-natal screening required invasive procedures such as amniocentesis, in which the fluid from the sac surrounding the foetus, the amnion, is sampled and the DNA examined for genetic abnormalities. The procedure can only be performed after the 15th week of pregnancy, and carries a 1% risk of miscarriage and the possibility of complications. In the light of such limitations and risks, the technique hasn't gained widespread popularity.
However, a research group based at the University of Washington in Seattle has developed an alternative. Their simple test can be performed weeks earlier than current pre-natal screening, and crucially, requires only a maternal blood sample and DNA from both parents. The technique exploits the fragments of foetal DNA circulating in the mother’s blood plasma: by sequencing these fragments many times over, the foetal genome can be pieced together and distinguished from the maternal and paternal genomes by statistical comparison. It’s quick, harmless, and may soon become widely available. Therein lies the problem. Such a tool is a powerful new route to gleaning information about unborn offspring. The object of the exercise: to identify foetuses with the earmarks of genetic disease as candidates for abortion.
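The statistical logic can be sketched with a toy calculation (the numbers and the helper function here are invented for illustration; the real method is far more sophisticated). At genome positions where the mother carries two copies of one allele and the father carries a different one, any paternal-type reads in the plasma must have come from the foetus, which lets us estimate what fraction of the circulating DNA is foetal:

```python
# Toy sketch (illustrative only): estimating the fetal fraction of
# cell-free DNA in a maternal plasma sample. At sites where the mother
# is homozygous (e.g. AA) and the father carries a different allele (B),
# any B reads in the plasma must come from the foetus. If a fraction f
# of the plasma DNA is foetal and the foetus is heterozygous (AB),
# we expect the B allele in roughly f/2 of the reads at each such site.

def estimate_fetal_fraction(sites):
    """sites: list of (paternal_allele_reads, total_reads) pairs at SNPs
    where the mother is homozygous and the father is not."""
    paternal = sum(p for p, _ in sites)
    total = sum(t for _, t in sites)
    return 2 * paternal / total

# Simulated read counts at four informative SNPs, assuming ~10% foetal DNA:
sites = [(52, 1000), (48, 1000), (55, 1000), (45, 1000)]
print(round(estimate_fetal_fraction(sites), 3))  # → 0.1
```

Once the foetal fraction is known, a consistent excess of reads from a particular chromosome, or an unexpected allele balance at a disease locus, can be attributed to the foetus rather than to noise.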
Inevitably, the technique is vulnerable to abuse, and will empower parents to select the characteristics of their progeny pre-emptively, in a step towards ‘designer babies’. Nevertheless, there is a more immediate concern. Screening for inheritable disorders requires knowledge of their genetic basis, which can be dangerously precarious. Some conditions, such as Down’s syndrome, characterised by the presence of an extra chromosome, are glaringly obvious. Others have more subtle and complex genetic origins. Just as the invention of vaccines to prevent infectious diseases was followed by attempts at total eradication, our efforts to eliminate genetic characteristics may have permanent consequences.
Autism spectrum disorder (ASD) has already been singled out as a potential target for the screening technology. The disorder, which is characterised by difficulties in communication and social interaction, and repetitive or stereotyped behaviours and interests, has a strong but elusive genetic basis. Intriguingly, there has been much speculation that the genes involved in the development of ASD may be linked to mathematical and scientific ability.
The theory has roots in the overlap between certain useful aptitudes in technical professions and behaviour typical of ASD. An obsessive attention to detail, the ability to understand predictable rule-based systems (‘systemising’), and a narrow range of interests are traits characteristic of both groups. Professor Simon Baron-Cohen of the University of Cambridge is a strong proponent of the idea, and has suggested that scientist couples are more likely to have children with the disorder. It’s a compelling idea with intuitive plausibility, but the evidence isn’t there (yet). Until we know better, perhaps restraint is needed before eliminating these potentially important genes from our gene pool. There has been speculation that Einstein and Newton were ‘on the spectrum’ – what if we inadvertently ‘cured’ the future world of similar talent?
Will our descendants be less than human? Another candidate for remedy with reproductive technology is schizophrenia. The disorder affects cognition, and can lead to chronic problems with emotional responsiveness. The 1% prevalence of schizophrenia makes it an apt target for prevention. However, the globally consistent and high incidence of this disease may be an indicator of its association with advantageous genetic characteristics. The ‘social brain hypothesis’, the main theory to explain the evolution of schizophrenia, suggests that the genes associated with the disorder were retained because they also underpin higher-order cognitive traits, including language and the ability to interpret the thoughts and emotions of others. On this view, schizophrenia is the cost that humans pay for being able to communicate, and as such, the genes responsible may be an essential component of the human gene pool. As with ASD, the elimination of the disease may have unintended consequences, and permanently alter the social dynamics within our species.
This mechanism, termed a ‘heterozygote advantage’, can arise from the benefits of carrying two different forms, or ‘alleles’, of a gene, as opposed to two copies of the same variant. The phenomenon has been proposed for a wide variety of genetic diseases; however, its usefulness often depends on environmental context. Because human lifestyles have diversified to such an extent from those of our ancestors, certain advantages may be outdated. The malaria protection conferred by carrying a single sickle-cell allele is hardly worth the risk of debilitating anaemia if you end up with two – especially in a modern world where anti-malarial medication is widely available. The systematic eradication of this disorder, and many others, will be a welcome and significant medical advancement. But caution is needed.
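The population-genetic reasoning can be made concrete with a toy simulation (the fitness values below are invented for illustration, not measured sickle-cell data). When carriers of one copy of an allele are fitter than either kind of non-carrier, selection holds the allele at an intermediate frequency rather than purging it:

```python
# Toy sketch of heterozygote advantage. A = normal allele, S = sickle-type
# allele; heterozygotes (AS) are fittest, so the population settles at an
# intermediate S frequency instead of losing S entirely.

W_AA, W_AS, W_SS = 0.85, 1.0, 0.2  # relative fitnesses (illustrative)

def next_generation(q):
    """One generation of selection; q is the frequency of the S allele."""
    p = 1 - q
    mean_w = p*p*W_AA + 2*p*q*W_AS + q*q*W_SS
    # standard single-locus selection equation for the new S frequency
    return (p*q*W_AS + q*q*W_SS) / mean_w

q = 0.01  # start with S rare
for _ in range(500):
    q = next_generation(q)

# Analytical equilibrium: q* = s / (s + t), where s = 1 - W_AA, t = 1 - W_SS
q_star = (1 - W_AA) / ((1 - W_AA) + (1 - W_SS))
print(round(q, 3), round(q_star, 3))  # → 0.158 0.158
```

The simulated frequency converges to the analytical balance point: the stronger the penalty on non-carriers relative to homozygous carriers, the higher the frequency at which the ‘disease’ allele is maintained.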
Following ENCODE, a recent project to build a comprehensive map of the functional elements in the human genome, a biochemical function was assigned to 80% of our DNA sequence. However, our genomes are still poorly understood. Many sequences are multi-functional, and knowledge of the mechanisms of gene expression is essential to any meaningful model.
We urgently need a regulatory framework for the use of procedures such as pre-natal screening, and to exercise restraint in gene eradication. A detailed assessment and forecast of the long-term consequences is essential before a potentially corrosive procedure becomes entrenched in modern society. The alternative: we might just end up domesticating ourselves.
DNA image: Altered from original by Sponk on Wikimedia Commons.