Long before I began stockpiling a rather eclectic collection of curiously homoerotic Men’s Fitness magazines in my closet as a randy teenager, decades still before the global pornification of the 21st-century Internet age, my tender childhood libido found a secret refuge amidst the colorful scenes contained in a handful of old university textbooks sitting happily among my parents’ bookshelves in the family room of our home. There were of course images of famous imposing nudes made of marble and granite, and also the vibrant Romantic paintings with fully liberated genitalia proudly occupying the glossy pages of those cumbersome art history volumes common to 1960s liberal arts colleges. And, indeed, I am forever indebted to the ancient Romans for visually educating me—a wide-eyed, insignificant little prehomosexual first-grader in a suburban cul-de-sac in 1980s Virginia—about the idealized male form.
But it is safe to say that my very first exposure to any naked human was actually to a very different species of human altogether, when at the age of about five or six, my eyes lit upon an image—one that is seared in my memory to this day—in my father’s old anthropology textbook. It was a waxy, reconstructed display of an anatomically correct, ragamuffin group of Homo neanderthalensis people going about some daily routine at the yawning entrance to their cave, its walls dancing with the shadows from a roaring hearth inside. I don’t know if this quite qualifies me as a zoophile, since, after all, humans and Neanderthals share a fairly recent common ancestor (most likely Homo heidelbergensis, itself a descendant of Homo erectus), and given my geographic heritage and in the light of evolutionary genetic models of interbreeding, I could very well have some Neanderthal material floating about in my genome. But I must say, while their faces frankly left much to be desired, and personally I’ve never found prominent eyebrow ridges to be especially pretty, those other hominins did have some very fit and desirable bodies, to say the least. Comparing their ripped, muscular physiques to, say, many of those portrayed in the art history textbook, such as Peter Paul Rubens’ corpulent pastry-popping damsels, or even the most ancient goddess of fertility herself, the blubbery Venus of Willendorf (pictured above), it was the toned Neanderthals toward whom my amorous affections always returned.
And no wonder they were in such good shape. Although they were—as they are often characterized—intelligent, large-brained tool users who had mastered the use of fire and appear to have had at least a semblance of culture, Neanderthals were also advanced carnivorous predators whose diets consisted mainly of meat, fat and marrow and who devoted an extraordinary amount of their time to the relentless pursuit of large, wily game. Bone isotope analyses of Neanderthal remains reveal bodies highly specialized for an intensely active lifestyle in the arctic conditions of Ice Age Europe, with individuals consuming as many as 4,000 calories a day in their slaughter of meaty animals such as reindeer, bear, mammoth, bison, horses and wild cattle.
But, although they would put most modern-day bodybuilders to shame, as Wright State University researcher Anna Bellisari points out in her excellent review article on the history of fatness in a 2008 issue of Obesity Reviews, this massive dietary intake, and in particular an almost exclusive, physiologically pigeonholed reliance on flesh, eventually led to the Neanderthals’ downfall. “When the European climate became even colder during the last glacial maximum,” writes Bellisari:
… metabolic demands finally outstripped the ability to supply adequate energy for maintenance and reproduction, despite their use of caves, rock shelters and fire for protection and warmth. Large game animals, the primary food sources, had become more difficult to find. Neanderthals were forced to move to southern Europe and western Asia, their final habitats before they completely disappeared.
In fact, there is even evidence that the last remaining Neanderthals in France and Spain may have become so desperate for meat that some even resorted to survival cannibalism. Cave sites in these areas have yielded Neanderthal bones with cutmarks and percussion scars similar to those found on butchered animal remains; these bones were discarded haphazardly among the heaps of random animal bones in ancient abattoirs rather than buried deliberately as they were for other individuals.
The Neanderthals had a good run, all things considered. They survived on this planet for over 250,000 years. Considering that we’ve been around ourselves as a species for only about 195,000 years, you’ve got to respect the Neanderthals’ carnivorous tenacity. And for reasons that will become apparent very soon, our own evolved feeding habits may well mean that we will actually be an even shorter-lived species. In our case, we’re our own worst enemy; but for the Neanderthals, it was indirect competition for scarce food resources with our Cro-Magnon (anatomically modern) human ancestors that seems to have put the final nail in their coffin.
We now know that modern humans and Neanderthals coexisted for at least 10,000 years—and, again, evolutionary genetics data show that we occasionally even mated with them. See, it’s not just me being titillated by depictions of naked Neanderthals; for many of you prudish readers of a certain descent, your own ancestors were just as kinky as I: Those cold-weather apes may have had brooding Neanderthal faces, after all, but their bodies were simply divine. That said, it turns out that one reason interbreeding with Neanderthals was not even more prevalent is that our Homo sapiens predecessors actually seemed to prefer their reproductive partners to be a bit on the juicier side of the scale.
Earlier I mentioned the 22,000-year-old, adipose-laden Venus of Willendorf statuette, which, since its discovery in 1908 in a loess deposit near a small village in Lower Austria’s Danube Valley, has been interpreted by scholars as representing the Upper Palaeolithic exaggerated ideal of female beauty, with extreme fat deposits in her breasts, buttocks and hips, precisely the areas mobilized during pregnancy and lactation. And, as University of Oxford anthropologists Stanley Ulijaszek and Hayley Lofink discuss in their 2006 piece on the biocultural history of obesity in the Annual Review of Anthropology, the closest living behavioral analogues to our early Homo sapiens ancestors, contemporary foragers in traditional societies such as those in Nauru, Samoa and Malaysia, continue to view plumpness as attractive. “Various societies across the world,” the authors also note, “practice or have practiced ritual fattening to promote fertility, marriageability and embodied social status.” One can appreciate just how powerful such cultural factors are by imagining an anxious soon-to-be bride in, say, central New Jersey agreeing to be ceremonially leavened up like a fatted calf by her loving family members and adoring fiancé in preparation for the big event.
But the cultural abhorrence of fatness in modern industrial societies can also be understood in deeper evolutionary terms, since it may reflect an overcompensated aversion to a very recent adaptive problem facing a growing majority of our species. (It’s worth pointing out that there are some meaningful within-cultural differences in this area; overweight African-American women, for example, are less dissatisfied with their bodies, see themselves as healthier, more attractive generally, and as being more attractive to the opposite sex than do white women of similar weight and age.) Having some curves is one thing, amorphous obesity quite another. The cute Palaeolithic-era excess gathering coquettishly around the fertile hips of yesteryear has today morphed into a heaving, gigantic, mobility-cart-using problem signalling serious chronic disease. Obesity—which is the condition in which excess body fat has accumulated to such a degree that health and function are impaired, and which is operationally defined as having a body mass index (BMI) greater than 30—would have been virtually unheard of among our hunter-gatherer ancestors, just as it is among modern foragers. According to Bellisari in her 2008 Obesity Reviews article:
The diet of Palaeolithic foragers was probably the most nutrient-dense and healthful in all of human history. The ‘Palaeolithic Diet’ has been reconstructed and its nutritional impact evaluated by combining archaeological data with observations of the few remaining modern foraging peoples. Daily calorie consumption was high, an average of 3000 calories per day, and meat constituted a significant 35–50% of the diet, with wild plant foods making up the remainder. Wild game contains much less saturated fat and up to five times more healthful polyunsaturated fat than meat from domestic farm animals. The Palaeolithic combination of lean meat, wild nuts, fruits and vegetables was lower in carbohydrates and higher in protein and micronutrients, including cancer-preventing antioxidants, than the modern industrial diet. It also contained less sodium, more fiber, and more of virtually every vitamin and mineral — potassium, calcium, Vitamin B, Vitamin C, iron and folate. Sugar, salt and alcohol were unknown. Skeletal remains of Upper Palaeolithic populations indicate their tall stature and generally good skeletal and dental health.
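(As an aside, the operational definition of obesity mentioned earlier, a BMI greater than 30, is simple arithmetic: weight in kilograms divided by the square of height in metres. A minimal sketch, with the example weights and heights being purely hypothetical:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in metres, squared."""
    return weight_kg / height_m ** 2

def is_obese(weight_kg: float, height_m: float) -> bool:
    """The operational cutoff used in the text: BMI greater than 30."""
    return bmi(weight_kg, height_m) > 30

# Hypothetical examples: 95 kg at 1.75 m gives a BMI of about 31.0 (obese),
# while 70 kg at the same height gives about 22.9 (not obese).
```

A Palaeolithic forager, needless to say, had no use for such a calculation.)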
So Homo sapiens screwed the Neanderthals in more ways than one. We had a much more flexible and variable diet, and when we emigrated from Northern Africa to Europe and Asia around 100,000 years ago, we brought along with us an impressively evolved suite of anatomical, cognitive and social advantages that the Neanderthals—who’d been rigidly specialized to Ice Age conditions—just couldn’t compete with. Perhaps most important of all adaptations allowing us to edge out the last remaining Neanderthals, however, was our ancestors’ differential capacity to get fat—or at least, to store “energy reserves” in the form of adipose tissue that the body could draw from in times of scarcity and famine that came with often extreme seasonal fluctuations. Although the human foraging strategy enabled permanent settlement in nearly all of Earth’s ecological niches, food availability was massively unpredictable; it was literally a world of feast or famine, and so squirreling away energy reserves as fat deposits in our bodies supplied vital backup fuel during famine or incapacity (such as during pregnancy and the long period of dependency in young children), as well as the go-to energy source required for significant subsistence-related physical exertion.
According to Ulijaszek and Lofink, it is highly probable that natural selection operated on human phenotypes that were better able to store fat in this manner. That is to say, human fatness evolved, and it was in fact a remarkable evolutionary innovation. Compared to any other primate species, human beings have substantially greater levels of body fatness and reduced levels of muscle mass. When I was an undergraduate, I used to baby-sit a chimpanzee named Noelle, who at six months of age could easily lift herself up with one hand and had the biceps to show for it. That may sound impressive—and indeed it is in the physical fitness domain. Yet while Noelle’s chubby human counterparts lacked her muscle mass and could barely sit up at the same age, all that baby fat (enabled largely by their human mothers’ own fat lower bodies) was busy giving them a cognitive advantage by providing these human infants with all that stored energy for costly brain metabolism and simultaneously reducing their energy expenditure by rendering them largely immobile. In other words, human evolution underwent a sort of trade-off of brawn for brains; and genes for fatness, especially as they are expressed during the long dependent period of human infancy when babies are undergoing rapid, radical cognitive change, played a vital role in the origins of human intelligence. Of course other species, such as those raised in captivity or domestic pets (like my gluttonous cat, Tommy), can become obese when placed in environments in which food availability is not offset by energy expenditure demands. But, strictly in terms of body fatness relative to muscle mass, our own species easily wins the crown for being the fattest ape. In fact, biologist Alister Hardy incorporated this fact into his original—now mostly defunct—“aquatic ape” hypothesis: the subcutaneous layer of fat in human bodies facilitates buoyancy, whereas the muscle density of other apes leads them to sink like stones in the water.
If you’re an obese person and can trace your being significantly overweight to some combination of genetic factors—over 600 genes, markers and chromosomal regions have been associated with human obesity phenotypes in the Human Obesity Gene Map, and these heritable factors include everything from individual differences in metabolic rates to the tendency to engage in spontaneous physical activity to specific syndromes involving deficiencies of energy-regulating peptides—there’s a good chance that your genetic makeup would have given you a leg-up over your “naturally skinny” peers if only you’d been born about 10,000 years earlier. In fact, that is probably why these genetic contributions to your obesity exist today: they helped your adipose-pocketing ancestors survive during food shortages.
It was 10,000 years ago, during the transition to the Neolithic period and, more specifically, to an agricultural lifestyle and the subsequent abandonment of foraging, that the tide began to turn and these fat-storing genes, ironically, became deadly. In the beginning, the tide turned very, very slowly. Although obesity occurred in some wealthy, high-status individuals in ancient Greek, Byzantine and Greco-Roman societies and elsewhere, until the late 19th century, when the streamlined industrialization of food started to make basic subsistence relatively effortless for increasing numbers, farming was a laborious business with energy expenditure demands rivalling—and in some cases, exceeding—those of foraging. As Bellisari points out:
Plant cultivation and animal husbandry required rigorous and constant, year-round labor. Energy expenditures of early farmers are estimated to be as high as those of modern farmers in non-industrial societies. Crops sometimes failed and animals died prematurely or did not reproduce as expected. Pests and natural disasters destroyed stored food surpluses. Food shortages and starvation were not uncommon. Domestic animals transmitted bacteria and parasites to humans in densely populated farming villages and towns, causing epidemics of infectious diseases not experienced by nomadic foragers.
So even during these difficult early agricultural times, fat-storing phenotypes would have been evolutionarily advantageous, since food availability was still highly unstable and starvation a very real threat. Farming may have been the logical, convenient choice, given the growing impracticalities of foraging, the dwindling populations of wild game and human population growth with its increasingly large and complex societies, but the convenience of domesticating edible species came, initially, at a rather heavy cost. “Wherever farming replaced foraging,” writes Bellisari:
… there was a general decline in human health. Unequivocal signs of nutritional deficiencies, growth disturbances and increasing disease burdens are evident in the skeletal remains of farmers, from the earliest beginnings of agriculture to historic times. The most detailed studies of health decline come from early agricultural communities in North America, where skeletal lesions document iron deficiency anaemia and tubercular bone infections in populations that adopted maize as the primary food crop. Chronic protein-energy malnutrition, and a high prevalence of dental caries, abscesses and tooth loss were related to high-carbohydrate diets. Average adult stature was significantly lower, and life expectancy decreased relative to preceding foraging populations.
Although for different reasons than ever before, then, these harsh, early farming conditions were not exactly favorable for promoting obesity either. Most scholars believe that morbid obesity was relatively unheard of until we began industrializing the food industry and specialized production became privatized. In fact, Ulijaszek and Lofink argue that it was actually only about 60 years ago—after the dust settled from WWII and with the advent of slick advertising, cheap transport and prepackaged convenience foods—that those old, previously adaptive fat genotypes that evolved during the Palaeolithic era materialized into the crippling, plus-sized problem that we have today. This is such a tiny sliver of time in our species’ evolutionary history that it can hardly be expressed mathematically, but needless to say, it is not enough time for natural selection to counteract what was, for so long before, clearly adaptive. (That’s not to say that natural selection isn’t operating against obesity today; again, the growing negative cultural attitudes toward fatness may be helping to drive this selection alongside the actual detriments to genetic fitness directly caused by obesity.)
Today, by the sheer sweat of our forefathers’ brows—or rather, their healthily metabolized fat—we have achieved unprecedented food security for the couch potato masses. This was a hard-won achievement, certainly, and has allowed our species to divert its attention from a mere subsistence-based lifestyle to other creative pursuits. But today’s industrialized nations are “obesogenic environments,” a term coined by University of Auckland medical researcher Boyd Swinburn and his colleagues in 1999, and which refers to human environments in which the physical, economic, social and cultural atmosphere involves “energy intake in excess of expenditure.” In other words, obesogenic environments are those in which food consumption, in combination with the increasingly sedentary lifestyle accommodated by food security, leads to such an accumulation of body fat that it compromises our physical health.
It’s astounding really. Consider what you’ve had to eat so far today. Don’t leave anything out. Now picture yourself trying to explain the nature of that food to an exhausted Palaeolithic forager straining to understand—while sitting cross-legged on the floor of some transposed temporal dimension in your living room and, yes, you’ve mastered some obscure Palaeolithic tongue, play along now—the concept of, say, a potato chip. To name all of the delicious sundries we ingested today, all the concocted chemical supernovas happening in our slobbering mouths, is simply obscene. How fortunate we are to live in a time of such abundance. And how unfortunate, too.
–he says while licking the chocolate from his fingers and finishing off his can of Red Bull.
Image ©Matthias Kabel