I've always found gyms a bit strange.
Think about it: Dozens of people sweating in close proximity, running on conveyor belts going nowhere, lifting and dropping heavy objects for no reason. There's a guy grunting as he flings a barbell to the ground, a woman repeatedly leaping on and off a stack of boxes, and me, casually climbing a stair-master, musing over the irony of having driven here and taken the elevator to the third floor.
Gyms, of course, exist for a perfectly good reason. They’re controlled environments—safe, enclosed and relatively clean—where we can stretch, strain and stress our bodies in the name of physical fitness. Gyms exist because the vast majority of us are no longer active enough in our daily personal and professional lives, and we need an artificial setting to simulate the exercise our bodies got naturally for hundreds of thousands of years.
How and why our bodies are poorly suited to modern environments—and the adverse health consequences that result—is a subject of increasing study. A new book, The Story of the Human Body, by Daniel Lieberman, chair of the Department of Human Evolutionary Biology at Harvard, chronicles the major biological and cultural transitions that, over the course of millions of years, transformed apes living and mating in the African forests into modern humans browsing Facebook and eating Big Macs across the planet.
“The end product of all that evolution,” he writes, “is that we are big-brained, moderately fat bipeds who reproduce relatively rapidly but take a long time to mature.”
But over the last several hundred generations, culture—a set of knowledge, values and behaviors—not natural selection, has been the more powerful force determining how we live, eat and interact. For most of our evolutionary history, we were hunter-gatherers who lived at very low population densities, moved frequently and walked up to 10 miles a day in search of food and water. Our bodies evolved primarily for, and in, a hunter-gatherer lifestyle.
This began to change about 12,000 years ago when humans started domesticating plants and animals. The Agricultural Revolution allowed humans to thrive in new ways, providing food surpluses, fostering population growth and promoting labor diversification. But farming also unleashed a variety of previously rare infectious diseases because of high population densities, close contact with livestock and inadequate removal of waste. Epidemics did not exist before the advent of agriculture because, as Lieberman explains, “hunter-gatherer population densities are below one person per square kilometer, which is beneath the threshold necessary for virulent diseases to spread.”
Cultural changes that created competing benefits and challenges continued for thousands of years, but none altered our behavior and environment more profoundly than the Industrial Revolution in which humans harnessed the power of fossil fuels and machines on a large scale to produce and transport goods. The Industrial Revolution profoundly altered the way we eat, work, and communicate, and allowed for massive population growth at far greater densities. Because of it, more people today have better health and higher standards of living than ever before.
But again, cultural change comes at a cost. Many important inventions of the 20th century were labor-saving machines like automobiles, airplanes, vacuum cleaners, dishwashers and elevators. They have made life more convenient and more pleasant, but perhaps too convenient and pleasant. While in the past, humans trekked many miles each day, today the average American walks less than half a mile and travels over 30 miles by car. Whereas before, eating meant hunting, gathering or growing, today most of us eat food that is produced and processed thousands of miles away.
The net effect of all these changes has been a reduction in the energy we expend, an expansion of the energy we consume, and a shift in the types of foods we eat—all of which are problematic from an evolutionary perspective. The results are not surprising: the last few decades have seen an explosion of diet and lifestyle diseases like Type 2 diabetes, heart disease and hypertension.
The Agricultural and Industrial Revolutions enabled two major classes of disease resulting from mismatches between the environments for which our bodies evolved and the environments in which they find themselves now: infectious disease and lifestyle disease. Infectious diseases—while still important contributors to morbidity and mortality—have declined substantially over the past century. The first 80 years of the 20th century saw a 20-fold decrease in infectious disease mortality. In 1900, pneumonia, tuberculosis and diarrhea were the three leading causes of death in the U.S., accounting for nearly one-third of all deaths. Today, heart disease and cancer contribute to half of all deaths, with stroke and diabetes close behind. Even in developing countries, the noncommunicable disease burden is now double that of infectious disease.
Antibiotics and vaccines played a vital role in the decline in infectious disease mortality, but just as important were improvements in our home and food environments: improved sanitation and waste-disposal, safer drinking water, better working conditions and stronger pest control. We have built, cleaned and inspected our way to better health.
But if the burden of infectious disease has declined, the burden of chronic disease seems to be growing. The prevalence of diabetes has tripled over the past 30 years, and now costs our health system $245 billion per year. Four times as many adolescents are overweight today as were in the 1960s. In 2000, 125 million Americans, or 45 percent of the population, had one or more chronic conditions, and this number is expected to increase steadily in the coming decades. Life expectancy in the U.S. has grown by an impressive 30 years since 1900, but for every 10 months of healthy life added since 1990, we’ve also added 2 months of disease. We are living longer, yes, but not always living healthier. The question then becomes, does a longer life invariably mean more chronic disease?
Intuitively, it may appear so. We have more years for fat to deposit and cancer-causing mutations to accumulate. But the answer doesn’t have to be an unqualified yes. We’ve created environments at odds with our biology, and meaningfully addressing chronic disease will require fundamental changes in those environments, just as effectively addressing infectious disease did.
But so far we’ve been moving in the wrong direction. Since 1970, consumption of added fat has increased by about 60 percent and consumption of high-fructose corn syrup has increased 1000 percent. Between 1985 and 2000, the price of fresh fruit and vegetables increased more than 2.5 times as much as the price of sugar and sweets. One study found that if agricultural subsidies in the U.S. had gone directly to taxpayers, they would allow each of us to buy only half an apple each year, but about 20 Twinkies.
If our food environment affects how we eat, our built environment—the buildings, streets and open spaces in our community—affects how we move. Studies have found that a number of neighborhood characteristics, such as having more destinations safely accessible by foot or bike and greater access to community gardens, can increase physical activity and promote healthier eating among residents.
Perhaps the most important consequence of designing healthier communities lies in the long-term effects for children and adolescents—who some public health officials have warned may be the first generation of Americans with shorter life expectancies than their parents, largely because of obesity. A growing body of evidence suggests that increasing the ability of children to walk to school, expanding access to safe outdoor areas, and restricting access to fast-food restaurants may be useful for combating youth obesity. For example, children living closer to parks, and with more park area in their neighborhoods, have higher levels of physical activity.
For better population health, moving forward may mean looking back. We must recognize that we don’t just pass on genes; we pass on environments. The greatest advances in 20th century life expectancy came from public health and environmental changes—mostly outside the walls of a hospital, and often outside the domain of medicine altogether. The greatest strides in reducing illness in the 21st century—allowing people to live longer, disease and disability-free lives—will require the same.