September 18, 2013
Two of the most monumental developments in the history of human civilization, likely the innovations that have saved more human lives than any other, are soap and sanitation. When large numbers of people congregate in a single location for prolonged periods of time, excrement and waste quickly rise to unimaginable levels and are capable of spreading disease incredibly quickly. As I mentioned in my first post here at Food Matters, many pathogens utilize the fecal-oral transmission route, in which poop from an infected individual makes its way into the water supply or onto food by serial contact (touching a contaminated surface then touching food). Lack of hygiene dramatically increases the likelihood of this sort of infection, as many infectious microbes can grow unchecked on filth outside the body, and many viruses can linger on unwashed surfaces for long periods of time.
Thankfully, in modern developed countries, we don’t have to worry about this much. Most of us live in places with sanitation infrastructure that carries away our waste and delivers fresh water, and we tend to have bottles of life-saving soap in every room with a sink. In the absence of soap and sinks, waterless hand sanitizers, sanitizing wipes and bleach-containing cleaning products promise to keep away the dreaded germs. Yet while there’s no doubt that sanitation and hygiene are critical in reducing the spread of infectious disease, it’s possible that we’ve gone too far in trying to live a sterile life.
In 1989, British physician David Strachan proposed the “Hygiene Hypothesis,” which sought to explain a puzzling series of observations: Children in cities in developed countries, and those with fewer siblings (children who lived more sanitary lives and presumably had less exposure to infectious diseases), were more likely to develop allergies, asthma and other atopic diseases than those that lived on farms or in developing countries, or that had many siblings. In the nearly 25 years since this was first proposed, a great deal of research has shown that exposure to diverse bacteria or even parasitic worms helps to train and regulate the immune system, preventing it from becoming over-active.
Many of the ideas put forward to explain how microbes might regulate the immune system are, in my opinion, problematic. I’ll spare you the details for now (if you ask in the comments, I may just have the motivation to write it up!), but basically, they boil down to suggestions that turning on certain types of immune responses takes energy away from other types of immune responses that might cause disease*. But there’s also evidence that, even in the absence of a full-blown immune response, exposure to different populations of bacteria can have a significant impact on the way that our immune system responds to other threats.
Take, for instance, the relationship between the denizens of our intestines and diabetes. I’m not talking about the science showing that these gut microbes can affect obesity (though that’s pretty amazing), I’m talking about the fact that having different populations of bugs can influence your likelihood of developing diabetes, regardless of weight. In 2008, a group of researchers led by Alexander Chervonsky at the University of Chicago showed that, in a mouse model of diabetes, mice lacking gut microbes had more severe disease, while those with gut microbes were protected. Three years later, a lab here at Harvard reported that the presence of a single type of bacterium called segmented filamentous bacteria (SFB) in the gut was sufficient to provide protection (unfortunately, consuming SFB is not really a viable prevention strategy, since many other autoimmune disorders are more prevalent in animals with SFB in their guts). Other research has shown that diabetic humans have very different microbial populations in their guts, though this is probably an effect of altered metabolism rather than the cause.
All of which brings me to my dinner last Friday night. My fiancé and I receive a CSA farm share every week, and this week we got bok choy that had been pulled from the ground in central Massachusetts the same day. When shopping at a supermarket, it’s easy to forget that our vegetables are grown in the dirt – our obsession with cleanliness and sanitation has seeped into our food, and any produce that’s not squeaky clean is discarded or ignored by consumers. Again, it’s important to remember that this concern for cleanliness is not without merit – we know what happens when contaminated food gets into the retail pipeline – but though many microbes live in the earth, soil is not the source of most infectious disease.
A little rinse with water (clean, sanitized water from a municipal water system) was enough to remove most of the dirt on the bok choy, but what about the microbes that hitched a ride? Many environmental bacteria form biofilms that can prevent easy removal, and a cursory hand-scrub isn’t likely to do the trick. Throughout our evolutionary history, humans have consumed microbes from the environment, and it’s clear that this exposure shapes the populations of microbes in our guts. An experimental link between microbes consumed in the diet and specific health conditions has not been shown, but it’s quite plausible that at least some of the observations linked to the hygiene hypothesis aren’t just due to passive microbial encounters, but because of what we put in our mouths.
As for dinner, it was delicious: spaghetti with bok choy, poached egg and romano cheese. Note: the recipe I linked to does not include the dirt.
*The short version of my objection to these ideas is that bacteria elicit one type of immune response that, when over-active, causes some types of diseases (like multiple sclerosis, type-I diabetes etc), while parasitic worms activate a different type of response that, when over-active, causes other problems (like allergies and asthma), yet exposure to both bacteria and worms seems to correlate with decreased prevalence of all of these disorders. Clearly, we still have a lot to learn.