Bering in Mind

A research psychologist's curious look at human behavior

This Queasy Love: How Having Frequent Diarrhea as a Child Shapes Your Adult Mate Choice

The views expressed are those of the author and are not necessarily those of Scientific American.





It’s not every day that love and diarrhea come together in theoretical matrimony. Recently, however, a study by an interdisciplinary team of scientists managed to form this near-perfect union. Oh, relax. I’m not about to share the sordid details of a revolting new sexual fetish. (That’s for another post.) The research I’m about to describe is actually bigger than that. In fact, it’s a pity that the report appeared without much fanfare last year in Evolution and Human Behavior, because when looked at in the right light, there’s a certain quiet beauty to these data in the way they subtly illuminate gene-environment interactions.

Before we can hope to understand the curious connection between the hellish expunging of our intestinal contents and the type of person we’re in turn most likely to marry—or simply to screw—it’s necessary first to step back and look at the wider framework on which the study’s main hypothesis was based. Unless you reject evolutionary psychology out of ignorance or spite (or some combination thereof), it’s noncontroversial to say that, all else being equal, it’s biologically more adaptive to reproduce with a healthy partner than with an unhealthy one. Needless to say, “biologically adaptive” is rarely isomorphic with “nice and kind,” but kind or not, most of us don’t instinctively gravitate toward strangers with a death rattle for a cough or get turned on by the sight of someone with random body parts sloughing off. Beyond such truisms, having children with a chronically ill person carries two further costs: it is difficult or impossible for that poor individual to “invest” in your mutual offspring, leaving you to shoulder the “costs” of rearing them on your own, and the health of your kids (who, let’s not mince words, are the cooing keepers of your eternal genetic promise) may be compromised if your partner’s disease is heritable.

While it’s all well and good to aim for a healthy baby-mama or baby-daddy, one hitch is that, in the real world, we can’t always count on a conspicuous cue like emaciation or puddles of pus to tip us off about a prospect’s dubious hardiness. Even if that guy giving you the eye looks fit enough now, who’s to say he won’t be the first to drop when the next scourge rolls in? Our ancestors would have faced similar challenges in discerning the relative health of viable members of the opposite sex.

The solution to this adaptive problem was mindlessly ingenious. Among other fixes, our brains became aesthetically predisposed to faces that were the best genetic gambles in a world brimming with deleterious pathogens. In aggregate data, these appealing faces tend to belong to folks who are more disease-resistant across the lifespan. Lucky bastards. Now, how to spot these anti-pathogenic lovers? That’s to say, what’s a “pretty face” exactly, and why do we often disagree on others’ attractiveness? Yes, there’s that old facial-symmetry giveaway, but debate rages on about the relationship between this variable trait and perceptions of beauty. If you really want to stack the fitness odds in your offspring’s favor, there seems to be a more reliable marker of a person’s health than mere symmetry: having an extremely sex-typical mug. An already impressive, yet still-mounting, body of evidence reveals that the degree of “masculinity” in a man’s face correlates positively with his lifelong health, while the degree of “femininity” in a woman’s face does the same for hers.

“That’s nice,” you’re probably saying to yourself. “But where does diarrhea come into it?” I thought you’d never ask. The authors of this new study, led by the psychologist Mícheál de Barra of Stockholm University, suspected that our behavioral immune system, which is a hypothetical collection of evolved cognitive biases that spur us on to make adaptive decisions in the domain of disease avoidance, may get a sort of laxative fine-tuning during our early child development. There are clearly individual differences in the sensitivity and functioning of this system. After all, some people are less disgusted than others by overt signs of illness, even those insalubrious emblems of the contagious, leaking, oozing variety. And some of us are also less fazed than others when it comes to our sex partners’ attractiveness, or lack thereof. Maybe, de Barra’s group reasoned, some of this within-group variation—not all, but some—is the result of a calibrating process that takes place after we’re born but before adulthood.

The basic idea is easy enough to follow: recurring childhood experiences with illness (and here, we’re talking diarrhea) stamp an indelible warning upon a developing human brain: You’re lucky to get out of your childhood alive in this disease-ridden place with those flimsy genes of yours; your offspring may not be so fortunate. Actually, it’s not so much a stamp as it is a neurocognitive tweak, one that equips you with a cognitive bias uniquely tailored to the deleterious context in which you live—and into which your children will probably also be born. This would have been especially true, the researchers argue, under conditions in which rates of communicable diseases (or “pathogen stress” in the official jargon) remained fairly stable across time. Predictable cycles of viral infection go hand in hand with cultural and geographical features that endure over many generations in circumscribed human populations, involving factors such as food-preparation practices, water source, religious rituals, outsider contacts, weather patterns, hygiene standards, and so on.

One way to improve the survival odds for your future offspring, given your own touch-and-go childhood not so long ago in a very similar environment, is to set your sights on a reproductive partner who carries as healthy a complement of pathogen-resistant genes as possible. The hypothesis for de Barra and his colleagues’ study was therefore pretty clear-cut: adults who were sickly in childhood grow up to prefer opposite-sex faces displaying exaggerated sex-typical features: testosterone-forged faces in men, that’s to say, and ultra-feminine faces in women. Putting this prediction to the test presents a challenge, however. Unless it was a particularly violent episode, most of us can’t remember if we had diarrhea last month, let alone when we were pre-schoolers (although, to this day, I can recite several stanzas from the infamous “Diarrhea Song” that I must have memorized around that time: “When you’re climbin’ up a ladder and you feel somethin’ splatter, diarrhea! diarrhea!”).

Fortunately, the authors had direct access to the Matlab Health and Demographic Surveillance System Database (or HDSS), “one of the most detailed high-quality longitudinal databases from the developing world.” Originally launched in 1963 as part of a cholera vaccine trial in rural Bangladesh, this ongoing and expanded health project includes a wealth of childhood data for adults still living in the area. “Community health workers [would] make fortnightly or monthly visits to every household,” explain the authors. The workers gathered continual data on everything from breastfeeding regimens to vaccination schedules to… you guessed it… diarrhea.

From January 1990 to July 1994, all mothers in the sample with children younger than five were asked if the child had had diarrhea in the past two weeks (“three or more loose stools per 24 h with or without mucus or blood”). Given that diarrhea is a leading cause of infant mortality in this place where, in 1990, newborns stood a 1-in-9 chance of dying before their fifth birthdays, it was quite an empirical boon for the authors to secure such strong data.

The researchers tracked down a random sample of 150 women and 90 men from the HDSS database who’d been born in the aforementioned four-year window, then showed each participant 15 photographic pairs of adult male faces and 15 pairs of adult female faces (all images were of Bangladeshis). For each same-sex pair, the participant was simply asked which of the two faces they found more attractive. Here’s where it gets interesting. Each comparison pitted two versions of the same model against each other: one image in the pair depicted a digitally rendered “feminized” version of the model’s face and the other an artificially “masculinized” version.

With possible confounds ruled out (SES, education, hygiene, and age), the women in this Bangladeshi sample preferred the more feminine versions of both male and female faces. Men, on the other hand, liked their male faces masculine, thank you very much, but found masculine and feminine female faces equally appealing (or equally unappealing, perhaps). Throw some diarrhea into the analysis, however, and the picture clearly changes. Compared with adult males who hadn’t had the runs so often and so early in life, men with a childhood history of diarrhea showed a distinctive preference for highly feminized female faces; by contrast, they betrayed no bias at all when asked to evaluate the faces of other men. Since you can’t inseminate another male—believe me, I’ve tried—that makes sense. The inverted pattern appeared for women with a childhood history of diarrhea: unlike their peers, who preferred male faces with androgynous features, women who’d spent a worrisome amount of time on the toilet as little girls now liked their men, or at least their men’s faces, as manly as possible. Like the men, they showed no particular bias when it came to what makes a woman’s face attractive.

The effect was even more pronounced for those men and women with a childhood history of diarrhea who also had a recent adult history of illness, suggesting that once carved into place in childhood, this bias for exaggerated sex-typical faces retains some degree of flexibility. A weapon of the behavioral immune system, it stays vigilant to mildly seesawing levels of disease threat.

The authors do caution against rushing to conclusions about the role of childhood illnesses in shaping adult mate choice, and certainly much work remains to be done in this area. It’s an intriguing set of findings, though, and personally I’ve a hunch that further studies will add support to these new data by de Barra and his colleagues.

Meanwhile, a word of wisdom—a little sonnet, really—that one of my elders once shared with me long, long ago: “When you’re running from a storm and you feel something warm…”

Write that one down. You never know.

Jesse Bering About the Author: Jesse Bering is Associate Professor of Science Communication at the University of Otago in New Zealand. He is the author of The Belief Instinct (2011), Why Is the Penis Shaped Like That? (2012) and Perv (2013). To learn more about Jesse's work, visit www.jessebering.com or add him on Facebook (https://www.facebook.com/jesse.bering). Follow on Twitter @JesseBering.







Comments
  1. CherryBombSim 7:12 pm 08/18/2014

    More adaptationism run amuck. The paper presents evidence that childhood illness affects mate preference. OK, I can accept that this might be true. Does it present any evidence that this change in behavior is adaptive? Any evidence at all? No. What we get is speculation and hand-waving. I’d say the authors failed to show that their hypothesis was true, and by a wide margin.

