Guest Blog

Commentary invited by editors of Scientific American

Behavioral Telescope Shows How Cooperation Works





Nature’s games aren’t all “red in tooth and claw” competitions. Evolution can create stable cooperation. “Behavioral telescopes” provided by a new kind of logic have revealed laws of team productivity that are almost mathematical moral truths.

Game theory is to the behavioral universe what the telescope was to Galileo, or calculus to Newton—a powerful new tool for probing previously unsolvable problems. Modeling the agency and behavioral contingency that are key in biology, but don’t exist in physics, required new logic. Using these “behavioral telescopes” to scan for patterns in distant effects, scientists are discovering an evolutionary ethics that increases social productivity.

Game theory’s most studied scenario is the Prisoner’s Dilemma, which, owing to its origin in Cold War nuclear strategy, assumes everyone is untrustworthy. Its two players can’t communicate but must each choose to either cooperate or defect. If both cooperate, each gets the Reward payoff; if only one defects, he gets the Temptation payoff while the cooperator gets the Sucker payoff; and if both defect, each gets the Punishment payoff. In strong versions the payoffs are ranked: Temptation > Reward > Punishment > Sucker. Conventional thinking says that since the other player is “rational” and can’t be trusted, he will defect, so it’s “rational” for you to defect too. This “logic” guarantees poor outcomes. But evolution has a smarter solution.
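That payoff structure can be sketched in a few lines of Python. The specific numbers below are illustrative choices, not from the article; any values ranked Temptation > Reward > Punishment > Sucker produce the same dilemma:

```python
# A minimal sketch of the one-shot Prisoner's Dilemma payoffs described above.
T, R, P, S = 5, 3, 1, 0  # Temptation, Reward, Punishment, Sucker (illustrative)

# payoffs[(my_move, their_move)] -> my payoff; 'C' = cooperate, 'D' = defect
payoffs = {
    ('C', 'C'): R, ('C', 'D'): S,
    ('D', 'C'): T, ('D', 'D'): P,
}

# Whatever the other player does, you score more by defecting...
assert payoffs[('D', 'C')] > payoffs[('C', 'C')]  # Temptation > Reward
assert payoffs[('D', 'D')] > payoffs[('C', 'D')]  # Punishment > Sucker
# ...yet mutual defection leaves both worse off than mutual cooperation.
assert payoffs[('C', 'C')] > payoffs[('D', 'D')]  # Reward > Punishment
```

The assertions make the trap explicit: defection is each player’s best reply to anything the other does, yet the “rational” pair ends up with the third-best outcome instead of the second-best.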

Computer contests of iterated Prisoner’s Dilemmas show, as Sam Harris notes, “that evolution probably selected for two stable orientations towards human cooperation: tit-for-tat…and permanent defection.” Permanent defection yields the low payoffs of “red in tooth and claw” zero-cooperation. In tit-for-tat, a player cooperates initially, then mimics the other player’s last move. Thus tit-for-tat players retaliate if defected against but forgive a defector that starts cooperating. Once established in a population, tit-for-tat provides higher productivity for all and can become an ‘evolutionarily stable strategy,’ which means it can’t be beaten by other approaches. Even with short-term incentives for defection, cooperation can thrive.
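A minimal sketch of such an iterated contest, pitting those two strategies against each other (payoff values and the 100-round match length are illustrative assumptions, not from the article):

```python
T, R, P, S = 5, 3, 1, 0  # Temptation > Reward > Punishment > Sucker
PAYOFF = {('C', 'C'): (R, R), ('C', 'D'): (S, T),
          ('D', 'C'): (T, S), ('D', 'D'): (P, P)}

def tit_for_tat(opponent_moves):
    """Cooperate first, then copy the opponent's previous move."""
    return 'C' if not opponent_moves else opponent_moves[-1]

def always_defect(opponent_moves):
    """Permanent defection, regardless of history."""
    return 'D'

def play(strategy_a, strategy_b, rounds=100):
    """Return each player's total payoff over an iterated match."""
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        # Each strategy sees only the other player's past moves.
        a, b = strategy_a(moves_b), strategy_b(moves_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b
```

With these numbers, two tit-for-tat players earn 300 points each over 100 rounds, two permanent defectors only 100 each; head-to-head, tit-for-tat concedes only the opening round (99 vs. 104), which is why it prospers once it is established in a population.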

It wasn’t humanly possible to tackle such game-theory problems until computers enabled large-scale simulations. But evolution has been doing precisely that for hundreds of millions of years, running trials of what Darwin called “endless forms most beautiful” and similarly endless varieties of behavioral strategies, and naturally selecting the more productive adaptations. Species that evolved tit-for-tat behaviors could overcome the productivity ceiling of short-sighted selfish competition.
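That selection process can be caricatured with a toy replicator model: each generation, strategies reproduce in proportion to their average payoff against the current population mix. All the numbers here (payoffs, 50-round matches, a 5% starting share) are illustrative assumptions:

```python
T, R, P, S = 5, 3, 1, 0
PAYOFF = {('C', 'C'): (R, R), ('C', 'D'): (S, T),
          ('D', 'C'): (T, S), ('D', 'D'): (P, P)}

def tft(opponent_moves):
    """Tit-for-tat: cooperate first, then copy the opponent's last move."""
    return 'C' if not opponent_moves else opponent_moves[-1]

def alld(opponent_moves):
    """Permanent defection."""
    return 'D'

def avg_payoff(me, opponent, rounds=50):
    """Average per-round payoff `me` earns in one iterated match."""
    my_moves, opp_moves, total = [], [], 0
    for _ in range(rounds):
        m, o = me(opp_moves), opponent(my_moves)
        total += PAYOFF[(m, o)][0]
        my_moves.append(m)
        opp_moves.append(o)
    return total / rounds

x = 0.05  # share of tit-for-tat players; everyone else defects permanently
for generation in range(60):
    # Expected fitness of each strategy against the current population mix.
    fit_tft = x * avg_payoff(tft, tft) + (1 - x) * avg_payoff(tft, alld)
    fit_alld = x * avg_payoff(alld, tft) + (1 - x) * avg_payoff(alld, alld)
    mean_fitness = x * fit_tft + (1 - x) * fit_alld
    x = x * fit_tft / mean_fitness  # reproduce in proportion to payoff
# Even from a 5% foothold, tit-for-tat comes to dominate the population.
```

With these particular numbers, a starting share below roughly one percent shrinks instead of growing, which illustrates the point above: tit-for-tat must first become established before it sweeps the population.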

Key games in our nature, such as group hunting, have greater incentives for cooperation and disincentives for defection than Prisoner’s Dilemmas do. Christopher Boehm has shown that hunter-gatherer cultures have punished uncooperative behaviors for perhaps 10,000 generations.

As Steven Pinker notes, “The emotions of sympathy, gratitude, guilt, and anger allow people to benefit from cooperation without being exploited by liars and cheats.” Our highly interdependent social species has evolved mechanisms to create trustworthy team behaviors and to punish team disruptors. Any science that can’t see this moral math isn’t seeing humans clearly.

Illustration by Julia Suits, The New Yorker Cartoonist & author of The Extraordinary Catalog of Peculiar Inventions.

Previously in this series:

It Is in Our Nature to Be Self-Deficient
Inheriting Second Natures
Our Ruly Nature
It Is in Our Nature to Need Stories
Tools Are in Our Nature
We Fit Nature To Us: Evolution’s two-way street
Justice Is In Our Nature

Jag Bhalla About the Author: Jag Bhalla is an entrepreneur and writer. His current project is Errors We Live By, a series of short exoteric essays exposing errors in the big ideas running our lives, details at www.errorsweliveby.com. His last book was I'm Not Hanging Noodles On Your Ears, a surreptitious science gift book from National Geographic Books, details at www.hangingnoodles.com. It explains his Twitter handle, @hangingnoodles.

The views expressed are those of the author and are not necessarily those of Scientific American.

10 Comments

  1. JPGumby 3:14 pm 06/5/2013

    I would have to object to the insertion of the term “morality” in this discussion. The science says nothing about morality. All it says is that for species that live in packs or bands, there’s both an (obvious) advantage to being able to enforce behaviors that benefit the group, and there’s at least one behavioral mechanism that is sufficiently simple to have reasonably evolved.

    The same analysis leads to an advantage to purely competitive behavior with those outside the pack or band, which is also generally observed.

    Unless you first accept the concept of basing your moral code on the utility to the larger group of a particular set of behaviors, this doesn’t say anything about morality. And it could undermine an argument in favor of altruistic morality, since it isn’t always best for the group, and certainly not for any individual member of the group.

  2. lawman108 5:19 pm 06/5/2013

    Is this all it takes to get to write for Scientific American? First, the worst payoff in prisoner’s dilemma is to be the sucker, not if both defect. Second, the prisoner’s dilemma is a one time game. The fact that multiple plays of the game have a different rational strategy in no way changes the logic of the original game. The multiple play game is a different game. This article is superficial and adds no insight.

  3. Mark Sloan 5:44 pm 06/5/2013

    JP, you might read the post again. It is about what moral behaviors objectively ‘are’. As Steven Pinker says, they are strategies to “allow people to benefit from cooperation without being exploited by liars and cheats”. The post is silent about what moral behaviors ‘ought’ to be, moral philosophy’s normal subject.

    Science is fully within its magisteria when describing what moral behaviors ‘are’ as biological and cultural adaptations, how they came to be, and how we can use that knowledge to define moral codes that will better accomplish common human goals such as increased well-being.

    What is your basis for arguing that past and present enforced moral codes and the biology underlying “sympathy, gratitude, guilt, and anger” are somehow off-limits for scientific inquiry? Or perhaps you think such descriptive knowledge somehow cannot be useful for figuring out which moral codes are most likely to achieve common human goals such as increased well-being?

    Sure, 1) our ‘moral’ biology and cultural moral codes motivate and advocate strategies for increasing cooperation within groups and 2) these biological and cultural adaptations are consistent with goals achieved by ruthless competition with and exploitation of out-groups. So what? As I have already said, science is, of logical necessity, silent about what moral behavior ought to be.

    Bizarrely, my experience is that it is not scientists who commit the well-known is/ought fallacy (which you seem to be guilty of in your response) but commonly moral philosophers who perversely do so in attempts to discredit the science of morality.

  4. lynnoc 6:17 pm 06/5/2013

    Cooperation and altruism (and sure, add altruistic morality) are wired into our human nature, probably resting on the building blocks of our empathy system and possibly connected to mirror neurons (although there is debate about them). Empathy-based guilt (such as survivor guilt, feeling like you’ve done something wrong when you do better than a friend or family member, or like you’ve hurt a friend when you’re having fun at a party and you see him/her off in a corner looking miserable) is universal, with the exception of psychopaths who have a brain dysfunction. We hypothesize that this common emotion has been put into place (or hammered into place to quote evolutionary biologists, David Sloan Wilson and E. O. Wilson) through the mechanism of competition between groups, i.e., through group selection, meaning selection at the level of the group. Groups with more cooperators (and no doubt more empathy- and guilt- prone people) outcompete groups with fewer cooperators/altruists.

    However, while altruism promotes fitness at the level of the group, it may not be so advantageous in all situations at the level of within-group competition, i.e., individual fitness, natural selection. Someone who is high in survivor-guilt proneness may be inhibited in multiple ways, for fear that his/her success will harm another, like a family member, friend etc. So in within-group competition, high levels of empathic guilt proneness may not promote individual fitness. It appears that –at least in human evolution– group selection plays a significant role and in some cases clearly overrides fitness at the level of the individual. We are studying altruism and cooperation etc. with people in life, not just within the context of games.

    Lynn O’Connor
    http://www.eparg.org

  6. hangingnoodles 6:29 pm 06/5/2013

    @lawman108 – You are correct, I made a mistake in describing the Prisoner’s Dilemma. The post has been corrected. I hope others who are not already knowledgeable in game theory can benefit from reading the short post. Game theory’s results are not widely enough known.

  7. hangingnoodles 6:34 pm 06/5/2013

    @JPGumby @MarkSloan – I used the word moral to intentionally provoke some thought about what types of rules could be evolutionarily adaptive. As I note in the other posts in this series, strong feelings about behavior that is considered worthy of punishment is universal in humans. Particular social rules considered “moral” or that provoke “moral” emotions vary. But I’d argue it’s been 10,000 generations since our ancestors could survive without them.

  8. Mark Sloan 4:07 am 06/6/2013

    @ hangingnoodles Yes, “strong feelings about behavior that is considered worthy of punishment is universal in humans”.

    I would add the universal existence of “behavior that is considered worthy of punishment” necessarily defines what is culturally moral because, as Herb Gintis shows, punishment of exploiters is necessary for altruistic cooperation strategies (and cooperative human societies) to be successful. I am delighted to see this and Pinker’s phrase that moral behaviors are strategies to “allow people to benefit from cooperation without being exploited by liars and cheats” on the Scientific American site, even if it is only in a blog posting and its comments.

    But I am puzzled by your intended meaning of “it’s been 10,000 generations since our ancestors could survive without them”.

    Morality was unhitched forever from being only about reproductive fitness by the emergence of culture. Culture enabled encoding altruistic cooperation strategies (such as versions of the Golden Rule, the core of indirect reciprocity) to be selected for by any “benefits of cooperation in groups” that people found attractive, not just reproductive fitness.

    Consistent with Pinker’s quote about our ‘moral’ biology, if you want to go big picture on what morality ‘is’, I suggest moral behaviors are strategies to overcome the cross-species universal dilemma of how to obtain the benefits of cooperation in groups without being exploited. That is, ‘morality’ as an evolutionary adaptation is substrate independent and solves the same cross-species universal dilemma whether encoded in biology or cultural norms.

  9. JPGumby 10:21 am 06/6/2013

    @hangingnoodles – Thanks for your response.

    My disagreement is really semantic, in that the common meaning of “moral” is different than the use here. I agree with the argument that natural selection in social species would favor wiring for certain behaviors and attitudes. This wiring would, in humans, then likely influence the development of moral codes.

    However, that says nothing about whether a particular behavior or attitude is “moral”, at least in the standard sense of the word.

  10. Mark Sloan 11:00 am 06/6/2013

    BK, societies have a strong, practical interest in what moral codes ought to be advocated and enforced in order to achieve common goals such as increased well-being.

    For example: When is it immoral to follow “Do to others as you would have them do to you”? What moral codes are most likely to increase well-being in hostile environments, such as failed states? What ought punishments be for immoral behavior and who ought to administer them?

    These kinds of questions about what moral codes instrumentally ought to be may be far better answered by game theory and the science of morality than by mainstream moral philosophy. So the science of morality may have quite a lot to say about whether a behavior or attitude is ‘moral’.

    Thus, the science of morality does deal directly with morality in the standard sense of the word, what cultural moral codes ought to be.

