Critical views of science in the news

Why B. F. Skinner, Like Freud, Still Isn't Dead


Behaviorism is back! That's what David Freedman proclaims in the June Atlantic cover story, "The End of Temptation: How the creepy science of behavior modification is reshaping our desires." The article is, on one level, a hyperbolic report on apps that are "transforming us into thinner, richer, all-around better versions of ourselves" by helping people (including Freedman's brother) overcome overeating, smoking and other bad habits. Freedman inflates this pop-culture mini-trend into a grandiose claim that B. F. Skinner, "psychology's most misunderstood visionary," who popularized behaviorism more than a half century ago, "may finally get his due."

Giving Skinner credit for apps like "Lose It" and "Habit Breaker"—which I predict will turn out to be as effective, or ineffective, as other self-improvement programs—is a stretch. Freedman's article is nonetheless a wonderful illustration of a thesis I advanced 16 years ago in "Why Freud Isn't Dead." My conceit was this: Ever since Freud invented psychoanalysis, critics have viciously attacked it, denouncing it as the equivalent of pseudo-scientific twaddle like phrenology, which held that skull shape mirrors personality. Countless alternative theories of and therapies for the mind have emerged in the past century, ranging from Jungian psychology up through cognitive neuroscience, behavioral genetics, evolutionary psychology and psychopharmacology.

Some of these allegedly new-and-improved fields have yielded valuable insights: the shocking experiments of Stanley Milgram, the reciprocal altruism hypothesis of Robert Trivers, the rise in IQ scores identified by James Flynn, the exposure of cognitive bias by Daniel Kahneman. And yet psychoanalysis is still hanging in there, not—as Freudians claim—because of its scientific merits but because a century of research on the brain and mind has not yielded a paradigm powerful enough to obliterate psychoanalysis once and for all. If Freudian psychoanalysis, in some sense, resembles phrenology, so, in some sense, do all its rivals. A corollary of my thesis is that psychological paradigms never really die; they just go in and out of fashion. Their creators endure too, neither dead nor alive but undead, like zombies or vampires.

Case in point: the "revival" of behaviorism, which treats subjective mental states as an irrelevant distraction for understanding humans; only objectively observable, measurable behavior matters. (The essence of behaviorism is summed up by an old joke: After two behaviorists make love, the man says to the woman, "It was good for you. How was it for me?") Freedman suggests that behaviorism fell out of favor because people found the behavior-modification techniques proposed by Skinner to be "manipulative," "fascist" and "morally bankrupt." Some critics did indeed raise moral objections to behavior modification. (See for example "The Clockwork Condition," a fascinating essay written in 1973 and printed in the June 4, 2012, New Yorker, in which Anthony Burgess traces connections between his 1962 novel A Clockwork Orange, which yielded one of my all-time favorite films, and Skinner's "evil" proposals.) But scientists abandoned behaviorism for reasons that were primarily empirical, not moral.

MIT linguist Noam Chomsky pointed out behaviorism's flaws in a coldly brutal 1959 vivisection of Skinner's views of language. Chomsky argued that children cannot possibly acquire language through the simple stimulus-response mechanism postulated by Skinner; they must possess a priori knowledge that helps them learn rules of grammar so quickly. Children, Chomsky wrote, "generalize, hypothesize, and 'process information' in a variety of very special and apparently highly complex ways which we cannot yet describe or begin to understand, and which may be largely innate, or may develop through some sort of learning or through maturation of the nervous system. The manner in which such factors operate and interact in language acquisition is completely unknown."

Note what Chomsky is saying: that neither behaviorism nor any other scientific model can explain—or is even close to explaining—how humans learn language, which is arguably our defining trait. (In spite of his own emphasis on the genetic underpinnings of language, Chomsky has been cruelly dismissive of evolutionary psychology, which he once called a "philosophy of mind with a little bit of science thrown in.")

In a recent column on philosopher of science Thomas Kuhn, I pointed out that some fields, especially "hard" ones like physics and chemistry, converge on a paradigm and rapidly progress, while others "remain in a state of constant flux." Fields that address human thought and behavior—anthropology, economics, sociology, political science, psychology—are prime examples of research endeavors that lurch faddishly from one paradigm to another.

Will psychologists ever find a paradigm powerful enough to unify the field and help it achieve the rigor of, say, nuclear physics or molecular biology? William James had his doubts. More than a century ago he fretted that psychology might never transcend its "confused and imperfect state." Harvard psychologist Howard Gardner has argued that James's concerns "have proved all too justified. Psychology has not added up to an integrated science, and it is unlikely ever to achieve that status." Gardner once told me that questions about free will, the self, consciousness and other topics with which psychologists (and, tellingly, philosophers) wrestle might not be amenable to conventional scientific reductionism, in spite of all the advances of modern genetics, neuroscience and brain imaging. Gardner suggested that researchers should perhaps consider adopting more "literary" styles of investigation and discourse, as practiced by Freud and James—and, I would add, even Skinner, who was a decent writer, if not in the same class as Freud and James.

And if literary psychology doesn't work out, we still have weight-loss apps.

Photo of Skinner courtesy Wikimedia Commons.

The views expressed are those of the author and are not necessarily those of Scientific American.
