In 1964, the occasionally enigmatic but always energetic physicist Dr. Richard Feynman gave a lecture at Cornell University to a packed hall of eager young scholars. Feynman's demeanor was crisp and purposeful that day, a style reinforced by his sharp appearance: his hair was neat and tidy, and he was keenly attired in a trim, tailored suit.
His right hand grasping a piece of chalk, his left hand nestled in his coat pocket, Feynman started to speak. "I'm going to discuss how we would look for a new law," he said in his unvarnished Queens accent, referring to his work as a theoretical physicist.
Feynman walked over to the chalkboard and began to write. His oration continued, almost in a manner synced with his scribbling. "First we guess it... Then we compute the consequences of the guess to see what it would imply. And then we compare those computation results... directly to observation to see if it works."
Feynman paused, removed his left hand from his coat pocket, and strode back over to the lectern to briefly peruse some notes. He then launched right back into his sermon.
"If it disagrees with experiment, it's wrong," he asserted, craning his neck forward and adroitly pointing his left hand at the chalkboard to accentuate the point. "In that simple statement is the key to science."
"It doesn't make any difference how beautiful your guess is," Feynman proclaimed, gesticulating in wide, circular, somewhat flamboyant motions. "It doesn't make any difference how smart you are, who made the guess, or what his name is. If it disagrees with experiment, it's wrong. That's all there is to it."
Feynman was absolutely right.
A good scientist must be willing to be wrong. Such an inclination is liberating, for it allows him or her to investigate potential answers -- however unlikely they may be -- to the difficult questions inspired by this vast, wondrous universe. Not only that, a willingness to be wrong frees a scientist to pursue any avenue opened by evidence, even if that evidence doesn't support his or her original hunch.
"The hard but just rule is that if the ideas don't work, you must throw them away," the great science communicator Carl Sagan wrote. "Don't waste neurons on what doesn't work. Devote those neurons to new ideas that better explain the data."
Sagan's candid advice was followed to the letter in 1998, when two highly competitive groups of scientists from Harvard and Berkeley were racing to measure the rate at which the universe's expansion was decelerating. It was a high-stakes contest, for a Nobel Prize was thought to be on the line.
But to both groups' astonishment, the data ended up pointing in precisely the opposite direction. The scientists found that the universe's expansion was not slowing down; it was speeding up! "I was, quite frankly, denying [it] was happening," team leader Brian Schmidt reportedly said. But because Schmidt and his colleagues overcame their disappointment and were willing to be wrong, the world learned something entirely new about the cosmos.
For the Berkeley and Harvard astrophysicists, recognizing their wrongness was easy, as the data irrefutably pointed in a completely different direction. But it's not always that simple. Sometimes data can be inconclusive, leaving wiggle room for the researcher to draw a range of conclusions. Unfortunately, this occasionally leads to misconduct, especially among scientists more interested in dogmatically defending pet theories than in following the evidence. They might tweak bits of data to achieve statistical significance in the ubiquitous p-value test, or ignore details that conflict with their hypothesis.
This is, of course, ethically wrong, but human nature often compels us to err in order to guard our ingrained beliefs. While scientists are often regarded as marble men and women, the truth is, they never stop being human.
In order to recognize wrongness, scientists must maintain some level of detachment from their cherished theories and be open to the ideas of others in their respective fields. Richard Dawkins described a terrific example of this in his book, The God Delusion:
“I have previously told the story of a respected elder statesman of the Zoology Department at Oxford when I was an undergraduate. For years he had passionately believed, and taught, that the Golgi Apparatus (a microscopic feature of the interior of cells) was not real... Every Monday afternoon it was the custom for the whole department to listen to a research talk by a visiting lecturer. One Monday, the visitor was an American cell biologist who presented completely convincing evidence that the Golgi Apparatus was real. At the end of the lecture, the old man strode to the front of the hall, shook the American by the hand and said - with passion - 'My dear fellow, I wish to thank you. I have been wrong these fifteen years.'”
In the past year, we've been treated to two uplifting examples of that sort of modesty. University of California, Berkeley physics professor Richard Muller changed his skeptical stance on climate change when his own Berkeley Earth Surface Temperature ("BEST") study produced data that conflicted with his preconceived notions. He now admits that climate change is caused by human activity. In another noteworthy example, Dr. Robert Spitzer, the psychiatrist who, in a 2001 paper, touted that gays could be "cured," reversed his position and apologized for his "fatally flawed" study.
"I believe I owe the gay community an apology," Spitzer wrote in a letter.
Wrongness is something we all secretly or openly dread. According to self-described "Wrongologist" Kathryn Schulz, in the abstract, we all understand that we're fallible, but on the personal level, we leave little to no room for being wrong.
But Schulz believes that we should view this situation in a slightly different light. It is realizing you're wrong that's devastating; being wrong itself often feels pretty good. As a matter of fact, it often feels identical to being right.
Like Wile E. Coyote chasing the Road Runner off a cliff in those old Warner Brothers cartoons, we only start to fall when we come to the realization that we, along with our incorrect notions, have no solid ground to stand on. But the simple fact of the matter is that we ran off the edge of the precipice long ago! Thus, it's best to admit that we're wrong and get the fall over with, so we can land (hopefully not too harshly), dust ourselves off, and get back on our feet.