Disagreement is part and parcel of the human condition. This is no less true for the scientific research community, and in my field, neuroscience, opposing opinions abound on even the most fundamental concepts. In science, the opinion backed by the largest amount of credible evidence is held as the most legitimate. But if, at any point, new information arises that challenges the orthodoxy, it will be accepted provided it presents more credible evidence than the previous consensus. This is the fundamental, defining principle of scientific advancement, and science prides itself on this adherence to empiricism.
However, even this seemingly straightforward concept can create issues. The problem can be summarized as follows: what level of evidence do we require to confirm a hypothesis, and at what point (if ever) does a hypothesis become unequivocal truth? Though these are not easy questions to answer, the de facto protocol of science is to gradually accept answers as their viability withstands the test of time.
That said, there will always be those who reject evidence accepted by the majority. This leads us to more pressing questions: to what extent should the scientific community engage with ideas that fall outside the mainstream? And, perhaps more pertinently, does a dissenting scientist have the right to be heard and validated irrespective of the potential weight of evidence contradicting their beliefs?
While some dissenting voices are relatively harmless (step forward, people who think the earth is flat), others have devastating consequences. An obvious example of the latter is HIV/AIDS denialism, whose proponents refuse to accept the overwhelming evidence that HIV causes AIDS and, to varying degrees, also believe that diagnostic tests for HIV are inaccurate, that the HIV/AIDS epidemic in Africa is a myth, and that anti-retroviral treatment makes the condition worse. Notably, several prominent denialists have extensive academic credentials, lending their views a veneer of legitimacy.
One prominent denialist was Christine Maggiore, founder of the alternative AIDS organization Alive & Well. Maggiore, who was herself HIV-positive, rejected conventional treatment and advocated a variety of holistic therapies before dying from pneumonia with disseminated herpes, classic opportunistic infections arising from the compromised immune system associated with advanced AIDS. Having also refused anti-retrovirals throughout her pregnancy, Maggiore passed HIV to her daughter, who died at the age of three from AIDS-related infections. Tragically, these are not isolated cases of denialists dying for their beliefs.
When such extreme views become government policy, the potential for harm increases exponentially. Until 2008, Dr. Manto Tshabalala-Msimang, health minister to South African President Thabo Mbeki, advocated beetroot, lemon juice, and garlic as treatments for HIV. Experts subsequently estimated that more than 300,000 people died as a result of the AIDS denialist policies of the Mbeki government.
The growing “anti-vaxxer” movement is similarly corrosive: its rejection of scientific and medical evidence on the efficacy and safety of vaccinations contributes to demonstrable outbreaks of otherwise wholly preventable diseases. Of course, making such decisions for your own health is one thing, but I would argue that a new realm of injustice is entered when you force your own ignorance onto an innocent child.
The damaging effects of a vocal minority are not confined to medicine. Donald Trump has front-lined scientific dissent through cabinet appointments (some of whom also have established scientific backgrounds) and their remarks on the nature of climate change. If these position-holders influence US climate policy as feared, the effects will undoubtedly be both devastating and irreversible. Ignoring the majority expert opinion in this way has been labeled as being part of a “post-truth” society, where the voices of world-leading experts are swept aside amongst the cacophony of populist opinion. Regardless, what unites the above examples is that ultimately, it is the public who pay the price when marginalized science informs policy. History reminds us this is unsafe territory.
One issue is that when the mainstream view turns out to be wrong, it jades the public perception of science whilst also seemingly validating the pursuits of the contrarians. Nutrition is a prime example of this phenomenon. In 1972, the scientist John Yudkin published a book arguing that sugar, not fat, was the primary culprit behind rising levels of obesity. He was marginalized by the mainstream nutrition community during his lifetime. But Yudkin was right, and more than 40 years later, the central role that sugar plays in expanding our waistlines is widely acknowledged by nutritionists. Despite this knowledge, and in response to demand, food companies continue to capitalize on consumer fears with “low-fat” alternatives.
The right to dissent is enshrined in the scientific constitution. Arguably, it is the duty of scientists to dispute the status quo. Hypotheses are designed to be challenged, and history is littered with examples of long-held beliefs being rewritten on the basis of fresh evidence. This continual revision of the orthodoxy is, in general, a good thing for science and, more fundamentally, for the development of mankind. The inherent fallibility of science—which on the surface can seem its biggest weakness—is actually its biggest strength.
However, at the extreme end of the spectrum are those dissenters who style themselves as research “white knights”: self-declared pariahs of the establishment working to uncover unpopular truths, who cry oppression when their views are debunked. This is more commonly known as playing the victim card. The spirit of Galileo, the most famous contrarian in history, is regularly invoked.
But in reality, such dissenters rely on their right to dissent, rather than the validity of their opinions, to be heard. Both the public and scientists must be wary of lending credence to unsubstantiated (and potentially damaging) beliefs on the principle of someone’s right to disagree.