August 29, 2013
I’ve been watching The Newsroom*, and in its second season, the storyline treads on territory where journalism bears some striking similarities to science. Indeed, the most recent episode (first aired Sunday, August 25, 2013) raises questions about trust and accountability — at both the individual and the community levels — on which I think science and journalism converge.
I’m not going to dig too deeply into the details of the show, but it’s possible that the ones I touch on here reach the level of spoilers. If you prefer to stay spoiler-free, you might want to stop reading here and come back after you’ve caught up on the show.
The central characters in The Newsroom are producing a cable news show, trying hard to get the news right but also working within the constraints set by their corporate masters (e.g., they need to get good ratings). A producer on the show, on loan to the New York-based team from the D.C. bureau, gets a lead for a fairly shocking story. He and some other members of the team try to find evidence to support the claims of this shocking story. As they’re doing this, they purposely keep other members of the production team out of the loop — not to deceive them or cut them out of the glory if, eventually, they’re able to break the story, but to enable these folks to look critically at the story once all the facts are assembled, to try to poke holes in it.** And, it’s worth noting, the folks actually in the loop, looking for information that bears on the reliability of the shocking claims in the story, are shown to be diligent about considering ways they could be wrong, identifying alternate explanations for details that seem to be support for the story, etc.
The production team examines the multiple sources of information they have. They look for reasons to doubt the story. They ultimately decide to air it.
But, it turns out the story is wrong.
Worse is why key pieces of “evidence” supporting the story are unreliable. One of the interviewees is apparently honest but unreliable. One piece of leaked information is false, planted by someone with a grudge against a member of the production team. And, it turns out that the producer on loan from the D.C. bureau has doctored a taped interview that is the linchpin of the story, making it appear that the interviewee said something he didn’t say.
The producer on loan from the D.C. bureau is fired. He proceeds to sue the network for wrongful termination, claiming it was an institutional failure that led to the airing of the now-retracted big story.
The parallels to scientific knowledge-building are clear.
Scientists with a hypothesis try to amass evidence that will make it clear whether the hypothesis is correct or incorrect. Rather than getting lulled into a false sense of security by observations that seem to fit the hypothesis, scientists try to find evidence that would rule out the hypothesis. They recognize that part of their job as knowledge-builders is to exercise organized skepticism — directed at their own scientific claims as well as at the claims of other scientists. And, given how vulnerable we are to our own unconscious biases, scientists rely on teamwork to effectively weed out the “evidence” that doesn’t actually provide strong support for their claims.
Some seemingly solid evidence turns out to be faulty. Measuring devices can become unreliable, or you get stuck with a bad batch of reagent, or your collaborator sends you a sample from the wrong cell line.
And sometimes a scientist who is sure in his heart that he knows the truth doctors the evidence to “show” that truth.
Fabricating or falsifying evidence is, without question, a crime against scientific knowledge-building. But does the community that is taken in by the fraudster bear a significant share of the blame for believing him?
Generally, I think, the scientific community will say, “No.” A scientist is presumed by other members of his community to be honest unless there’s good reason to think otherwise. Otherwise, each scientist would have to replicate every observation reported by every other scientist ever before granting it any credibility. There aren’t enough grant dollars or hours in the day for that to be a plausible way to build scientific knowledge.
But, the community of science is supposed to ensure that findings reported to the public are thoroughly scrutinized for errors, not presented as more certain than the evidence warrants. The public trusts scientists to do this vetting because members of the public generally don’t know how to do this vetting themselves. Among other things, this means that a scientific fraudster, once caught, doesn’t just burn his own credibility — he can end up burning the credibility of the entire scientific community that was taken in by his lies.
Given how hard it can be to distinguish made-up data from real data, maybe that’s not fair. Still, if the scientific community is asking for the public’s trust, that community needs to be accountable to the public — and to find ways to prevent violations of trust within the community, or at least to deal effectively with those violations of trust when they happen.
In The Newsroom, after the big story unravels and the video-doctoring producer is fired, the executive producer of the news show says, “People will never trust us again.” It’s not just the video-doctoring producer that viewers won’t trust, but the whole production team, which didn’t catch the problem before presenting the story as reliable. Where the episodes to date leave off, it’s uncertain whether the production team will be able to win back the trust of the public — and what it might take to win back that trust.
I think it’s a reasonable question for the scientific community, too. In the face of incidents where individual scientists break trust, what does it take for the larger community of scientific knowledge-builders to win back the trust of the public?
* I’m not sure it’s a great show, but I have a weakness for the cadence of Aaron Sorkin’s dialogue.
** In the show, the folks who try to poke holes in the story presented with all the evidence that seems to support it are called the “red team,” and one of the characters claims its function is analogous to that of red blood cells. This … doesn’t actually make much sense, biologically. I’m putting a pin in that, but you are welcome to critique or suggest improvements to this analogy in the comments.