The SA Incubator

The next generation of science writers and journalists.

Journalism By Numbers: Why Journalists Are Skipping Lunch To Learn Stats

The views expressed are those of the author and are not necessarily those of Scientific American.





This is a guest post by Frank Swain who works at the Royal Statistical Society on a project to develop science training for journalists.

On the screen in front of us stands row upon row of little grey figures, four hundred in all if you count the ones cut off by the skewed aspect ratio. Five of the figures are stained red. They are dead. They developed pancreatic cancer. The rest are still enjoying their bacon sandwiches.

And so, the professor standing in front of the screen concludes, the relative risk associated with eating processed meat is 20%, or to put it another way, one extra death for every four hundred people. A relative risk increase of 20%, but an absolute risk increase of just 1 percentage point. The same statistic, but two entirely different news stories, depending on how you frame it.

A hand goes up. “Whose responsibility is it,” the woman asks, “to convert those numbers?”

It’s a good question. In public health studies, risk factors are often presented in relative terms, because that format makes sense when discussing interventions on a population scale. How many lives could be saved by encouraging British citizens to skip that second sausage? But when discussing personal health choices, it’s absolute risks, not relative ones, that are most useful. Tell me my diet ramps up my risk of cancer by 20% and I may choke on my pork scratchings, but tell me that I’m shaving my life expectancy by a year and I’ll probably decide the intervening half-century is better spent in the company of fry-ups.
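To make the two framings concrete, here is a minimal sketch in Python. The numbers are illustrative round figures chosen so the arithmetic comes out cleanly (a baseline of 5 cases per 400 people rising to 6 per 400 among processed-meat eaters), not the figures from any particular study; it simply shows how the same single extra case reads as a 20% relative increase but only a 0.25 percentage-point absolute one.

```python
# Illustrative sketch only: round numbers chosen for clean arithmetic,
# not taken from the study discussed in the workshop.

group_size = 400
baseline_cases = 5   # expected pancreatic cancer cases without the exposure
exposed_cases = 6    # cases among daily processed-meat eaters

baseline_risk = baseline_cases / group_size   # 0.0125 -> 1.25%
exposed_risk = exposed_cases / group_size     # 0.0150 -> 1.50%

# The same difference, framed two ways:
relative_increase = (exposed_risk - baseline_risk) / baseline_risk  # 0.20 -> "20% higher risk"
absolute_increase = exposed_risk - baseline_risk                    # 0.0025 -> 0.25 percentage points
extra_case_per = 1 / absolute_increase                              # one extra case per 400 people

print(f"Relative risk increase: {relative_increase:.0%}")
print(f"Absolute risk increase: {absolute_increase * 100:.2f} percentage points")
print(f"One extra case for every {extra_case_per:.0f} people")
```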

When science makes the crossover from academic into public discourse, whose responsibility is it to adjust the language accordingly? A common attitude within the science community seems to be that journalists reporting on science stories ought to be able to substitute risk factors and odds ratios as easily as epidemiologists do. That’s a facile argument to make, but journalists are also the least equipped to do this, both in terms of time and ability. It is important, however, that journalists understand how influential this kind of reframing can be, and how it can take control of the reporting line if left unbridled.

The goal of delivering that understanding is what led me to this secluded corner of the UK’s Channel 4 newsroom, listening to the professor talk about relative risk and other statistical concepts to over a dozen journalists. The workshop, and many more like it, have come about through the Royal Statistical Society’s publicly funded BenchPress project, which aims to develop science and statistical training for journalists. The project was set up in response to a white paper published in 2010 by the UK Government’s Department for Business, Innovation and Skills, which highlighted a dearth in the availability of such training, both within the industry and within the classrooms that supply it with new graduates. As part of the project, I’ve developed a network of a dozen volunteer speakers who regularly visit schools and newsrooms across the country to help prospective communicators and journalists get to grips with numbers. The passion of the volunteers—all working scientists—helps ensure that both junior and more senior journalists produce science news stories that are as robust and accurate as possible.

Later this week, Hilda Bastian and Evelyn Lamb will host a discussion of rogue statistics at this year’s ScienceOnline conference, and the problems these can cause in politics and the media. “One of the most important roles of math blogging for non-mathematicians,” they write, “is clarifying the ways in which things are abused, and how we can make the true meaning of statistics clear without losing the attention of the audience.” I’d argue that’s a sentiment shared by journalists of all stripes, not just math bloggers. No one I’ve approached—neither college nor newsroom—has yet turned down the offer of a free workshop on science and statistics. Everyone in the industry is aware that the era of data journalism is fast approaching. Already political pundits in the US have seen their audiences depart in droves for the analytical pronouncements of quant Nate Silver. My advice to Bastian and Lamb is this: don’t be content to stop at math bloggers. The world’s hacks are just as eager to get their numbers right, if you’ll only help them along the way.

About the Author: Khalil A. Cassimally is the Community Coordinator of The Conversation UK. He's also a science blogger. He hails from a tropical island and is a happy geek. Subscribe to his updates on Facebook and Google+. Follow on Twitter @notscientific.







Comments (5)

  1. HildaBast 9:02 am 01/28/2013

    Totally agree, and we won’t be limiting it to mathematician bloggers. Making learning about statistics more accessible is vital (which is why my blogging medium of choice for this subject is cartooning). http://statistically-funny.blogspot.com/2013/01/fright-night-in-doctors-lounge.html @hildabast

  2. julianpenrod 7:51 pm 01/28/2013

    At least in terms of the definition provided by Wikipedia, “relative risk” was completely incorrect as presented!
    Among other things, “relative risk” is supposed to compare risk for one group compared with risk for another group. The only group looked at at the beginning of the article is those eating processed meat and, of those, 1/80 or about 1.25% supposedly developed pancreatic cancer. To get a proper “relative risk”, you need the statistics for a group who didn’t eat processed meat yet still developed pancreatic cancer. Yet that is not provided. And, in general, relative risk is expressed as a multiple, so many more or less being affected in one group than in another, not a percentage. However, it is possible it was desired to suggest that those eating processed meats have a risk, relative to those not eating processed meats, of 20/1000 = 1/5 of developing pancreatic cancer. But that’s suggesting people eating processed meats are less likely to get sick! And, incidentally, a relative risk of “20%” can give you so many more, or less, per 400 people only if you know what the statistic is for those not eating processed meat. And, incidentally, if eating processed meat caused 5 cases of pancreatic cancer in 400 people, that would represent 5 more cases “for every four hundred people”! The entire presentation appears highly flawed.

  3. mike5 12:46 pm 01/30/2013

    Sorry, the increased risk due to processed meat was 20%, bringing the total dead from 4 in 400 to 5 in 400? That’s a 0.25 percentage point change in absolute risk, from 1% to 1.25%.

    Those numbers still don’t square, but I’ll assume it’s due to rounding to the nearest person out of 400 people.

    C’mon, man. This IS the right story to write, and you’ve shown exactly why. Did you run the factual/mathematical parts of this story by scientists? If not, why not?

  4. mike5 12:54 pm 01/30/2013

    “That’s a facile argument to make, but journalists are also the least equipped to do this, both in terms of time and ability.”

    I’m sorry, but that is absolutely part of your job. Your argument here is like a sports journalist saying “Hey man, I don’t need to know the rules of football; my job is to interview the players. They’ll explain any relevant rules to me.”

    If you care about journalism and your writing, it’s on you to make sure you know what you’re writing about. Likewise, I know very few scientists who wouldn’t try their best to help a journalist if they were hung up on a math/science point.

  5. FrankSwain 6:30 am 01/31/2013

    Mike / Julian
    You’re correct, and it’s my fault for blending two parts of the workshop in a cack-handed manner. Sloppy workmanship, I hang my head in shame.

    On the latter point, I didn’t say it wasn’t their job, rather that they’re the least equipped to carry out these sums.

