September 9, 2013
Yesterday’s uplifting emphasis at the quadrennial medical editors’ scientific meeting was bad research (“Bad research rising”). This morning’s motivational agenda focused on measuring some of the main techniques for jazzing up research results.
It started with making research look more independent by not declaring authors’ commercial conflicts of interest (COI). Kristine Rasmussen told us that in Denmark, doctors have to apply for permission to collaborate with industry. That enabled Rasmussen and her colleagues to study whether doctors who ran clinical trials were declaring their commercial relationships with drug manufacturers.
Out of 171 trial authors, 11% did not disclose a COI with the trial sponsor or drug manufacturer, and another 26% did not disclose that they had a commercial relationship with the manufacturer of another drug for the same use. Lively discussion ensued. From the audience, Leslie Citrome remarked that some academic departments are involved in so many industry trials that they should now be regarded as contract research organizations rather than academia.
Later, we heard from Serina Stratton that out of 313 trials studied, 36 required sponsor/manufacturer approval of the text or publication and 6 had gag orders. That led to some inevitable questions: why aren’t all academic institutions protecting researchers and trial participants from industry restrictions on academic freedom – and why aren’t potential participants being warned about this before they agree to be in a trial?
On to what data are reported: Jamie Kirkham and colleagues found that 77% of the systematic reviews they looked at had identified at least one trial in which it was suspected that data on harms had been held back. Scary.
Soon it was Isabelle Boutron’s turn to tackle researchers’ spin about their trials. She defines this as using techniques to make the intervention studied look more beneficial than it was. You can do this by distracting people from non-significant important results with data on less important outcomes, or by using words that make exaggerated claims the data don’t actually support.
This kind of academic spin is common in abstracts: in a past study, she and her colleagues found spin somewhere in the abstracts of 40% of trials. And it follows through into about half of press releases and subsequent media coverage.
Today she told us about a trial they did to see whether readers get caught out by the spin. They rewrote abstracts to take out the spin and randomized 300 people to receive either the spun or the cleaned-up version. Not surprisingly, they found that the spin succeeded in leading readers to believe the intervention was more beneficial than it was.
So how can we as readers protect ourselves from this fate? Understanding more about concepts like statistical significance is important: it’s one of the main tools of the trade. I’ve written some more tips here: “They would say that, wouldn’t they?” Don’t rely on an abstract: you really do need to check the data and fine print. Check whether there’s a systematic review on the subject: that can help you see how the result fits in with other research. And be wary if they don’t tell you what the range of other opinions on the topic is, or if most of the references they use come from their own team. “Even if I say so myself” isn’t usually a good basis for someone’s interpretation.
We’re about halfway through this conference now, and it’s been keeping a cracking pace. Research on the accessibility of research and post-publication discussion about research lie ahead.
Post on previous day: “Bad research rising”
Post on following sessions: “Opening a can of data-sharing worms”
As you would expect from a congress on biomedical publication, there’s a whole lot of tweeting going on. Follow along at #PRC7.
The cartoon is by the author, under a Creative Commons, non-commercial, share-alike license. The photo of Isabelle Boutron was taken by the author.
The thoughts Hilda Bastian expresses here are personal, and do not necessarily reflect the views of the National Institutes of Health or the U.S. Department of Health and Human Services.