The scientific literature is full of it.
By which I mean, of course, spin, error and less-than-reliable results.
All that noise makes it tough to keep up with what's important to read and buzz about.
But the biomedical research community has a new way to share opinions on what may or may not be worth pursuing - they can now comment on articles at PubMed. This whopping literature database, visited by millions every day, could become one of the biggest science water-coolers of all.
Conflict of interest disclosure: I am involved with this initiative as part of my day job.* PubMed is produced by the National Institutes of Health (NIH). It started this project after enthusiastic encouragement from Robert Tibshirani from Stanford and other academics. You can read about that here.
The commenting system is called PubMed Commons. It's still in a pilot phase accessible only to registered participants - but as of today, registration is open to the biomedical community generally. You can find out about the ways of joining here.
The first group of scientists to test it out have added comments to articles, and rated the helpfulness of others' comments. Linking in outside debates, such as those in blogs, has been common. Some authors have included extra information related to their own work, linking to outside open data repositories. They've recommended, enhanced and strongly criticized some of the research that's catalogued there.
You can establish links between articles within PubMed by commenting - and link to any website outside PubMed. So it's an opportunity to draw together debates about a publication in a frequently-visited central research hub. Those debates are scattered far and wide - in blogs, mass and social media, journals and journal clubs, Wikipedia talk pages and more.
I've also been one of the scientists in the initial testing group, invited in when the first small snowball of recruitment reached the clinical effectiveness research crowd. The screenshot on the left shows what a comment looks like. It's from a comment I made on a randomized trial, linking it to a systematic review that analyzes it, and to a post of mine here at Scientific American that discusses research on the topic.
When I included that link to the ID of the systematic review in my comment about a trial, the review I was referring to then showed that it was mentioned in a comment (shown on the right). You can find out the details of how all this works, whether you're a registered user or not, in the FAQs at PubMed Commons.
Technically, PubMed Commons could develop in many ways, and I'm very much looking forward to the discussion about what it could or should become. But if it is to help make post-publication peer review a well-functioning reality, it needs to be used enough to become an integral and important part of biomedical research culture.
Post-publication review has yet to seriously scale in biomedical research. Writing letters to the editor has been a constricted and frustrating process in many ways. James Coyne tackles this here.
It's been relatively hard to get letters published - the gatekeepers have kept it pretty tight, even in the internet era. And journals don't index all the correspondence they receive in PubMed. Journals don't actually get all that much correspondence anyway, and it's going to be interesting to see whether the researcher community will start capturing more of its thoughts for a wide audience.
Then there's the issue of paying enough attention to comments. Critical comments that seriously challenge authors' conclusions really should not be ignored, but they often are. The authors of papers often "stonewall" in the face of criticism. A study of letters to editors in the early to mid-2000s found that authors never replied to half of them, and another found that when they do reply, the response is often inadequate.
Readers and subsequent authors don't pay enough attention either. Bhopal and Tonks from the BMJ give several examples of critiques that didn't travel as far as the paper itself did - including a paper which had been roundly rebutted in a letter: the paper was cited 70 times, the rebuttal only once (see their article here).
Could PubMed Commons change any of this? Maybe its sheer ubiquity and high profile in our working lives give it that potential. It's worth finding out. It not only needs to be used, it needs to not be abused: PubMed Commons is not moderated, although comments can be removed. Author and sponsorship bias is going to be even harder to detect at the abstract level, and conflicts of interest in commenting are going to be important to tackle.
Restricting registration to the research community will inhibit use of the system for comments that aren't valuable as research review, but it might not be enough. At the same time, developing ways to not exclude valuable input from outside is vital - a subject tackled by Ivan Oransky in Retraction Watch. We're at the start of a complex journey, but it's an important one to take.
It seems kind of strange to be launching a closed pilot in Open Access Week, yet it's also very appropriate. Opening up the biomedical literature to crowdsourced curation by the research community, with the results visible to all, is definitely a great open access step. As Mike Eisen says here, "Let's make this thing work!" So spread the word, and here's hoping it doesn't stay closed for very long!
Here is my (long) "Storify" about the excitement of the launch, other blog posts and Twitter reaction to the Commons' debut.
To find out more about PubMed Commons, click here. Follow on Twitter: @PubMedCommons
To read Robert Tibshirani from Stanford explain how the National Institutes of Health (NIH) came to develop PubMed Commons, and the story so far, click here.
The Statistically-Funny cartoon and sketch are my original work (Creative Commons, non-commercial, share-alike license).
*The thoughts Hilda Bastian expresses here at Absolutely Maybe are personal, and do not necessarily reflect the views of the National Institutes of Health or the U.S. Department of Health and Human Services.