The data is in. Neuroscientists don't really use blogs. But they think they are very important for policy. Super important, but no one uses them? What gives?


I found this paper because of a fight. Well, it was more of a discussion than a fight, but there's no tone of voice on the internet, so there could well have been screaming and spittle and it just didn't make it across the screen. But the argument, and the paper itself, raised a lot of questions for me.

This paper is not about science. It's about scientists. Neuroscientists, to be exact. The science of how scientists view science communication (if you add another science in there, you'll hit infinite recursion!). The question is what types of media scientists use to access science news. And this is not a silly question to ask. What neuroscientists do on a daily basis, the results they find, can have impacts on the real world. New treatments, discontinuation of old ones. New technologies that may impact how we do things like combat crime, judge others, and generally interact with our world.

So it's important to know, not just how the work of neuroscientists influences the public, but how neuroscientists themselves judge science media and the communication of science to the public.

Allgaier et al. "Journalism and Social Media as Means of Observing the Contexts of Science" Professional Biologist, 2013.

The authors of this study conducted a survey, sent to neuroscientists randomly selected as authors of publications from eight peer-reviewed journals (no word on how they selected the journals, and it's not in the supplement; that's something I'd like to know, the relative impact factors, etc. And hey, how would these results have changed if they'd used PLoS ONE?). They asked them how they got their science news, how they felt various outlets influenced the public on issues of science, and how they felt various outlets influenced policymakers on issues of science.

What they found was, sadly, no surprise to me. When asked about media usage, the overwhelming majority of respondents said they used traditional journalistic sources. Only about 20% used blogs, and only 10% used social network content. While you might think that they must have only gotten old guys to take the survey (and indeed 79% were over 40), in fact the age groups were quite similar: the younger neuroscientists accessed the same journalistic sources, and though they were ALSO more likely to use blogs and social networks, their usage was still pretty low. The sample also only managed 22% female, but women did not really differ, except that they were less likely to use print magazines and newspapers (and when you shove Discover and Wired under automotive in the bookstore, I can understand why you wouldn't have grown up used to reading these mags).

When the authors asked about IMPACT, the neuroscientists said that both traditional journalistic outlets and blogs could impact public opinion and policy (especially policy). But for all that 61% of US respondents thought blogs could influence policy...they still do not READ them. So neuroscientists are "clearly privileging narratives vetted by journalistic production processes".

And this is a problem. Obviously it's not all neuroscientists (certainly not yours truly!), but it's clear that many neuroscientists are not remotely aware of the large amount of science there is on social media.

Is this symbolic of neuroscientists in particular being luddites? Are other fields in biomedicine better? Worse? The same? I'd be interested to see how different fields compare. Some fields seem to have embraced the world of social media much more easily than others (ocean science, for example, and climate science, if possibly by force).

But my question now is: WHY are many neuroscientists so unaware of the neuroscience presence on social media? Do they not know? Do they not care? Do they feel it is not relevant to what they do every day?

Yes, it's true that scientists actually use social media LESS than the general public, which I think accounts for some of this attitude. But is this because they are horrid social media slugs? After all, neuroscientists are plenty quick to pick up on the latest hot science tech (optogenetics, anyone?), so why has the internet lagged behind? They're not techie slugs (heck, many of them need to code for their jobs), so is it because they do not see things directly relevant to them?

Scientists are often trained to be passionate about their work, and only their work. Maybe they don't KNOW what the internet has to offer that will help them. Maybe the PIs don't know the treasure-trove of grant lore that is Drugmonkey or Rock Talk, and the students don't know that there are guides to the cranial nerves that will help you out in class (certainly, the vast majority of my colleagues do not know these things). As the authors note: academia LOVES hierarchy, and this can influence both what they feel is reliable (the view of the traditional media outlet as reliable may be a hard one to break, despite evidence to the contrary), and how younger neuroscientists end up viewing and understanding science communication. I wonder how much this lack of awareness is an indication of the ivory tower culture of academia, a culture that, in many cases, thinks that communicating directly to the public is a waste of time. I mean, you could be writing grants instead of blogging at 2am. That's a waste of good insomnia!

As to why (or whether) they don't know these things exist, as opposed to knowing and not using them by choice? Here we run into culture. The only internet thing that biomedical science has embraced wholeheartedly is Pubmed (and boy, do we ALL LOVE PUBMED). But otherwise, many things in academia, and maybe neuroscience is just an example, are still taught the way they always were. Students get grant training and writing training from their supervisors. They are not TOLD to get it somewhere else, and are certainly not told that they need any other source for this training. It is assumed that your advisor will BE your primary mentor, and that looking elsewhere for guidance is, in general, unnecessary. Focused on their work as they are, do students even think to look for other sources? In addition, early in their careers, many students idolize their PIs, and if their PI says thou shalt not be on the internet, you won't be on the internet. Further, many biomedical scientists are told to focus, focus, focus. Your research is in dopamine, and you're reading serotonin? What FOR?! You'd really better focus! And let's not talk about reading about the kidney if you research the brain. That's a waste of valuable research time. In such a culture, the wide-ranging world of blogs, where most blogs don't just focus on a single, teeny tiny slice of the neuroscience world...can seem a little hobby-ish. Fine, if you've got free time, but why would you?

It's a combination of an existing culture and the pressures that culture exerts. But I think it cannot last. As we finally come to acknowledge that academia is in fact the alternative science career (when 85% of your PhDs don't go into academia, I think it's time we switched the terms between what is "alternative" and what is not), students and trainees (and maybe PIs) are going to realize that they aren't getting the right training. They are going to understand that they need to widen their skill sets; the best pipette jockey doesn't always get the job. And as more and more science news goes online, and as more and more scientists see their news being covered (especially in the current neuroscience craze), they will have to become aware of who is commenting, and how. Finally, we cannot ignore the internet. We cannot afford to be unaware of controversy or to miss valuable information. These were quirky scientist foibles before. Now, they could harm your career.

But I also wonder just HOW pervasive this problem is in academia in general, and how it compares with other professions. The world doesn't read my blog, just a few interested individuals do (and I adore you! All two of you! Thank you so much!). And this goes for almost every topic. Some topics are more popular than others (fashion, for example, TV, for another), but for actual scientific information, HOW many people tune in? We know the majority of people who want to find out something about science use the internet for specific science-related searches...but how many people actually use it that way?

I really don't know. I'm interested to see who does, and what this means. And I'm very interested to see if the findings of this paper replicate across disciplines, or if there are differences. And if there are, WHY. Future studies, I hope! I'm very glad that this sort of study is being done; I think the future of effective science communication has to involve a better understanding of who is doing the communicating as well as who they are communicating to.

Now, on to the argument. In the discussion section of the paper, the authors note that

"Scientists may understand that neuroscience stories in legacy media channels are likely to be of higher quality than similar narratives found in blogs. Stories in social channels are often crafted on the fly, without the help of experienced editors who can point out holes in the narrative or who can insist on rewriting and revision. Blog posts also tend to be shorter narratives, bereft of the kind of complexity and nuance possible only in long-form journalism."

Man, if neuroscience stories in legacy media channels were of higher quality, I would not be writing to you from SciAm today. I'd be out of a job, and so would the real critics like Neurocritic (who wrote a piece on this very paper, no less!) and Neuroskeptic! And it's clear that there are many examples of excellent science writing (go here for a lovely weekly platter of it), both long and short form, on blogs and social media, as well as really lovely examples in traditional media outlets (like *cough* regular SciAm, for example). But the presence of editorial oversight certainly does not assure quality, and the lack of it doesn't automatically cause quality to decrease. This is the old "bloggers vs journalists" argument. And it's been over for a long, long time. Apparently, academia has yet to catch up. There was a good discussion of this over at Branch, and a write-up at Knight Science Journalism.

And yeah, I'm a little affronted, I suppose, that my little blog is not "understood" to be high quality, but that'd be just my entitlement talking.

That aside, I wonder if statements like this are a symptom of the view of academia in general toward social media. Maybe it really isn't just neuroscientists. Maybe it's everyone in academia, assuming that social media just isn't reliable, deep, or that there are simply too many LOLs and too much commercialization. And maybe the authors fell victim to this assumption. The authors were analyzing academic culture, but perhaps they could not get away from the culture fully themselves. It makes me think: if we are going to continue to study how scientists use social media (and I hope we do), how will the scientists studying it break out of the attitude? And does this attitude bias the results and interpretations of the study (and, perhaps, in what outlets it is shared)? The argument over these statements may be a symptom of something to be watched, that while we analyze attitudes toward social media, we may ourselves need an attitude adjustment.

Edited to add (3/19/13): Upon some contact with the first author and comments from Neurocritic, I would like to add some more thoughts (even though this is long enough).

First, the survey was conducted in 2010, which might explain some attitudes, and I'd be interested to see an update. A LOT has changed since 2010. Secondly, the authors excluded anyone who had not published EIGHT PAPERS over TWO YEARS. This was to get a group of neuroscientists who were the ones making the big decisions, but I think this might be responsible for a large part of their results. These people are big cheeses, are likely to be older, and way less likely to be female. If they have published 8 last-author (say) papers and are under 40...they are the neuroscience equivalent of "gunners", and, as Neurocritic points out below, probably spend very little time doing anything that's not directly related to their work.

I did get a list of the journals used, and it's pretty exhaustive, including IFs as low as 0.3, so there may be no IF skew, which is probably a plus. But it would be very interesting to see how younger, less published scientists view social media and how much they use it. Anecdatally (that's my technical term), I find that most of my colleagues, even those my own age, make very little use of social media for scientific purposes and have almost no knowledge of what is out there. However, that could be my subfield, or merely the institution(s) of which I have been a part. But some real data on this point would definitely be good.

But there is one final skew that I think might be worth looking at. The data were collected via an ONLINE SURVEY. This means people had to go online, not delete the email, and complete the online survey. If people are unlikely to go online much at all, how likely are they to complete the survey in the first place? I wonder how this would compare to other methods of sampling, say by mail or in person at the Society for Neuroscience meeting? It would be interesting to see.