February 28, 2013
Recently, Adam Stevens looked at where crowdsourcing ends and citizen science begins and voiced doubts that the projects in the Zooniverse qualify as citizen science. According to Stevens, categorizing images (“data crunching, plain and simple”) is what happens before science really starts. When I run a race, it appears to start with the bang of a cap gun, but it really begins months earlier when I start training. A new era of space exploration didn’t start when Neil Armstrong stepped onto the Moon; it began with a lot of data crunching at mission control. Similarly, there are many ways to draw boundaries around these types of collaborative efforts in citizen science, but I’m not sure what purpose is served by narrow ones.
Others have already responded to explain why the Zooniverse projects count as citizen science, such as here, here, and here. Citizen science broadly encompasses public involvement in science, and, until I saw Karen James’s response to Stevens, I didn’t realize people were discussing the nuances of what does or does not qualify as citizen science.
Stevens’s view of clicking and tagging in the Zooniverse reminds me of the scientific hierarchy defined by William Whewell, who used the term ‘subordinate laborers’ to describe citizen science participants in his Great Tide Experiment of 1835. To paraphrase, Whewell wrote that lay people are capable of collecting information, like gathering pearls, while professional scientists have the ability to make meaning of the information, like stringing the pearls into a necklace. A stereotypical portrait of science focuses on a lone, unkempt individual working long hours to progress slowly and iteratively through the steps of the scientific method.
To understand Zooniverse-style citizen science, instead imagine science based on massively collaborative efforts where tasks are divvied up. Citizen science can look like a hobby, a game, a puzzle, brainstorming, or even like mind-numbing drudgery. Citizen science projects vary in how much they open the door for people to have access to creating knowledge. Sometimes a project has to entice people in, and in other cases people are banging down the door. If Stevens defines citizen science participation as narrowly as stringing pearls, that model is never going to open the doors of science very wide, let alone bring in hundreds of thousands of people like the Zooniverse has done.
So far the discussion has focused on defining citizen science from the perspective of individual participants: their activities, and whether they truly experienced or learned about the scientific process, as though these features were the basis for determining what is and isn’t citizen science. The common denominator for all citizen science is collaborative research that includes members of the public in any one of a variety of ways; some projects have no explicit public education goals. I am in awe of the science learning and social outcomes of citizen science, but I don’t see the need to use the presence or quality of those outcomes as criteria for determining what qualifies as citizen science. Citizen science, like other types of science, happens when it produces reliable, new knowledge. Participants in the Zooniverse have done science because they’ve helped in the long process of producing reliable, new knowledge.
As I’ve said elsewhere, the word “citizen” in relation to governance confers rights and responsibilities on a participating member of a country. In the context of citizen science, “citizen” conveys the idea that anybody can assume rights and responsibilities to participate in the enterprise of science. In both cases, a citizen participates in a much bigger, collective process. Citizen science doesn’t mean participants necessarily become scientists and do everything. If the goal of a citizen science project were to have participants carry out all phases of science independently, well… that’s the goal of PhD programs.