For most participants, next week’s March for Science (M4S) will be a chance to step away from the lab and join a public outpouring of support for evidence-based science. For sociologist Dana Fisher and a handful of other scientists who do survey research, however, the event will be another day at the office, as they plan to query those attending the demonstrations.
Fisher studies protests and climate politics. On 22 April she will lead a team of 16 faculty members and students from the University of Maryland in College Park to downtown Washington, D.C., to gather data from the crowd assembled on the National Mall for the flagship U.S. march. Fisher hopes that more than 500 people will complete a two-page survey asking what brought them to the event, their level of political activism, and the nature of their work.
A different team, led by political sociologist Michael Heaney of the University of Michigan in Ann Arbor, also wants to explore the political identities of the marchers. Their approach will require a six-page survey, and Heaney says he’ll be ecstatic if 250 people wade through it.
A third group wants to explore scientific norms, that is, what marchers think is appropriate political behavior by scientists. The four-person team, led by political scientist Michael Xenos of the University of Wisconsin in Madison and deployed by graduate student April Eichmeier, hopes to engage some 200 participants.
ScienceInsider has also identified a fourth group of researchers who hope to study the march without actually being there in person. The team, from nearby George Mason University (GMU) in Fairfax, Virginia, is asking organizers to help it recruit participants for a survey by sending a note to the march's email list and social media followers.
Scientists are no strangers to political activism. But the potential magnitude of this month’s marches—scheduled for hundreds of cities across the United States and around the world—may be unprecedented, says Scott Frickel, who studies mass mobilizations at Brown University and is part of Fisher’s group.
The challenge for researchers is to understand what is driving people to act now and what their actions say about the status of science in today’s fractious political culture. “People are always upset about something,” Frickel says. “But they start movements when they see the possibility of losing what they have.” Fisher is predicting as many as 100,000 will show up at the Washington, D.C., march alone, if the weather cooperates.
Researchers hope the data they collect will also give them a sense of what might happen next. “It’s the nature of mass mobilization,” Frickel says. “You get people in the streets, and then you figure out what you want to do.”
Fisher has been doing surveys of large-scale climate protests for more than a decade. Though she specializes in environmental issues—the March for Science takes place on Earth Day—she also surveyed participants in this year’s postinaugural Women’s March on Washington.
Her M4S instrument is based on the two-page questionnaire she used for that march. But it will contain seven additional questions—including one asking for the participant’s job title and another that requests a description of what they do at work—that are aimed at capturing the unusual demographics expected for the M4S. Although the survey is anonymous, it asks those willing to participate in a follow-up interview weeks or months later to provide contact information.
What happens after 22 April is a less compelling question for Heaney, who studies how social movements and political party affiliation shape public policy. “I’m not sure what I would make of the [follow-up] data if I asked,” he says. For Heaney, M4S offers a chance to use the science march to enrich his previous findings, such as studies showing that people tend “to abandon a movement rather than [to switch] parties.”
For example, his M4S survey asks participants to select from a list of 28 causes that might have prompted them to hit the streets at some point in their lives, from opposing the 1960s Vietnam War to Black Lives Matter and “stop Obama health care.” Fisher’s list of 11 topics is much shorter, asks participants to look back only a decade, and uses more generic categories, such as peace and racial justice, to describe what prompted them to protest.
Like his colleagues, Xenos wants to find out why people are attending the march, how they learned about it, and a bit about who they are. But he’s also interested in what they hope to accomplish, and what they think of scientists who adopt this form of political expression.
March madness methodologies
At the march, the survey takers will face the challenge of selecting participants from among the throng. All of the researchers are planning to use a variation of the standard technique for randomizing the process. Fisher’s team will plunge into the crowd and grab every fifth person, for example, whereas Heaney’s crew will start by spotting an “anchor”—a person who catches their eye for whatever reason—and then pick those who are five persons to the right and left of that individual.
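The every-fifth-person approach described above is a form of systematic sampling. A minimal sketch of that idea follows; the crowd list, function name, and interval are illustrative assumptions for this example, not the teams' actual field protocol:

```python
import random

def systematic_sample(crowd, interval=5, seed=None):
    """Select every `interval`-th person after a random starting offset,
    approximating the every-fifth-person field technique."""
    rng = random.Random(seed)
    start = rng.randrange(interval)  # random offset reduces selection bias
    return crowd[start::interval]

# Hypothetical crowd of 100 marchers
crowd = [f"marcher_{i}" for i in range(100)]
sample = systematic_sample(crowd, interval=5, seed=42)
print(len(sample))  # 20 people: one in every five
```

The random starting offset matters: always beginning with the first person in view would let the surveyor's eye, rather than chance, determine who enters the sample.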
Once they find a willing subject, however, their methods diverge. Fisher’s team will be armed with computer tablets, which they will hand to marchers while gently monitoring their progress. That approach is far more efficient than paper questionnaires, which must later be typed into computers, she says.
In contrast, Heaney will use traditional clipboards, and Eichmeier’s crew will record the interviews for later transcription. Under survey rules approved by their university’s Institutional Review Board, the research teams won’t have to obtain prior written consent because the subjects will not be identified in any way. Only those 18 and older will be asked to participate.
Away from the march, the GMU team is taking a digital approach. Rather than in-person interviews, they are asking march organizers to help them query thousands of people who have expressed their support for the march via email or on social media. (The March for Science would not share its email or social media lists with the researchers; rather, it would let followers know the researchers are seeking their responses on a survey.)
The GMU researchers realize that many of those people will not be scientists themselves. “The march is a unique opportunity to measure public perceptions of public engagement by scientists and the role of science in society,” says Teresa Ann Myers, a researcher at GMU’s climate change communications center, which proposed the study. “There’s a lot of talk about that online, but there isn’t much in the literature.”
Jonathan Berman, an M4S co-chair, says the steering committee is reviewing requests from several researchers interested in contacting their supporters. “We're looking forward to working with social scientists who do know how to ask the right scientific questions to generate scholarly output about the march,” says Berman, a postdoc at the University of Texas Health Science Center in San Antonio. Berman says that it’s unlikely any decisions will be made until after the march, however, a timeframe that Myers says “wouldn’t be a problem.”
High refusal rates are the bane of every survey researcher, and the willingness of participants to engage can vary greatly. Those attending this winter’s Women’s March on Washington were very receptive, says Fisher, with only 7.5% declining. That’s the lowest refusal rate she has ever recorded, but she won’t be surprised if the M4S breaks it. After all, who better understands the value of doing research than a fellow researcher?
Clarification, 4/13/2017, 11:21 a.m.: The story has been revised to clarify that the GMU researchers are not asking March for Science to share its email or social media lists with the researchers, but asking march organizers to distribute notice of a survey instrument on their behalf.