Climate change is just one controversial issue that provokes differing views on the relevant science. Here, demonstrators in Washington, D.C., in 2013.

Stephen Melkisethian/Flickr (CC BY-NC-ND 2.0)

Politics, science, and public attitudes: What we’re learning, and why it matters

The bad news is that everybody does it. The good news is that social scientists are making progress in understanding why people ignore solid scientific evidence in deciding what they think about all manner of science-based issues—including how those topics should be taught in schools and addressed by policymakers.

The U.S. research community has long lamented how often the public disregards—or distorts—scientific findings. Many media pundits point the finger at partisan politics, although they offer contrasting explanations: Liberals often assert that Republicans are simply antiscience, whereas conservatives often insist that Democrats tout scientific findings to justify giving government a larger and more intrusive role.

A leading social science journal, The ANNALS of the American Academy of Political and Social Science, takes a deep dive into the debate by devoting its March issue (subscription required) to “The Politics of Science.” The issue, edited by political scientists Elizabeth Suhay of American University in Washington, D.C., and James Druckman of Northwestern University, includes some 15 articles that explore “the production, communication, and reception of scientific knowledge.” And nobody gets a free pass.

“It’s an equal opportunity scold,” says the journal’s executive editor, Thomas Kecskemethy. “I was fascinated by how the knowledge elites are vulnerable to their own biases.”

The researchers provide no simple answers. (In truth, some of the articles are nearly impenetrable, larded with jargon and political theory.) But the special issue does offer some useful take-home messages:

  •  Scientists shouldn’t beat themselves up for being poor communicators. Yes, they could do a better job. But most people aren’t waiting for scientists to tell them what to think. So the solution is not simply to provide them with more facts and figures.

  •  People are heavily influenced by their existing beliefs, often based on ideology and religion, when they evaluate any particular scientific result. And holding strong beliefs makes a person more likely to reject a “dissonant” message, or actively oppose it.

  •  Liberals are just as likely as conservatives to disagree with the prevailing scientific evidence. But those disagreements occur over a different set of issues (see this table). Many liberals object to nuclear power, hydraulic fracturing (fracking) for oil and gas, genetically modified organisms, and some aspects of genomic medicine. For conservatives, hot-button issues include climate change, evolution, and stem cell research, with vaccination a recent addition.

  •  Ideology isn’t the same thing as party affiliation, although the current gridlock in Congress and enmity between Republican legislators and the White House may suggest otherwise.

  •  When it comes to teaching evolution, poorly trained biology teachers may be a major factor in explaining why large segments of the U.S. population remain unconvinced. Even those teachers who accept the concept are prone to give it short shrift in their classrooms because they lack confidence in their ability to defend evolution against its critics.

  •  The public tends to hold scientists in high regard. People also generally welcome learning more about a controversial issue, such as geoengineering, in which their minds aren’t already made up. So the situation is far from hopeless.

The rest of the piece takes a detailed look at three themes covered in the issue: how deference to scientific evidence relates to political ideology; how people cope with dissonant information; and what students training to become biology teachers think about evolution.

Deferring to science

Understanding the intersection of U.S. politics and science is more important than ever, believes Suhay, who has worked with Druckman to examine the political controversies surrounding genetics. “Political values are unavoidably wrapped up with scientific research, because science tells us what’s possible,” she says. “Science is inherently controversial because nobody wants to hear that their options are limited.”

The recent surge in polarization in American politics, Suhay says, is forcing “political elites to make their arguments with greater passion. … Part of the battle is marshaling scientific evidence in favor of your point of view. So facts become tied to a particular political view.”

Given that polarization, Daron Shaw, a professor of government at the University of Texas, Austin, and his graduate student, Joshua Blank, wanted to know “the extent to which people would defer to science” on various controversial issues.

“There’s a long history in this country of believing that ‘the truth will set you free, and that science has the answers.’ It binds U.S. politics together,” says Shaw, who studies elections and voting behavior and who has done survey research for several political campaigns. But if that deference to scientific knowledge “is unraveling as a result of greater polarization,” he says, “that’s a consequential change.”

To find out, Shaw and Blank surveyed a nationally representative sample of 2000 registered U.S. voters. Each was asked to score 16 policy areas on a 10-point scale; a 10 meant policymakers should totally embrace the advice of scientists, and a zero meant that they should completely ignore scientific advice.

Overall, they found that deference to science remains quite high, regardless of political self-identification. The scores across all issues averaged to 6.4, suggesting voters generally want policymakers to listen to scientists. But there were differences: Self-identified Democrats averaged 7.46, versus 5.58 for Republicans and 5.84 for independents.

Looking at those findings, Shaw concludes that, yes, conservatives are less willing to defer to scientific recommendations. But no, it is not accurate to accuse Republicans of holding antiscience beliefs, or to single them out. First, their attitudes are nearly indistinguishable from those of independents. Second, the ratings showed that Republicans still defer to science in 14 of the 16 policy areas. The exceptions were mandatory health insurance and gay adoption, where “being a Republican correlates with a decreased willingness to defer to what science says,” Shaw and Blank write.

In contrast, Democrats deferred to science in all 16 areas. And Shaw says the overall average score of 6.4 “is pretty positive … at least it’s more, rather than less, supportive” of tapping scientific expertise for policymaking.

The researchers also found that a person’s deference to scientific evidence depends on the specific policy under consideration. There was little difference across the ideological spectrum on using animals in research, for example, whereas there was a huge disparity between conservatives and liberals on regulating carbon emissions to combat global warming. (The researchers identified the scientific consensus on those issues as being in favor of the use of animals in research, and supporting some type of regulatory mechanism to reduce emissions, respectively.)

None of this means that evidence necessarily trumps ideology, the researchers note. In fact, they found that ideology usually wins when the two are in direct conflict in a voter’s mind.

To Shaw, the biggest mystery is why Democrats put so much more faith in science to inform policy than do Republicans or independents. No other factor, such as education, income, or race, appears to explain that difference, he says.

Even so, the researchers believe that their findings might be useful to campaign strategists. “If you want to get Democrats on your side, you may want to use scientific research to back up your policy positions,” they write. “The self-expressed willingness of those on the Left to defer to scientists indicates that political arguments based on objective, scientific research might have a powerful influence on opinion. … They are also important for key elements of the Democratic coalition, such as blacks and Latinos.”

Reacting to dissonance

Another way to look at the interplay of politics and science is to examine how people react when faced with so-called dissonant scientific messages—information that doesn’t fit with their worldview. A trio of researchers at Ohio State University, Columbus, found that the public’s faith in science was weakened by such cognitive dissonance. The distrust occurred among both conservatives and liberals, but only on the most contentious topics.

The researchers—communications professors Erik Nisbet and R. Kelly Garrett and Kathryn Cooper, a graduate student—conducted an online survey of 1500 people. Participants thought they were evaluating the quality of a new science website. But that was a pretext for measuring their attitudes about information that would challenge their beliefs on certain issues. The survey included questions about climate change and evolution—red meat for self-identified conservatives—as well as fracking and nuclear power—topics expected to elicit opposition from liberals. They also read passages relating to the solar system and the earth sciences, two topics that the researchers deemed neutral.

As expected, the participants exhibited high levels of what social scientists call “motivated reasoning.” That is when we rebut or ignore new information on a topic—say, the safety of genetically modified foods—to protect what we already believe. The researchers also found that people reacted more negatively to scientific information that was seen as a threat to their values. The effect applied across the political spectrum, although conservatives reacted four times more strongly than did liberals.

Like Shaw and Blank, Nisbet found that “liberals are also capable of processing scientific information in a biased manner,” he noted in a press release. “They aren’t inherently superior to conservatives.” The Ohio State researchers also found that conflict, by itself, can cause people to lose trust in the scientific enterprise. “Just reading about these polarizing topics is having a negative effect on how people feel about science,” Garrett said in the press release.

Teaching evolution poorly

A third paper in the special issue examines the attitudes of students being trained to teach one of those polarizing topics—evolution—in the nation’s schools.

Previously, authors Eric Plutzer and Michael Berkman, political scientists at Pennsylvania State University (Penn State), University Park, had conducted research that found “a pervasive reluctance [among high school biology teachers] to forthrightly explain evolutionary biology.” Only 28% used evolution as a unifying theme in their classes, they reported in a 2011 Science article. On the other end of the spectrum, 13% included creationism or intelligent design in their lessons.

In the current study, Plutzer and Berkman sought to learn more about the beliefs of what they call “the cautious 60%, [teachers] who are neither strong advocates for evolutionary biology nor explicit endorsers of nonscientific alternatives.” So in 2013 they interviewed 35 students preparing to become high school biology teachers, hoping to find clues about how they would handle the subject once they entered the classroom. They selected the undergraduates from a diverse set of institutions in Pennsylvania—a large research university, a state university with a large teacher-training program, a Catholic college, and a historically black university.

What they heard troubled them. “We found that the depth of their scientific understanding is not what you’d think it would be,” Berkman explains. “Yes, they were science majors in science education programs, but they weren’t becoming science teachers because they loved science.” And they were “not the ones who were taking apart washing machines or launching rockets when they were kids,” Plutzer adds. “They are not driven to become scientists.”

That’s a concern, the researchers posit, because teachers who consider themselves educators first are likely to handle potentially hot topics like evolution very differently than those who consider themselves scientists. “Rather than cite facts and discuss the content, most of the students felt they could rely on classroom management and pedagogical techniques if a problem arose,” Berkman says. That approach masks a larger issue, he adds: “Not feeling confident about your knowledge of evolution leads to being less likely to teach it.”

The researchers said they were initially surprised to find that students at the Catholic college were more comfortable discussing the topic than were their peers at secular institutions. Their explanation: Rather than shying away from the subject, the students “probably had been wrestling with the issue their entire lives,” Berkman says. “They seemed to do a better job of reconciling their beliefs with what they had learned about evolution.”

In contrast, they say, students at secular institutions are unlikely to have had the opportunity to explore their personal views in a science or education class. “You’re not going to get a Penn State professor to talk about that with their students,” Berkman surmises.

The researchers admit their sample is not representative of all science teacher–training programs. But they think the responses are still instructive—and highlight how much work needs to be done. “Young preservice teachers are already on a path that is likely to lead to evolution instruction that falls short of the expectations of leading scientific organizations,” they conclude.

What should be done to change that direction? One paradoxical answer is breaking what the authors call “a cycle of ignorance” by improving biology teaching at the high school and undergraduate levels. “Many students lack good models for teaching evolution in public schools” because they didn’t get good instruction on the topic, the researchers write.

Future teachers also need a better grounding in what the researchers call “the nature of scientific inquiry.” Few ever work in a research lab, they note—both because their schedules are packed with courses on content and pedagogy, and because many shy away from such “hands-on” experiences.

Faculty members in the sciences also need to understand that teacher trainees, in general, are different from typical undergraduate science majors, Plutzer says. “Future science teachers are not junior versions of themselves,” he says, “and getting them to understand evolution is not simply a matter of having them take more science courses.”

Half-full or half-empty?

Although the papers in the special issue, taken together, highlight the many obstacles that stand in the way of clear and helpful science communication, Suhay doesn’t think readers should conclude that the situation is hopeless. In fact, she sees rays of sun poking through the current political storms.

“I don’t know that we need to be as alarmed as some people are,” she says. Simply understanding that people all across the ideological spectrum sometimes have trouble incorporating scientific knowledge into their worldview could help reduce the finger-pointing, she says. “It allows us to see the world more clearly,” she says. “If you blame only Republicans [for ignoring science], it reduces the chance of striking a compromise.”

Such understanding is a necessary prelude to action, she adds. “It may help quell the anger on both sides. And understanding the motivations behind why people act in certain ways should also help us to be smarter in addressing the issue.”