For many Americans, the wave of disinformation and defiance of medical authority that arrived with the coronavirus was their first face-to-face encounter with the high-stakes consequences of science denialism. Within months of the outbreak, the global scientific community reached a consensus on how the virus spread and acted on the body. Evidence-driven policies like mask mandates and social-distancing protocols soon followed. Although the devastating health effects of the virus were impossible to ignore, large swaths of the population and their elected leaders still rejected expert medical advice. Now, as the United States gradually recovers from a year of death and disease, Americans must grapple with a troubling question: why did so many of their fellow citizens reject not just the scientific evidence, but the evidence of their own senses, when so many lives were at stake?
For Lea professor of the history of science Naomi Oreskes, the rampant science denialism during the pandemic followed a familiar pattern. She has spent her career detailing the methods that powerful industries, such as tobacco and fossil-fuel producers, use to systematically undermine scientific evidence, while simultaneously exploring the ways that individuals become science deniers. The pandemic exposed the extent to which science denialism has woven itself into the fabric of American society. But the way Oreskes sees it, the extraordinary circumstances of the past year merely emphasized an everyday reality.
“I think people became very aware during the pandemic how egregious science denial is because it was a question of life and death,” she says. “But I think a lot of people still don’t fully understand how broad and persistent it is.”
Oreskes rejects the common notion that what happened during the pandemic is indicative of a crisis of trust in science. She cites two recent studies from the American Association for the Advancement of Science and Pew Research Center that show public attitudes about science have barely changed at all over the past 50 years: “Our perception that there’s a crisis in science is not supported by the data.”
Instead, the data reveal a persistent divide between the two-thirds of Americans who trust science and those who do not. Overcoming that distrust requires understanding its cause, which Oreskes traces to a perception of science as incompatible with cherished beliefs about God, family, or country. In some cases, people who trust science as a whole may still reject it on specific topics such as climate change or evolutionary theory. She refers to this selectivity as “implicatory denial.”
“What we see here is that the rejection of science in these areas is not because the science is too uncertain or otherwise inadequate, it’s because the science implies something these people don’t want to accept as true,” she said during a recent presentation for the Harvard Alumni Association. Although scientists often believe they can win over skeptics by doing more science and persuading them with the strength of evidence, Oreskes says this is a misguided approach: “These people don’t reject science because they don’t have enough facts. They reject science because they think it clashes with their values or ideology.”
Oreskes points to a growing body of research that shows that the best way to bridge the gulf with science deniers is to engage their values. This may involve finding common ground between the scientists’ values and the skeptics’. For example, a biologist and a climate-change denier can both cherish biodiversity even though one attributes it to natural selection and the other to divine creation. By acknowledging these shared values, they may agree to take action on climate change because it threatens something they both hold dear.
In other cases, researchers can make the case that science isn’t the enemy of others’ value systems. For instance, evolutionary biologists can engage devout science deniers by explaining that science can’t disprove the existence of God—that is a theological question beyond the purview of science—or by pointing them to the legions of scientists who are also believers. “When you take their value concerns seriously instead of just saying these people are hopeless,” Oreskes says, “we have evidence that you can make progress.”
A major challenge to this kind of mutual understanding is that most scientists are unaccustomed to acknowledging personal values and belief systems in discussions about their work. Science is supposed to be objective: let the data speak for themselves; what the scientist believes is irrelevant. But this description of science, Oreskes says, is not only inaccurate but also squanders a potent opportunity to engage with deniers. Scientists are not valueless automatons, and their research is deeply influenced by the things they care about. Although they are still expected to be objective and honest, that doesn’t mean they can’t present a scientific finding in a way that creates an emotional connection with people who might otherwise reject it.
Oreskes doesn’t expect all scientists to put down their pipettes and microscopes to engage the skeptics and science deniers on critical issues. But she does think scientists as a group need to seriously reconsider the role that human values play in their work—and use that insight to guide the way they interact with the public. “This whole notion that we need to set aside our emotions and just look at the facts flies in the face of evidence,” she says. “Nobody does that, because the emotional and the rational are connected. So rather than being afraid of our emotions, we should embrace them. That’s how we can reach people on these issues.”