What causes some people to mistrust science, and what can scientists do about it?

  • Researchers studied why some people disregard scientific data while formulating beliefs.
  • They identified four fundamental factors behind this rejection, along with strategies to address each one.
  • They concluded that “scientists should be poised to empathize” with the people they wish to reach in order to communicate their ideas more effectively.

According to a September 2021 poll, 61 percent of Americans identified COVID-19 as a major public health threat.

Another recent poll of Americans indicated a far larger increase in climate concern among Democrat-leaning respondents (27 percent) than among Republican-leaning respondents (6 percent).

Understanding why people ignore scientific data when developing opinions should help scientists and science communicators engage the public more effectively.

Researchers recently identified four main reasons why people may disregard scientific data when formulating beliefs, as well as measures to improve communication.

“The authors echo many of the critical ideas that science communication scholars and practitioners have long championed,” said Dr. Dietram A. Scheufele, distinguished professor at the University of Wisconsin-Madison who was not involved in the study.

“Perhaps most prominently: Communicate your messages in ways that respond to rather than mock things that are significant to the individuals you are attempting to reach,” he explained.

The study was published in the journal PNAS.

Framework

For the study, the researchers linked recent findings on anti-science sentiments to principles from research on attitudes, persuasion, social impact, social identity, and information acceptance versus rejection.

They discovered four elements that underpin the rejection of scientific evidence when forming judgments in this way:

  • the source of the scientific message — when sources of scientific information, such as scientists, are perceived as inexpert or untrustworthy
  • the recipient of the scientific message — when scientific information activates a person’s social identity as a member of a group that holds anti-science attitudes, or that has been underrepresented in science or exploited by scientific work
  • the scientific message itself — when scientific information contradicts a person’s preexisting beliefs, perceived interests, or sense of morality
  • a mismatch between the delivery of the message and the recipient’s epistemic style — when information is delivered in ways the recipient does not conceptually understand, or that do not address their need for closure.

Dr. Bastiaan Rutjens, assistant professor of Social Psychology at the University of Amsterdam, who was not involved in the study, told MNT, “[i]t is important to appreciate that anti-science beliefs are not some monolithic entity but rather diverse and […] reflect potentially very different attitude objects.”

“In some cases, scientific literacy is a more important antecedent, and thus the principle pertaining to thinking style may be more important,” he explained. “However, in other cases, political ideology plays a key role, and in yet other cases, religious or spiritual beliefs clash with scientific theories.”

Defying anti-science ideas

The researchers proposed several strategies to counter each of these four factors. To address the “source of the scientific message,” they suggested:

  • improving the perceived validity of scientists’ work
  • conveying warmth and prosocial goals in science communication and using accessible language
  • conveying that the source is not antagonistic by portraying both sides of the argument.

To address the “recipient of the scientific message,” they suggested establishing a common or superordinate identity when communicating science, and engaging and partnering with marginalized communities.

For “the scientific message itself,” the researchers recommended:

  • training in scientific reasoning
  • prebunking
  • strong arguments
  • self-affirmation
  • moral reframing
  • increasing the perceived naturalness and moral purity of scientific innovations.

Dr. Scott Morgan, associate professor of psychology at Drew University, not involved in the study, told MNT:

“The public may not always understand that science is a process of refining knowledge, and although errors happen, a scientist will update their beliefs in light of the best evidence. The public may come to believe that scientists ‘don’t know what they’re talking about’ when in fact, they are grappling with new, complex information and updating beliefs in light of new findings.”

They proposed “framing messages as approaching advantages for promotion-focused recipients, but avoiding losses for prevention-focused recipients” to address the “mismatch between delivery and recipients’ epistemic style.”

To communicate their ideas more effectively, the researchers concluded, “scientists should be poised to empathize” with the people they attempt to reach.

The study’s limitations

Dr. Scheufele went on to say that, while the study is well intentioned, it assumes that large segments of the population are “anti-science.” In his experience, “Americans trust science more than practically any other institution, other than the military,” he said.

“People can accurately report on what scientists regard as ‘settled facts,’ but they reach very different conclusions about how that matches with their political or religious convictions,” Dr. Scheufele remarked. “This is where the disconnects between the relatively naive sage-on-stage models of science communication […] and the realities of societal disputes surrounding science come from.”

He emphasized that, while scientific studies can provide statistical evidence for various outcomes — whether related to public health or the environment — they cannot tell people how they ought to act. This, he believes, is a political question “influenced but not resolved by science.”

Dr. Scheufele also noted that the public and policymakers may have different interests than scientists, leading them to different approaches and conclusions. “Those are the reality of democratic science policy-making, not individuals being anti-science,” he explained.

The issue of democracy

Dr. Scheufele co-authored an article last year advising scientists against attempting to repair “public diseases” and against seeking as much support for new findings as possible.

“[A]rtificial intelligence, brain organoids, and other disruptive breakthrough research, in my opinion, threaten what it is to be human. Blind social faith in science would be as democratically unacceptable in those circumstances as no trust at all.”

“A critical public engagement with and continual evaluation of science is necessary as we face challenging political, moral, and regulatory decisions in many of these emerging areas of science. Simply labeling anything that does not align with the scientific establishment’s inclinations as ‘anti-science’ is not only simplistic, but also intrinsically undemocratic,” he opined.

Nonetheless, he concurred with the present study’s authors, who stated that “those with more scientific literacy are just more sophisticated at strengthening their existing beliefs by cherry-picking concepts and material to defend their worldview.”

“Ironically, this diagnosis also characterizes what many scientists do when they decry popular anti-science sentiments: Their concerns may reflect their own worldviews rather than what public audiences are truly concerned about,” he concluded.