Why Don't Americans Trust Experts Anymore?

September 18, 2020 | Topic: Science | Blog Brand: The Reboot | Tags: Science, Trust, Experts, Skepticism, Media



As the race for a COVID-19 vaccine heats up, polls suggest that many Americans would refuse to be vaccinated. That’s a major concern for policymakers who hope to stop a pandemic. It’s tempting to dismiss those who would refuse a vaccine as ignorant or “anti-science” — but that approach does little to change hearts and minds.

Vaccine hesitancy — and other “anti-science” viewpoints like climate change skepticism — reflect a fundamental lack of trust in experts. To see the role that trust plays, consider the claim that the earth revolves around the sun. Most of us believe this claim, but even those trained in science have trouble explaining how we know it’s true. We believe it because someone we trusted — an expert — told us so.


Yet trust in experts has been wavering. And a major culprit is the way science is communicated: While scientific journal articles are typically nuanced and cautious in their claims, the sound bites and memes that the general public consumes ignore that nuance and uncertainty.

First, distrust of science is understandable when scientific findings are reported as incontrovertible facts, only to be reversed. In recent months, the limited scientific evidence suggesting that hydroxychloroquine might treat COVID-19 was abruptly contradicted by a study showing that the drug actually harmed patients. That study was then retracted, only to be followed by another study showing that the drug had little effect at all.

This process occurs even when science isn’t proceeding at breakneck speed. Large-scale attempts to replicate hundreds of the most influential research papers have failed to reproduce the majority of the original results. A review of the research shows that a variety of foods — including wine, coffee, and eggs — could either cause or prevent cancer, depending on which study you believe.

The fact that scientific findings are inconsistent and keep changing doesn’t mean science is fundamentally flawed; it just means science is messy and uncertain. But failing to convey the degree of uncertainty around scientific claims makes it harder for the public to identify newer, less certain claims — which are likely to be reversed — and undermines more established findings like the safety of vaccines.

Second, scientific advice is sometimes presented without nuance or acknowledgment that individual circumstances or assessments can differ. A claim about the COVID-19 deaths that could be prevented by a lockdown may be grounded in science, but people could legitimately come to different conclusions about how and when a lockdown should be implemented based on other values, such as economic security and human connection. Similarly, science has established that man-made global warming is occurring. But science can’t tell us how to weigh its impact against the pain created by measures designed to stop it. And science can warn us of the dangers of drinking wine or coffee while pregnant, but women may come to different conclusions about how to balance those risks.

This lack of nuance is a problem because when people feel their concerns are not being heard, they may fight back by casting doubts on the more established parts of science. For example, those who deny global warming may be concerned that policies undertaken to stop it will harm vulnerable groups. Similarly, questioning the lethality of COVID-19 may also be rooted in concerns about the economic, social, and mental health costs of lockdowns — which fairly absolute advice about staying home may appear to ignore.

To restore trust, headline writers, tweeters, reporters, and scientists need to ensure that the messages they amplify accurately convey uncertainty and nuance. They need to make clear, for example, that a new study showing the health benefits of coffee should carry less weight than a large body of research establishing the safety of vaccines. And while science can quantify the risks from COVID-19, we must look to our values and ethical principles to decide whether reopening the economy or attending a gathering is worth the risk.

In the longer term, improving diversity in science could help. It’s hard to trust experts when we don’t think those experts share or understand our life experiences or worldview. Recent efforts focused on improving racial and gender diversity are a start, and much more needs to be done. But diversity of worldview matters too and is often overlooked. For example, Christians are underrepresented in science, and evangelical Christians may even face discrimination. Scientific claims that are backed by a more diverse group of scientists will likely carry more credibility across a diverse population.

The ultimate solution would be for the general public to demand more nuance and greater complexity in the news. If people required this type of reporting — instead of sound bites and memes — the media would be eager to provide it. This could be a small step each of us can take towards restoring trust.

Ben Ho is an associate professor of behavioral economics at Vassar College. Sita Nataraj Slavov is a professor at the Schar School of Policy and Government at George Mason University and a visiting scholar at the American Enterprise Institute.

This article was first published by the American Enterprise Institute.

Image: Reuters