Who’s Afraid to Share Science in Their Listserv?

An interview with Democratic pollsters on the chicken-and-egg challenges of keeping science in neutral political territory.

To learn more about the forces influencing Americans’ trust in science, Issues spoke with Celinda Lake and Emily Garner of Lake Research Partners, a progressive polling and research firm engaged in understanding how American voters’ confidence in institutions is shaping the political views and values of the electorate.

There’s an increasing divide between Democrats and Republicans when they are asked about their trust in science. As opinion researchers, what do trust and science mean in the context of these surveys?

“A huge dimension within trust is the role of government, which is coupled with trust in science. If you believe in more of a role for government, then you are more trusting of scientists, or vice versa.”

Lake: In general, we let people define scientists and trust in their own mind. We did an interesting study on the dimensions of trust, and a huge dimension within trust is the role of government, which is coupled with trust in science. If you believe in more of a role for government, then you are more trusting of scientists, or vice versa.

Another dimension is efficacy. Do you think things can be accomplished? Do you think things will work? And the reason it’s important to have this trust variable is we often find that opposition is not that important a variable. It’s more lack of confidence—that people are so distrusting of systems and institutions right now. That’s the bigger variable.

A third dimension of trust is fairness. And then there’s anomie, a kind of violent disorientation. So those are the four dimensions of trust.

The pandemic was a major hit on confidence and trust in scientists, and it made scientists seem more political and polarized. The collapse of trust among Republicans was so steep that it has gotten a lot of notice. But it should also be noted that Democrats’ trust didn’t increase; it actually decreased slightly too.

There’s always been this sense that voters think that scientists are very honest, but they also think of them as changing their opinion all the time. First they tell you don’t eat eggs, then they tell you to eat eggs, then they tell you only eat the egg whites, then they tell you everything in the egg is fine, then you can’t buy eggs anyway so it doesn’t matter what they say—there’s a sense of arbitrariness. People look to facts and science to provide stability and reassurance. People overwhelmingly believe that information should be corrected if the science changes, but it unnerves them when that happens.

The context surrounding all of this is that trust in every institution is down. But the top two institutions for trust right now are still medical scientists and then the military. And one of the problems that scientists have is that their information is passed on by politicians and journalists—and voters have very little confidence in either one.

“The context surrounding all of this is that trust in every institution is down.”

Garner: Institutions like the Pew Research Center ask variations of questions that measure trust as confidence in scientists to act in the public’s best interests, and those numbers are all quite high. But you get some more concerning numbers when you ask people whether they believe that scientists have bias or are able to overcome their own bias. People believe that scientists are trustworthy and mean to do well by the public, but they do not believe with nearly as much confidence that scientists are capable of evaluating their own biases. And that connects to how people feel about changes to scientific guidance, and why you might not always be able to trust scientists even when you think scientists are a trustworthy group of people.

What did Democrats mean when they had those signs that said, “In this house we believe science is real”?

Lake: Well, if you think about it, the scientists are agreeing with a lot of the positions that Democrats have. Part of this is because Democrats are now more college-educated and Republicans less so. And college-educated people are a lot more deferential to facts, while non-college-educated people tend to think that facts are a lot more changeable. So Democrats with the yard signs intend to say they believe in expertise. 

But whether you’re talking about vaccines, climate change, or food safety, there are two things that have happened. Science in the public discourse is usually being used in support of the role of government: We’re going to have food safety officers; we’re going to have public health officers; we’re going to require vaccines; we’re going to ban fossil fuels to fight climate change, etc. It’s kind of a chicken-and-egg question because science has buttressed the ideological positions of a lot of Democrats and buttressed a role for government. There is this mixing of support for the role of government, deference for expertise, and a sense of supporting the positions they already hold.

“The recent Pew numbers show that what was lost was an immense amount of trust among Republicans. Scientists became a vehicle for pushing policy that Democrats agreed with, which was a dynamic that already existed.”

Garner: When you zero in specifically on these lawn signs that say, “In this house we believe science is real,” those are almost frozen in time. People still have them, but they’re absolutely from 2020. And we can see that reflected in data. The recent Pew numbers show that what was lost was an immense amount of trust among Republicans. Scientists became a vehicle for pushing policy that Democrats agreed with, which was a dynamic that already existed. In that brief moment, Democrats’ strong trust in scientists and science shot up.

But it actually went back down again fairly quickly within a year or two. I think a lot of that has to do with the fact that there was a very clean and simple narrative at the beginning of the pandemic that scientists were on the side of these COVID policies that Democrats agreed with. And then things got murkier as the scientific guidance changed. It wasn’t necessarily one unified coalition anymore. There were Democrats who were upset when the Centers for Disease Control and Prevention rolled back some of their recommendations.

So we saw this immense collapse in trust among Republicans because of the way that science and scientists were used as messengers for a policy agenda that became very polarized. And that did not recover. We saw a spike among Democrats in this area of strong trust specifically, but it didn’t last. The impact of this was negative on trust in science overall, because you didn’t even keep those gains among Democrats.

Lake: And I think this could be a very treacherous time. What happens when the administration pulls out scientists who are launching a major study on autism and vaccines, implying that they’re going to find that autism is increased by vaccines? For Democrats, now what? Do we trust the scientists or not? I think it’s going to depend on which scientists and on what facts. It’s a very precarious time.

How is science involved in the way people are forming their political identities? 

Lake: Today science itself is seen as polarizing. It used to be that for women—moms in particular—science was in the category of “news you can use,” and a lot of moms would pass science on in their listservs, like, “This is what my pediatrician said.” But now women don’t want to divide their personal networks. Now science is seen as controversial, so people don’t pass on the science the way they used to. That’s really problematic.

“Now science is seen as controversial, so people don’t pass on the science the way they used to. That’s really problematic.”

I don’t think science was originally part of a Democratic identity. It may be for younger voters now, but traditionally people have acquired their party identity quite young—it’s highly correlated with their family’s, and they tend to keep it their whole lives. Now I’m wondering if science will become more of an identity marker for young people, driving them into different silos.

Garner: There are two broad points to make that are kind of contradictory. The first one is how polarized science is becoming and how science is now representative of government institutions. Trust in government institutions is highly predictive of a liberal political worldview. But at the same time, there’s still a bipartisan majority in this country that strongly trusts science and scientists—and, for instance, vaccines. Since there’s so much change happening, it’s hard not to interpret that change as a schism—but it’s a schism in an area where there were very high levels of trust to begin with.

So you’re losing some trust in certain communities, which is a big deal; but at the same time, there is still bipartisan trust in science. And we’ve also seen, in some research from the American Psychological Association on how political identity is formed, that there’s not as much partisan disagreement on contentious scientific issues as you might expect given what’s been happening.

We’re looking at a partisan divergence and an education divergence. And that points toward the role of trusted messengers in science education as an important way to stop the bleeding. But it’s just a lot more difficult to do that than it is to say it.

You’ve done more than a decade of research on climate change and vaccines. What trends do you see for the future?

Garner: What we would expect is that Democrats would continue to have high levels of trust, unless there is some other kind of COVID-like realignment on a scientific issue—which could always happen without warning. So, barring any sort of major change, you should expect to see polarization continue.

“This political intensification is hard to avoid, but we’re seeing it again with new issues. This is a pattern that doesn’t recede. It has to be addressed directly.”

And that’s a bad thing, because we’ve seen a similar trend with polarization happen in the past. Around 2008, views on climate change became more politically polarized too: Is climate change happening at all? Is it human-caused? Is it something I should be concerned about? You started losing Republicans, especially strong Republicans, around the time that scientists became associated with policies that Democrats wanted on that issue. This political intensification is hard to avoid, but we’re seeing it again with new issues. This is a pattern that doesn’t recede. It has to be addressed directly, and it gets harder to do that. It’s hard to put it back in the bag.

Lake: For a long time, we’ve seen that trust in scientists in the climate change sphere was mixed, but the meteorologists—the weather people—were really trusted. Similarly on vaccines: doctors, pediatricians, and nurses are the most trusted sources of information. Your doctor delivering the science is much more credible than a scientist you don’t have a personal connection to. These figures are also seen as less political. People don’t think their doctor is taking a political view. People don’t think their weatherman is taking a political view. And depoliticization of science is really critical; science is not well served by being involved in a political fight.

Another thing that I think overlaps with this question is that scientists are perceived as demanding that everyone must agree with them about the facts. Linguist and philosopher George Lakoff’s basic premise was that the frame matters more than the facts. And scientists will go in trying to control the facts. They should be trying to set the frame. When you’re answering, you’re losing. When you’re positing your frame and putting something positive forward, you’re winning.

What are your thoughts about what the science enterprise could do to build greater trust?

Lake: The first thing we need to do is depoliticize this conversation. Conversations around science have not been successful when science is associated with COVID, which is already very polarizing. But support for government science is more successful when it is connected with messages from disease groups working toward cures for cancer or Alzheimer’s, for example.

Likewise, science discussed in connection to avian flu has been less polarized. People don’t think Democrats like eggs and Republicans don’t; people think everybody likes eggs. No single institution or chicken farmer or even Perdue has enough wherewithal to deal with avian flu alone; the situation justifies a role for government. So picking some things that are not inherently polarizing, and then having nonpolarizing messengers, like farmers, is very important. If anything, people think chicken farmers are Republicans, not Democrats.

Garner: If you had looked at the polling on trusted information sources before COVID, it made a ton of sense to put your biggest scientist up there on TV and have him validate your policy. And ultimately what that achieved was polarization. It didn’t really convince anybody; it just polarized people on the entire issue of science. Now we can learn from that.

The main deviation we see, where people don’t trust science at these very high levels, has to do with concerns about bias. They are concerned that scientists are biased and don’t know their own bias. And that ties back to incentives: people are concerned about perverse incentives in science.

“Everything is better locally right now. People are so sick of the polarization nationally.”

People want to hear from scientists. It’s not even really a concern on a large scale that scientists are malicious. People want information from a range of sources, and they want to see that data. But at the same time, it can’t be too complicated. So there are a lot of contradictory hurdles to overcome here. You want to be putting scientists and science-adjacent people like doctors in front of people, but you also don’t want them to be perceived as speaking on policy in the sense that they’re creating it. You want them to be sharing information in a way that’s perceived as neutral and in the best interest of the public. And the challenge is figuring out how to do that.

Lake: Everything is better locally right now. People are so sick of the polarization nationally. Besides working on transparency, another area is values. One thing that scientists don’t do enough is establish that they share the values of the public. In their attempt for neutrality, they often don’t establish well enough that their goal is a shared value with the public.

Garner: And not to open a whole other can of worms, but I think there are some genuine problems that the scientific community is dealing with in terms of replication issues and inconclusive studies that might not get published. The public has a sense of that, and that’s difficult because those are real problems, and they’re not easily solved, but that creates distrust. But what do you do about that? It’s not even a messaging problem yet, because it’s a bigger problem in the scientific community.

Given that science is starting to be seen as polarizing, is it wise to frame pushing back against the new administration’s agenda as “standing up for science” and defending science “under attack”?

Lake: This is a great question, but we haven’t done the research yet. I think there needs to be a project that looks at how science can break through the polarization. Who are the right spokespeople? What are the salient facts that really convince people? How can science be part of conversations on increasingly polarized subjects while maintaining its most credible perspective?

