Rebuilding Public Trust in Science
A discussion of "Science Policy in the Spotlight"
As Kevin Finneran noted in “Science Policy in the Spotlight” (Issues, Fall 2023), “In the mid-1950s, 88% of Americans held a favorable attitude toward science.” But the story was even better back then. When the American National Election Study began asking about trust in government in 1948, about three-quarters of respondents said they trusted the federal government to do the right thing almost always or most of the time (a figure now under one-third and dropping, especially among Generation Z and millennials). Increasing public trust in science is important, but transforming new knowledge into societal impacts at scale will require much more. It will require meaningful public engagement and trust-building across the entire innovation cycle, from research and development to scale-up, commercialization, and successful adoption and use. Public trust in this system can break down at any point, as the COVID-19 pandemic, which robbed humanity of at least 20 million years of life globally, made painfully clear.
For over a decade, I had the opportunity to support dozens of focus groups and national surveys exploring public perceptions of scientific developments in areas such as nanotechnology, synthetic biology, cellular agriculture, and gene editing. Each of these exercises provided new insights and an appreciation for the often-maligned public mind. As the physicist Richard Feynman once noted, believing that “the average person is unintelligent is a very dangerous idea.”
These exercises found that, when confronted with the emergence of novel technologies, people were remarkably consistent in their concerns and demands. For instance, there was little support for halting scientific and technological progress, with participants noting, “Continue to go forward, but please be careful.” Being careful was often framed around three recurring themes.
First, there was a desire for increased transparency, from both government and businesses. Second, people often asked for more pre-market research and risk assessment. In other words, don’t test new technologies on us; unfortunately, that now seems to be the default business model for social media and generative artificial intelligence. People voiced valid concerns that long-term risks would be overlooked in the rush to move products into the marketplace, and there was confusion about who exactly was responsible for such assessments, if anybody. Finally, many echoed the need for independent, third-party verification of both the risks and the benefits of new technologies, driven by suspicion of industry self-regulation and declining trust in government oversight.
Taken as a whole, these public concerns sound reasonable, but addressing them remains a heavy lift. There is, unfortunately, very little “public” in the nation’s public policies, and we have entered an era in which distrust is the default mode. Given this state of affairs, one should welcome the recent recommendations proposed to the White House by the President’s Council of Advisors on Science and Technology: to “develop public policies that are informed by scientific understanding and community values [creating] a dialogue … with the American people.” The question is whether these efforts go far enough and can move fast enough to bend the trust curve back before the next pandemic, climate-related catastrophe, financial meltdown, geopolitical crisis, or arrival of artificial general intelligence.
Environmental Law Institute