Rebecca Rutstein and the Ocean Memory Project, "Blue Dreams" (2023), still from the 2-minute-and-40-second digital video.

How to Clean Up Our Information Environments

A DISCUSSION OF

Misunderstanding Misinformation

I was delighted to read “Misunderstanding Misinformation” (Issues, Spring 2023), in which Claire Wardle advocates moving from “atoms” of misinformation to narratives and laments our current, siloed empirical analysis. I couldn’t agree more. But I would also like to take Wardle’s thoughts further in two ways: with a call to action and a warning.

First, focusing on narratives requires understanding how they circulate in particular contexts, which some groups, including the US Surgeon General, refer to as “information environments.” My background is in medicine, where defining a toxin or poison is as difficult as defining misinformation: it all depends on context. Even water can be toxic—ask any marathon runner hospitalized after drinking too much of it. A public health approach focuses not on atoms or narratives alone, but on the context that makes them dangerous. Lead and chromium-6 (think Erin Brockovich) are usually of little concern unless ingested by the young. Eliminating either is a fool’s errand, so we mitigate their potential harms: we test for them, bottled water companies remove them, and we regulate them to minimize the riskiest exposures, such as by banning lead paint and leaded gasoline.

Taking a similar approach to misinformation means recognizing that any information can be toxic in the wrong context. Environmental protection requires monitoring, removal of toxins when necessary, and regulation to prevent the most egregious harms. The English physician John Snow cannot step in to fix things for us—as he did by removing a public water pump handle in his famous, albeit authoritarian, solution to London’s 1854 cholera epidemic—so we must recognize that cleaning up our information environments is a question not of “if” but of “how.” Keeping tabs on revenge porn, hate speech, and child sexual abuse material makes sense, but where and how to draw the line in borderline cases, such as the narratives Wardle speaks of, is far less clear.

That leads to my second point, which echoes Wardle’s suggestion that we need more research, with fewer silos, to better understand the conditions, contexts, and communities in which information becomes most toxic. But I’m concerned that her call for more holistic research will go unheeded, because environments are challenging to study. For example, more than 50 years ago, the social scientist William McGuire developed a “matrix” of communication variables related to persuasion, listing on one side all the ways a message could be constructed and delivered, and on the other the spectrum of potential impacts, from catching someone’s attention to changing their behavior. Research was needed in every cell to fully understand persuasion. Today, most research fits into only a small part of that matrix: assessing the effect of small changes in message characteristics on beliefs.

Such studies are quick and cheap to do, test important theories, and are publishable. To be clear, the problem is not the research, but the sheer amount of it relative to the deep, structural research that is sorely needed. Nick Chater and George Loewenstein lament this disparity in a recent article in Behavioral and Brain Sciences, observing how individually framed research has deflected attention and support away from systemic policy changes. Writing in The BMJ, published by the British Medical Association, Nason Maani and colleagues go further, arguing that discourse today is disproportionately “polluted” as discussion of individual solutions crowds out discussion of more needed—but more difficult to study—structural solutions.

In short, why have we made so little progress? Because social scientists have been, and continue to be, incentivized to study certain types of questions, leaving McGuire’s matrix embarrassingly unfilled. A 2011 Knight Foundation report argued that we should be assessing community information needs, a perspective, I would argue, that is consistent with understanding communities’ information environments. A 2017 National Academies consensus study setting a research agenda for the science of science communication likewise called for a systems-based approach to understanding the interrelationships among the key elements of science communication: communicators, messages, audiences, and channels, to name a few. Yet years—and a global pandemic—later, individually framed research on misinformation dominates the discourse.

These incentives are myriad and deeply entrenched in academia and funding agencies. In his 2016 New Atlantis essay, “Saving Science,” Daniel Sarewitz characterized this as a post-World War II problem, with “institutional cultures organized and incentivized around pursuing more knowledge, not solving problems.” The result, he argued, is that “science isn’t self-correcting, it’s self-destructing.” This raises a scary prospect: solving the problem of misinformation may require fixing, or at least circumventing, a sclerotic system of science that continues to reproduce the same methodological biases decade after decade. Otherwise, I cringe with macabre anticipation at the thought of reading the recommendations on improving science communication that will follow the 2031 H5N1 influenza pandemic. I imagine they will add to the chorus of unheeded calls for more holistic, context-informed studies.

Assistant Professor of Medicine, Weill Cornell Medicine

Chief Medical Officer, Critica

My “ah-ha” disinformation moment came in July 2014, in the hours and days following the crash of Malaysia Airlines Flight 17 in eastern Ukraine. I was a senior official in the federal agency that oversees US international broadcasting, including Voice of America and Radio Free Europe/Radio Liberty, and I was closely monitoring media reports of the tragedy. Credible reporting quickly emerged that the airliner had been shot down by Russian-controlled forces. Yet within hours the global information space was muddled by at least a dozen alternate narratives: that the Ukrainian military, using either a jet fighter or a surface-to-air missile, had downed the aircraft; that the bodies recovered at the crash scene had been corpses before the flight ever took off, loaded onto an otherwise empty passenger jet that was remotely piloted and shot down by Western forces, who then blamed Russia; and even that the intended Ukrainian target had been the aircraft of Russian President Vladimir Putin, who was returning at the time from a foreign trip. In this messy, multi-language information scrum, Russia’s likely culpability was just one more version of what might have happened.

It soon became clear that false and misleading information online was seeping into the marketplace of ideas at a quickening rate, eroding public discourse around the world and carving fissures throughout societies. The mantra of US international broadcasting throughout the Cold War—that, over time, the truth would win in the competition of ideas—was under unprecedented pressure as the disinformation tsunami gathered momentum. We knew we had a problem. But what was to be done?

Enter Claire Wardle and other clear-thinking academics and analysts, who initially helped clarify the parameters of “information disorder,” in part by dismissing the woefully inadequate term fake news and introducing the more precise terms misinformation, disinformation, and malinformation.

In her essay, Wardle again comes to the rescue, with fresh analysis and recommendations that include moving beyond even the improved trifecta of information terminology, which she calls “an overly simple, tidy framework I no longer find useful.” “We researchers,” she chides, “[have] become so obsessed with labeling the dots that we can’t see the larger pattern they show.” Instead, researchers should “focus on narratives and why people share what they do.”

What’s needed, Wardle argues, is a better understanding of the “social contexts of this information, how it fits into narratives and identities, and its short-term impacts and long-term harms.”

This is music to the ears of this media professional. Responsible, professional journalists have already responded to the disinformation challenge through actions such as quick-response fact-checking operations and innovative investigations that expose disinformation sources. I believe they can do more, including the sort of investigative digging that connects players and actions across the disinformation space, which Wardle calls for. Journalists, for instance, can act on her recommendation to enhance genuine, two-way communication among publics, experts, and officials; they can support community-led resilience; and they can take part in targeted “cradle to grave” educational campaigns to “help people learn to navigate polluted information systems.”

Why pursue this course? As a career journalist and therefore a fervent defender of the First Amendment, I am wary of any moves, no matter how well-intentioned, to restrict speech. Top-down solutions such as legislating speech are but a short step down the slippery slope toward censorship. The ultimate defense against disinformation, therefore, must come from us, acting as individuals and together—in short, from a well-educated, informed, engaged public. To paraphrase Smokey Bear: Only YOU can prevent disinformation. And the better we understand and implement the prescriptions that Wardle and others (including the authors of the other insightful offerings in Issues under the “Navigating a Polluted Information Ecosystem” rubric) continue to pursue, the better our prospects for limiting—and perhaps even preventing—wildfires of disinformation.

International Journalist, Editor, and Media Manager

Former Lecturer in Communication and Political Science at Ohio State University

Cite this Article

“How to Clean Up Our Information Environments.” Issues in Science and Technology 39, no. 4 (Summer 2023).