Rebecca Rutstein and the Ocean Memory Project, "Blue Dreams" (2023), still from the 2-minute-and-40-second digital video.

Science and Global Conspiracies

What difference does the internet make to conspiracy theories? The most sobering aspect of “How Science Gets Drawn Into Global Conspiracy Narratives” (Issues, Spring 2023), by Marc Tuters, Tom Willaert, and Trisha Meyer, is its focus on the seismic power of the hashtag in choreographing new conspiratorial alliances between diverse sets of Twitter users. Importantly, this is also happening on other platforms such as Instagram and TikTok. During a recent data sprint project that I cofacilitated at the University of Amsterdam, it became increasingly apparent that TikTok hashtags resemble an apophenic grammar—a sort of language that tends to foster perceptions of meaningful connections between seemingly unrelated things. Specifically, the data sprint findings suggest that co-hashtags were partly responsible for the convergence of conspiracy theory and spirituality content over the course of 2021 (to conjure what is often termed conspirituality).
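
As a minimal sketch of the co-hashtag idea (the posts and hashtags below are invented for illustration and do not reproduce the data sprint's actual pipeline), pairwise co-occurrence counts can be computed along these lines:

```python
from collections import Counter
from itertools import combinations

# Invented sample posts: each set holds the hashtags attached to one TikTok video.
posts = [
    {"#greatawakening", "#starseed", "#lightworker"},
    {"#greatawakening", "#plandemic", "#wakeup"},
    {"#starseed", "#lightworker", "#wakeup"},
]

# Count how often each pair of hashtags appears on the same post.
co_occurrence = Counter()
for tags in posts:
    for pair in combinations(sorted(tags), 2):
        co_occurrence[pair] += 1

# Frequently co-occurring pairs hint at where spiritual and conspiratorial
# vocabularies are converging ("conspirituality").
for pair, count in co_occurrence.most_common(5):
    print(pair, count)
```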

But notwithstanding the significance of hashtag stuffing, the other major idea that figures prominently in current conspiracy theory research is weaponization. In our present political moment, seemingly innocuous events can become weaponized as conspiratorial dog whistles, as can celebrity gossip and fan fiction. On July 4, 2020, the Florida congressional candidate K. W. Miller claimed in a tweet that the popular music icon Beyoncé is actually a white Italian woman called Ann Marie Lastrassi. Miller seems to have “discovered the truth” about Beyoncé via a speculative Instagram comment and a parodic Twitter thread, and his #QAnon clarion call was a deliberate misappropriation of Black speculative discourse with white supremacist overtones. He was building upon both the 2016 denunciation of Beyoncé by the InfoWars conspiracy theorist Alex Jones and longstanding hip-hop rumors regarding Beyoncé and Jay-Z’s involvement with the Illuminati, an imagined organization often portrayed as pulling global strings of power.

Could derisive disarmament by counter-conspiratorial web users be an effective way of laughing in the face of such weaponization tactics? Through a distinctive kind of Black laughter discussed by Zari Taylor and other social media scholars, some Twitter users attempted to extinguish the potential for hilarity in Miller’s tweet by asserting that the joke was actually on white conspiracy theorists willing to believe that Beyoncé is Italian while denying the very real and palpable existence of systemic racism in the United States. Ultimately, Miller’s election campaign was wholly unsuccessful, and the collective disarmament effort seems to have been relatively effective, both within and beyond the formation of Black Twitter discussed by Sanjay Sharma, a scholar of race and racism. Several years on from the event, Black Twitter users memorialize and celebrate Ann Marie Lastrassi as their “Italian Queen,” in much the same way that Beyoncé reclaimed the racist #BoycottBeyoncé hashtag in 2016.

The internet’s contribution to the spread of conspiracy theories has less to do with “echo chambers” and algorithmically dug “rabbit holes” and much more to do with the perverse echo effects of connected voices. These voices listen to each other in order to find something that they might seize upon to deliver virtuosic conspiratorial performances. Although researchers, monitoring organizations, and policymakers might learn something from the echo effects of the Lastrassi case, it is also true, as my colleague Annie Kelly regularly reminds me, that disarmament is sometimes nothing more than re-weaponization. It might even serve to fan the flames of conspiracy theory in the age of the so-called culture wars.

Postdoctoral Research Associate, University of Manchester

Lecturer I in Music, Magdalen College, University of Oxford

Marc Tuters, Tom Willaert, and Trisha Meyer explore the emergence of a distinctive feature of conspiracy theories: interconnectedness. The authors focused on a Twitter case study of the public understanding of science, and specifically on the use of the hashtag #mRNA. They found that the hashtag, initially used in science discussions (beginning in 2020), was later hijacked by conspiracy narratives (in late 2022) and became interconnected with conspiratorial hashtags such as #greatreset, #plandemic, and #covid1984.
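
One rough way to observe such a shift, sketched here with hypothetical tweet records rather than the authors' dataset or methods, is to group posts containing #mRNA by period and list the hashtags that most often accompany it:

```python
from collections import Counter, defaultdict

# Hypothetical tweet records: (year, set of hashtags in the tweet).
tweets = [
    (2020, {"#mrna", "#vaccine", "#science"}),
    (2020, {"#mrna", "#biotech"}),
    (2022, {"#mrna", "#greatreset", "#plandemic"}),
    (2022, {"#mrna", "#covid1984"}),
]

# For each year, count which hashtags co-occur with #mrna.
companions_by_year = defaultdict(Counter)
for year, tags in tweets:
    if "#mrna" in tags:
        companions_by_year[year].update(tags - {"#mrna"})

# A drift from science-related to conspiratorial companions over time
# is the kind of "hijacking" pattern the article describes.
for year in sorted(companions_by_year):
    print(year, companions_by_year[year].most_common(3))
```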

In a recent paper, one of us quantified such interconnectedness in the largest corpus of conspiracy theories available today, an 88-million-word collection called LOCO. On average, conspiracy documents (compared with non-conspiracy documents) showed higher interconnectedness spanning multiple thematic domains (e.g., Michael Jackson associated with the moon landing). We also found that conspiracy documents were similar to one another, suggesting the mutually supportive function of denouncing an alleged conspiratorial plan. These results extend Tuters and colleagues’ research and show that interconnectedness is not bound only to scientific understanding but is a cognitive mechanism of sensemaking.
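
As a hedged illustration of how cross-document similarity might be computed (a generic TF-IDF approach with toy texts, not the specific measure used in the LOCO study):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy stand-ins for conspiracy and non-conspiracy documents.
docs = [
    "the moon landing was staged and the footage was faked",
    "michael jackson faked his death to expose the staged moon landing",
    "the apollo program returned lunar rock samples for scientific analysis",
]

# Represent each document as a TF-IDF vector and compare all pairs.
vectors = TfidfVectorizer().fit_transform(docs)
similarity = cosine_similarity(vectors)

# Higher off-diagonal values mean two documents share more vocabulary; the
# LOCO finding is that conspiracy documents resemble one another more than
# non-conspiracy documents do.
print(similarity.round(2))
```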

Conspiracy theories simplify real-world complexity into a cause-effect chain that identifies a culprit. In doing so, conspiracy theories are thought to reduce the uncertainty and anxiety caused by existential threats. Because people who subscribe to conspiracy theories do not trust official narratives, they search for hidden motives, consider alternative scenarios, and explore their environment in search of (what they expect to be the) truth. In this process, prompted by the need to confirm their beliefs, conspiracy believers tend to jump quickly to conclusions and identify meaningful relationships among randomly co-occurring events, leading to the “everything is connected” bias.

As Tuters and colleagues suggest, social media might offer affordances for such exploration, thus facilitating the spread of conspiracy theories. The authors also suggest that not all social media platforms are equal in this regard: some might ease this process more than others. Work currently in progress has confirmed this suggestion: we have indeed found striking differences between platforms. From a set of about 2,600 websites, we extracted measures of incoming traffic from different social media platforms such as Twitter, YouTube, Reddit, and Facebook. We found that YouTube and Facebook are the main drivers of traffic to conspiracy (e.g., infowars.com) and right-wing (e.g., foxnews.com) websites, whereas Reddit drives traffic mainly toward pro-science (e.g., healthline.com) and left-wing (e.g., msnbc.com) websites. Twitter drives traffic to both left- and right-leaning politically biased (but not conspiracy) websites.
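
A minimal sketch of the kind of aggregation involved, using invented traffic shares rather than the study's actual measurements, might average each platform's share of incoming traffic within each website category:

```python
import pandas as pd

# Hypothetical rows: one per website, with the share of its incoming
# social media traffic attributed to each platform (values invented).
data = pd.DataFrame(
    {
        "site": ["infowars.com", "foxnews.com", "healthline.com", "msnbc.com"],
        "category": ["conspiracy", "right-wing", "pro-science", "left-wing"],
        "facebook": [0.45, 0.40, 0.15, 0.20],
        "youtube": [0.35, 0.30, 0.10, 0.15],
        "reddit": [0.05, 0.10, 0.50, 0.40],
        "twitter": [0.15, 0.20, 0.25, 0.25],
    }
)

# Average each platform's traffic share within each website category.
shares = data.groupby("category")[["facebook", "youtube", "reddit", "twitter"]].mean()
print(shares)
```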

Do structural differences across social media platforms affect how conspiracy theories are generated? More experimental work is needed to understand the mechanisms by which conspiracy theories are generated by accumulation. Social psychology has furthered our understanding of the cognitive predisposition for such beliefs. Now, building on Tuters and colleagues’ work, it is time for the cognitive, social, and computational sciences to systematically investigate the emergence of conspiracy theories.

Postdoctoral Research Associate

Professor of Cognitive Science

University of Bristol, United Kingdom
