Getting Serious About Improving Biosafety
In “Better Biosecurity for the Bioeconomy” (Issues, Fall 2025), David R. Gillum correctly observes that the United States currently has “a rare policy window to design [biosecurity and biosafety] oversight that’s adaptive, collaborative, and capable of earning public trust.” Attention to biosecurity and biosafety has long been an unfunded mandate attached to federal funding, with limited oversight of much research and substantial oversight of limited research, but nothing in between.
Gillum’s example of the shaggy microbe perfectly captures how our mindset around biosecurity and biosafety concerns needs to change. While most people involved in the development of emerging biotechnology see security and safety as a compliance burden, the fact that these concerns change over the lifecycle of research requires a change in both perception and organizational structure. Researchers and engineers should consider themselves in partnership with those whose training and roles focus on “How could this go wrong, and how can we prevent that?” If methods for building and sustaining this partnership are infused into the training, conduct, and identities of everyone involved in biotechnology development, then it becomes part of the fabric of innovation—another tool, as Gillum notes, to “foster critical thinking and team-based risk awareness and mitigation.”
Without saying so, Gillum draws on substantial literature within the fields of science and technology studies and responsible innovation to show how engagement with these concerns requires changes in organizational structures and processes, narratives, and, through that, which research and innovation pathways are pursued. This is what Emma Frow and I, in Absence in Science, Security and Policy, called “caring for” these concerns, rather than “taking care of” them by checking the box.
I agree with Gillum that a National Biosafety and Biosecurity Agency (NBBA) aligns with recommendation 4.4a of the National Security Commission on Emerging Biotechnology’s April 2025 report, which I helped draft. Centralizing oversight and coupling it to an adaptive process that includes methods to assess and innovate solutions to safety and security concerns is long overdue within biotechnology. In particular, I applaud his point that “NBBA would be able to lead cultural change,” making oversight “a collaborative process rooted in partnership.” This is not an easy task, but it is an essential one if we are to truly take biology from the lab to the foundation of the nation’s future economy.
We got here because of decades of inattention to the connective tissue between high-level biosecurity and biosafety policy and the on-the-ground practice of carrying out those policies. We lack that connective tissue because, at the end of the day, oversight for these concerns does not have a natural home within the government.
That must change. The liquefaction now occurring in many of the agencies with biosafety and biosecurity roles provides a window to act on these deeper structural reforms. Congress, the executive branch, companies, and universities all have a role to play in shaping that bigger reform. Let’s get to work.
Sam Weiss Evans
Former Senior Policy Advisor (2023–2025) on the National Security Commission on Emerging Biotechnology, where he led the portfolio on biosecurity, biosafety, and responsible innovation
Having collaborated with David R. Gillum for many years, I can attest to his commitment to biosafety and the research community. His article reflects the thoughtful leadership he has consistently brought to the field. I want to offer a complementary perspective drawn from two decades of approaching biosecurity through a law-enforcement lens.
Discussions of dual-use research often focus on agents, experiments, and oversight structures. But in federal law, dual use is defined very differently. US Code Title 18 Section 175 makes it a criminal offense to weaponize biology or to possess highly hazardous biological materials without a peaceful, protective, or research purpose. In other words, dual use is not about the organism or the technique. It is about intent or the absence of legitimate purpose. Congress established that distinction in the US Code to fulfill the United States’ obligations under the Biological Weapons Convention, which entered into force in 1975 and commits nations to prevent the misuse of biology regardless of how technologies evolve.
If we recognize how rapidly the life sciences are evolving, then our approach to biosecurity must evolve as well. Top-down oversight can identify hazards, but it cannot anticipate every new capability, data-driven workflow, or experimental pivot. Compliance processes alone do not build a culture capable of recognizing when something is off. In my experience, the earliest warning signs never came from select-agent lists or static rules. They came from scientists themselves: the graduate student who noticed an unexpected pattern, the postdoc who asked the hard question, the facility manager who saw a workflow that didn’t align with a stated purpose. These moments come from awareness, not regulation.
That is why education is critical. We teach students the mechanics of pipetting and the polymerase chain reaction (PCR), or how to design genetic circuits, but rarely the history that gives biosecurity its meaning. Many emerging scientists have never learned about, for example, the Tuskegee syphilis study, conducted on impoverished African American men without their informed consent, or the story of Henrietta Lacks, whose cells were taken without her consent and have become a key tool in medicine. These episodes show how biology has been misused by governments, institutions, and corporations. They are the foundation for why trust matters and why safeguards exist. Without this understanding, biosecurity can feel like an overbearing bureaucracy rather than a shared responsibility.
By giving students this grounding, we cultivate citizens and scientists who understand both the power of the tools in their hands and the ethical landscape around them. This creates something more robust than rule-following. It creates a distributed early-warning network. A neighborhood watch for science. A community of Guardians of Science who are empowered to recognize when risk is emerging and equipped to act.
The future of the bioeconomy will depend on more than better oversight. It will depend on a culture that can delineate intent, understand why our safeguards were built, and see biosecurity as a shared obligation. Gillum’s call for adaptive oversight is the right direction. Strengthening the human element at the core of the scientific community will make that oversight truly durable.
Ed You
Founder, EHY Consulting
Former Supervisory Special Agent, FBI
David Gillum makes a compelling and urgent case for improving oversight of high-risk biological research and proposes a National Biosafety and Biosecurity Agency to coordinate what is currently a fragmented federal process. His call for reform comes at a critical moment. As he notes, COVID-19 has created “a rare policy window to design oversight that’s adaptive, collaborative, and capable of earning public trust.”
Yet alongside the structural reforms Gillum proposes, we must address another challenge: the increasing politicization of biosecurity that threatens to undermine any new framework before it can take root.
Historically, biosafety and biosecurity enjoyed broad bipartisan support. In recent years, however, topics that were once confined to small expert circles, such as gain-of-function research aimed at genetically altering a pathogen to enhance its biological function, have become politicized. The Trump administration’s Executive Order 14292 directed agencies to revise or replace the Biden-era Policy on Oversight of Dual Use Research of Concern and Pathogens with Enhanced Pandemic Potential, faulting that administration for allowing “dangerous gain-of-function research within the United States with insufficient levels of oversight.” While oversight gaps existed, framing previous efforts as dangerous failures rather than as imperfect frameworks needing refinement is particularly troublesome: it strips away the nuance that biosecurity requires and works against the adaptive, flexible oversight Gillum advocates. When biosecurity becomes partisan, oversight frameworks risk becoming rigid and politically motivated rather than adaptive, shifting with each administration rather than evolving with scientific understanding. Four-year policy cycles cannot build the institutional capacity and expertise that effective biosecurity requires.
Politicization also threatens the public trust that Gillum emphasizes is essential. The COVID-19 response illustrated this issue, where public health guidance became divided along partisan lines. Biosecurity policy risks following the same path, with technical terms such as “dual use research of concern” and “enhanced pandemic potential” increasingly used as political signals rather than carefully defined technical categories.
The stakes of politicization are particularly high given how rapidly the field is changing. Recent work on artificial intelligence-assisted genome design illustrates how biological risks are shifting beyond historic paradigms of physical containment and select agent lists. The oversight frameworks Gillum describes must not only address traditional dual-use concerns centered on lab-based research but also anticipate and react to evolving technological capabilities.
Biological risks don’t respect election cycles or party platforms. If we’re serious about seizing this policy window, we need to actively protect biosecurity from partisan capture. Gillum is right that we need better biosecurity infrastructure for the bioeconomy. But that infrastructure must be built to last beyond any single administration. Only by consciously depoliticizing biosecurity—treating it as the complex, evolving technical challenge it is—can we create the adaptive, collaborative oversight system and public trust necessary to sustain it.
Anemone Franz
Visiting Research Fellow
American Enterprise Institute
The Executive Order Improving the Safety and Security of Biological Research (EO 14292), signed by President Trump on May 5, 2025, responds to a diagnosis of gain-of-function research as “dangerous,” without justifying this label. It further suggests that such research, aimed at genetically altering an organism to enhance its biological function, is taking place without adequate oversight mechanisms.
These positions risk ignoring the work that Institutional Biosafety Committees undertake. As David Gillum rightly points out, risk is not static or predictable, hence “biosafety must be flexible, with engaged oversight over the entire research life cycle.” In the same vein, Sam Weiss Evans and colleagues have proposed experimental governance: revisiting assumptions, learning lessons from previous actions, and developing alternatives in biosecurity. This requires broadening our imagination of what constitutes a threat or risk, where such threats may originate, and which mechanisms are best suited to address them.
While EO 14292 calls for replacing the National Science and Technology Council’s Framework for Nucleic Acid Synthesis Screening, issued in 2024, current screening tools remain limited in their capacity to identify potential threats in commercial nucleic acid synthesis. Moreover, threats may emerge from actors with nefarious intentions engaging in activities not currently envisioned and extending beyond nucleic acid synthesis services. As such, the capacity to anticipate and act—a hallmark of anticipatory governance of emerging technologies—is fundamental to biosecurity and safety governance. Such governance must also encompass fostering societal dialogue about the balance of risks and benefits the public is willing to bear regarding gain-of-function research, rather than imposing restrictions solely upstream. Placing overly cumbersome barriers on such research may leave the United States less prepared for a future pandemic—an event that experts agree is not a matter of if, but when.
Gillum also highlights the “human practice” component of gain-of-function and biosafety research in the United States—an aspect often overlooked in policymaking. In my research on the cultural dimensions of work in biocontainment facilities, I have found biosafety specialists and researchers to be highly rigorous in their practices, including the appropriate use of personal protective equipment, anticipating potential pathways of pathogen transmission, and ensuring physical isolation.
Biosafety professionals and researchers are strongly motivated to conduct work that is relevant for pandemic preparedness and public health, and they take their peers’ safety extremely seriously. A biosafety specialist interviewed during my research noted that their motivation to work in biosafety stems not only from a commitment to protecting potential patients but also from a responsibility to ensure the safety of their colleagues. Their work underscores that risks to public health are to be managed, not ignored. Both biosafety and pandemic preparedness involve social and technical dimensions that must be carefully orchestrated. Applying blunt restrictions on gain-of-function research—based solely on its designation as “dangerous”—limits opportunities to rethink pandemic preparedness, especially in light of the COVID-19 pandemic and the lessons that could be learned and implemented.
Alberto Aparicio
Assistant Professor, Department of Bioethics and Health Humanities
University of Texas Medical Branch
David Gillum stresses the need for a new biosafety and biosecurity oversight system that adapts to new developments in science and technology; enables collaboration among researchers, policymakers, and institutional administrators to assess and reduce risk to maximize benefit; and engenders trust among the public that scientists are engaging in research responsibly. Toward this end, he advocates for the creation of a National Biosafety and Biosecurity Agency to “create shared definitions and clear review criteria” for dual use research and pathogens of enhanced pandemic potential (now, as before, called “gain-of-function” research).
In 2004, following publication of the National Academies’ report Biotechnology Research in an Age of Terrorism, the US government established the National Science Advisory Board for Biosecurity (NSABB) under the auspices of the National Institutes of Health to inform policy on dual use research and engage international counterparts on oversight of such research. Though its charter has since changed, the NSABB was set up to be the type of entity Gillum calls for. Among its initial tasks, the NSABB developed criteria for dual use research of concern, guidelines for risk and benefit analysis of potential dual use research and results, guidelines for oversight of such research, considerations for types of research requiring institutional review, and considerations for national policies governing review of such research.
Currently, the NSABB focuses on providing advice, guidance, and recommendations on oversight of biomedical research with dual use potential and associated biosecurity and biosafety issues and activities. In parallel, advances in genome editing, synthetic biology (or engineering biology), and the use of artificial intelligence in biology have raised significant concern about the biosafety and biosecurity risks of these technologies. Additional work under the auspices of the National Academies has focused on these issues, including activities examining scientific advances in genome editing tools, their applications, and their broader implications, with particular attention to biosecurity and biosafety concerns and to beneficial applications in medicine and agriculture.
The Academies also issued a report that introduced a framework for assessing biosecurity risks of synthetic biology advances. This study examined, for example, how a malicious actor might exploit the technologies, and highlighted examples of synthetic biology research that may be of concern. Another report analyzed unique biosecurity risks and biodefense benefits of AI-enabled biological models, focusing on the potential for these models to enhance the abilities of malicious actors to design harmful pathogens or toxins, and the capabilities of biodefense experts to prevent, detect, and assess biological threats. Very recent work focuses on a subset of highly advanced engineering biology research, the creation of synthetic cells, which challenges current frameworks for assessing the benefits and dual use risks of biological research. A new report on synthetic cell research and development and related environmental, biosafety, and biosecurity considerations, along with proceedings of a workshop on mirror cells, will be released soon.
Looking to the future, the National Security Commission on Emerging Biotechnology, chartered by Congress, has highlighted the promise of biotechnology for meeting several national challenges, including in health, agriculture, and national security and defense. It offered numerous recommendations for building the biotechnology innovation ecosystem from basic research through advanced development and application that integrates safely, securely, and responsibly throughout.
Gillum highlights issues that are relevant to these and other discussions about biosafety and biosecurity in a rapidly advancing biotechnology and life sciences research and development ecosystem.
Kavita M. Berger
Senior Program Director
Life Sciences and Biotechnology Program Area
Center for Health, People, and Places
National Academies of Sciences, Engineering, and Medicine