Forum – Spring 2020
Science institutions in a new world
In “Science Institutions for a Complex, Fast-Paced World” (Issues, Winter 2020), Marcia McNutt and Michael Crow provide an important wake-up call for those who are deeply concerned about the current state of the world, with its urgent need for more tolerance, evidence-based thinking, and science-based policy-making. The authors rightly argue that major institutions must move much more quickly and aggressively to address the enormous challenges facing our crowded, fast-changing planet. The US National Academies and research universities they discuss are well known to me, inasmuch as I have spent more than 50 years in research universities and 12 years as the president of the US National Academy of Sciences. Both are by nature conservative organizations: the Academies because they have critical reputations to protect as the provider of authoritative scientific, engineering, and health advice to the nation, and universities because their academic traditions so heavily value individual intellectual achievement over collaborative efforts with a more practical bent.
I very much agree with the authors when they state that “excellence and social embeddedness will have to be fused in the nation’s universities to achieve public value,” and I write to highlight a particular issue that urgently requires much more attention from US universities: the need to drive cycles of continuous improvement in public education through scientifically based education research. In the late 1990s, the National Academies created a project to ask: Why has research supported innovation and continuous improvement in medicine, agriculture, and transportation, but not in education—and what can we do about it?
Briefly, the answer from two successive studies was that education is missing the equivalent of teaching hospitals in medicine—that is, places where researchers, teachers, and designers collaborate in practice settings to observe, explain, document, replicate, and evaluate practice as a source of new knowledge. This led to the proposal that a small number of US school districts be established as special “field sites” that would readily allow problems to be defined and solutions tested in real-world classrooms. In 2004, a small, independent nongovernmental organization, called the Strategic Education Research Partnership (SERP), was spun out of the Academies to test this idea. The Boston Public Schools system served as the first SERP field site, with adolescent literacy chosen by its superintendent, Tom Payzant, as the focus.
Since that time, SERP’s work has extended into additional areas, including mathematics and science (see https://www.serpinstitute.org/). But progress has been slowed by the need to change two cultures: that of school districts, which are often driven by short-term needs that make long-term research efforts difficult to sustain, and that of research universities, whose efforts to improve K-12 education, if they exist, are driven by grants to individual faculty members that make the whole no more than the sum of its parts.
I would like to suggest a new effort by the National Academies—in collaboration with university leaders such as Michael Crow—to ask a new question: How can the nation much more energetically and effectively use its research universities to improve precollege education, perhaps the most critical issue for the world’s future?
Chancellor’s Leadership Chair in Biochemistry and Biophysics for Science and Education
University of California, San Francisco
President, National Academy of Sciences 1993–2005
Marcia McNutt and Michael Crow pay homage to the prescience of Vannevar Bush’s Science, the Endless Frontier and its transformational impact on the US scientific enterprise. They then discuss how the science and engineering community should evolve going forward, with a particular focus on institutions such as the National Academies and the nation’s research universities. They argue for a more public-facing, socially engaged enterprise.
Yet few people beyond the scientific community appreciate how Bush’s report served as a blueprint for government investment in scientific research and transformed the nation’s approach to solving complex problems. Even fewer make the connection between the US government’s investment in science and the fruits of these investments in diverse areas, including health care and the everyday technology of the twenty-first century. Most people assume that a better educated citizenry will naturally make these connections from the evidence and be more supportive of science—but the expected support for science is often lacking. Why?
We must have the courage to look in the mirror—as a scientific enterprise and a higher education community—and ask how we can better engage the public. We must move from science viewed solely as an elite activity in the research university to science that, as McNutt and Crow say, is “deeply entwined with the practical experience of daily life,” such as preventing and treating diseases like COVID-19, reversing climate change, and creating jobs.
As a first step, we must ask ourselves: How do Americans feel about science? Do they think of themselves as part of the scientific enterprise? Do they think of science as critical to their day-to-day lives, essential to health, jobs, and social mobility? Can we imagine a society that is as excited about science as it is about basketball?
Many Americans have not had a good educational experience with science. As we talk to students and others, it is common to find a majority who say that by 11th grade they had come to either love or hate science and mathematics, or who entered college interested in the STEM fields (science, technology, engineering, and math) but transferred to other majors, leaving with negative feelings about science.
Despite our success in educating more Americans since the Endless Frontier and educating more in STEM specifically, the higher education community has not made enough progress toward including people of all backgrounds in the work we do. Consequently, too few people see themselves reflected in the important work done by scientists. Even increased enrollments for women and people of color haven’t translated into similar diversity of the scientific workforce. For example, fewer than 2% of scientists at the National Institutes of Health are black and about 4% are Latinx, and the percentage of computer science degrees awarded to women has fallen from 36% to 18% since 1984. How can we expect members of the black or Latinx communities or women to support science when they are significantly underrepresented in the enterprise?
Looking in the mirror, we must communicate more directly and effectively with the public about science. By pulling laypeople into the conversation, we can cultivate a public that believes in the importance of science for innovation and problem-solving and supports public investment in the scientific enterprise. As champions for science, we should ensure that the student experience emphasizes the reciprocal value of the arts and sciences and strives to eliminate the unhelpful division between STEM and arts/humanities. We must also cultivate a scientific enterprise that welcomes students of all backgrounds and enables them to identify with science. Our efforts to increase diversity and inclusion must be data-driven and intentional, and we must build on successful models now being replicated across universities.
With COVID-19 on the minds of everyone, we have the opportunity to help the public better understand the strong connections between science, public health, the economy, and the future of humankind.
Freeman A. Hrabowski III
President, University of Maryland, Baltimore County (UMBC)
Peter H. Henderson
Senior Adviser, Office of the President, UMBC
J. Kathleen Tracy
Professor, University of Maryland School of Medicine
Do not forget about Harley Kilgore.
Any discussion about American science policy in the post-World War II era should include Kilgore. A US senator from West Virginia from 1941 to 1956, Kilgore chaired the Senate’s Subcommittee on War Mobilization. He also served on the Special Committee to Investigate the National Defense Program. In these positions, Kilgore helped determine the most effective ways to organize military research.
As a result of his legislative work on military research, Kilgore developed opinions about the role of the federal government in science and technology research more broadly. In many ways, Kilgore’s views about the best way to promote scientific research pitted him against Vannevar Bush. The two had a robust debate that spanned multiple years, numerous pieces of legislation, the release of Bush’s Endless Frontier report in 1945, and the establishment of the National Science Foundation in 1950.
Although Bush is widely regarded as the architect of postwar American science policy, many ideas that Kilgore championed have become part of that policy. For example, Kilgore wanted the president, not a board of scientists, to appoint the NSF director to ensure public accountability while avoiding undue politicization. Pending Senate confirmation, Sethuraman Panchanathan will become the 15th presidential appointee to fill this role.
Kilgore also advocated for federal funding for social sciences research. In 1957, NSF created the Social Science Research Program. Today, NSF’s Social, Behavioral, and Economic Sciences Directorate invests more than $200 million across the social sciences, which represents more than 60% of the total annual federal funding for basic social science research conducted at universities and colleges. This investment is especially important given recurring political and ideological attacks on the directorate.
I suspect Kilgore, who called for federal research funding to be distributed widely to avoid concentration of research and education in a small number of states, would also be pleased that NSF created the Experimental Program to Stimulate Competitive Research (EPSCoR) in 1979. EPSCoR is a competitive grant program designed to build research capacity in states that historically have underperformed in NSF funding. From five charter states (West Virginia among them), EPSCoR has grown today to 28 states and jurisdictions, from Alabama to Wyoming. As further evidence of the program’s value, EPSCoR has expanded from NSF to five other federal agencies: the Departments of Agriculture, Defense, and Energy, the National Aeronautics and Space Administration, and the National Institutes of Health. These agencies received more than $650 million in federal funding in fiscal year 2019 for EPSCoR and EPSCoR-type programs, enabling talented researchers and students to contribute to America’s research enterprise.
McNutt and Crow do an admirable job of describing Vannevar Bush’s significant influence on the formulation of American science policy during the past 75 years. To their essay, I offer a friendly amendment: remember Harley Kilgore. As a voice for public accountability, the social sciences, and what we today call the “flyover states,” Kilgore was equally significant—and successful.
E. Gordon Gee
President, West Virginia University
Marcia McNutt and Michael Crow properly describe Vannevar Bush’s Science, the Endless Frontier as a watershed document that changed the way the United States funded and conducted science after World War II. I believe it is the greatest science policy treatise ever produced, with direct consequences—intended and unintended—that led to structural reforms that transformed American science from “Little Science” to “Big Science.”
McNutt and Crow are also prescient in realizing that even the best of science policies and their applications must change with the times. I would put the matter more strongly. I believe there is ample evidence to demonstrate: 1) that the structure set up by Bush and Congress after the war had much to do with catapulting American university-based science to international preeminence; and 2) that although the model still works in many respects, it needs to be rethought in a number of significant ways, as part of a much larger change in American research universities.
Consider, in no particular order, just a few of the many changes that might figure in a new version of Science, the Endless Frontier:
- Rethink the role of scientists and Congress in the allocation of scientific funding. Scientists should be engaged more directly in the setting of scientific and health priorities, with congressional oversight and review as in national intelligence issues, but without veto power.
- Federal and state governments ought to create at multiple universities a highly prestigious and sought-after fellowship program spanning all the sciences, the equivalent of the Howard Hughes Medical Institute Investigator Program, which has been enormously successful in the biomedical fields. One particular target should be talented young people who are willing to think outside the box and to challenge the orthodoxies in the social and behavioral sciences.
- We need policies that support the growing multidisciplinary structures of universities. This paradigm shift will involve a movement away from siloed research toward combining deep disciplinary knowledge with work across disciplinary and school lines, a combination now virtually required for the growth of understanding.
- We need novel programs that support undergraduate teaching of science and technological literacy.
- We need a mechanism that offers training to members of Congress and their staffs on the state of knowledge about major scientific challenges and global problems.
- We need to recognize the crisis for young scientists who are unable to receive their first grants to set up their own labs until they are past the age of 40.
- We need the Presidential Science Advisor and the President’s Council of Advisors on Science and Technology to occupy a strengthened and permanent place in the Executive Branch.
- We need (as is now obvious) to create a new and well-funded national vaccine institute that focuses on potential pandemics. We cannot expect private industry to do this, given that 90% of potential pandemics don’t materialize.
- And finally, our research must even look at warfare and conflict differently from the past, now that cyberwarfare poses a formidable threat and social media is capable of undermining freedom of expression, eroding privacy, and influencing personal behavioral choices.
McNutt and Crow place the right questions in front of us. Now we must take their ideas as a point of departure and create a new transformative policy document for the twenty-first century.
Jonathan R. Cole
John Mitchell Mason Professor of the University
Department of Sociology
As both a government agency scientist and a faculty member at a Florida two-year college, I hope Marcia McNutt and Michael Crow’s vision will include the two sectors I work in, particularly regarding the transfer of scientific knowledge to many more people. This is not something large research universities can do on their own, even the land-grant institutions with their extension services. Smaller state colleges, whose mission is to serve a much more local population, are more “socially embedded” in their communities (to use the authors’ phrase) and can help make those connections at the local level.
Similarly, resource management agencies such as mine can also help with the identification of important local and regional issues and communication of scientific knowledge, as we have ties with local business communities, schools, nongovernmental organizations, and the elite research universities. By way of example, my group has done considerable work with the University of Florida.
Robert A. Mattson
Environmental Scientist V
Florida Bureau of Water Resources
St. Johns River Water Management District
Precision medicine and individual health
In “Will Personalized Medicine Lead to a Healthier Population?” (Issues, Winter 2020), Richard Cooper and Nigel Paneth write that “mass diseases are the products of the societies in which we live” and that “we do not believe genomics and precision medicine will transform biomedicine and population health.” These statements summarize the authors’ views on the role (or lack thereof) of precision medicine in improving population health. Cooper and Paneth provide an overview of the societal changes and scientific and technological advances that have helped achieve a remarkable increase in life expectancy over the past 60 years. They review the history and premise of genomics, and argue that, with a few exceptions, precision medicine will most likely not improve population health.
The authors’ ideas and language resonate with insights offered decades ago by the eminent epidemiologist Geoffrey Rose. Sick societies, he said, require society-level solutions for their epidemics. Furthermore, addressing powerful upstream determinants of health is crucial to reducing the burden of specific diseases in entire populations and for truly improving population health. For example, a population-wide salt substitution trial in Peru reported a 50% drop in hypertension incidence through the replacement of regular salt with potassium-enriched salt. In an even more impactful example, a recent trial conducted in rural South Asia reported a 30% reduction in mortality from all causes through a community-based intervention providing enhanced access to public health care. These recent population-wide randomized trials support Cooper and Paneth’s premise that population health improvements come from population-level interventions.
But it is also essential to acknowledge that population-wide interventions can be challenging. Gaining an understanding of mass influences requires either comparing populations exposed to different mass influences or analyzing changes in those influences within populations. For example, finding a gene responsible for health disparities may seem like much lower-hanging fruit than finding a society with no racism to act as a control group, regardless of how fruitless the search for the “disparities gene” has been. For this reason, and despite the evidence for population-level interventions, Geoffrey Rose did not state that individual-level (high-risk) interventions were useless, but rather that these interventions are intended to improve individual health, not population health.
Assistant Professor, Epidemiology and Biostatistics, Urban Health Collaborative
Dornsife School of Public Health
Richard Cooper and Nigel Paneth provide a detailed and coherent account of the research and intervention strategies that have worked in the past to dramatically improve population health in developed countries. They then contrast this success with the relatively less fruitful record for the molecular techniques that have dominated for the past several decades. But they only obliquely discuss the explanation for this sudden enfeebling of biomedical progress.
Scientists and physicians ought to be the most evidence-based of all segments of society, able to quickly discern what works from what does not. How is it conceivable, then, for such a smart and creative segment of society to collectively make a wrong turn and not even notice? What could possibly explain such massive dysfunction in distinguishing approaches that are efficient and effective from those that are perpetual disappointments and voracious money pits?
Cooper and Paneth allude vaguely to some systemic forces that might begin to explain this mass delusion. Steady progress in population disease prevention, they note, has been replaced by “a reductionist, technology- and theory- (and career- and profit-) driven approach to health and medicine that [is] … wildly expensive.” Is the explanation for the failure of the collective scientific orientation therefore an economic order that increasingly commodifies the biomedical research process and the provision of care?
Indeed, universities increasingly seek patentable technologies, while medicine is ever more corporate and consolidated in the hands of large hospital and insurance conglomerates. Perhaps it is simply more lucrative, for universities organizing research efforts, for scientists conducting them, and for clinicians delivering care to patients, to delve into the high-technology world of genomics than into the less remunerative tasks of behavior modification and old-fashioned screening. Not to mention that molecular medicine requires endless varieties of expensive equipment for use in both the laboratory and the clinic, for which manufacturers and suppliers form an aggressive lobby.
Communism was ridiculed for producing the scientific catastrophe of Lysenkoism, but perhaps capitalism has fallen into its own perverse ideological dysfunction, so intent on generating huge profits that it can no longer be bothered to prioritize promising results over useless ones for actual human health and well-being. When the utility of selling something becomes more important than the utility of preventing a disease or extending a life, then the system has indeed failed us.
Jay S. Kaufman
Professor, Department of Epidemiology, Biostatistics, and Occupational Health
The US role in global nuclear energy market
In “The US Shouldn’t Abandon the Nuclear Energy Market” (Issues, Winter 2020), Travis Carless offers a deft defense of the national security merits of government investment in the nuclear industry. Surprisingly, climate change receives only a brief mention at the end. Carless does a good job illustrating the shift in the nuclear energy market away from US primacy, and making the case for investment in US capacity. But there are other ways the United States could pursue its nonproliferation goals, such as investments in diplomacy, inspection technology, and international institutions, without an industrial policy favoring nuclear energy over other energy sources.
The author’s picture of the state of the global industry is persuasive. The supply of nuclear energy technology is shifting east, to suppliers in China, Japan, Russia, and South Korea, and the United States is no longer a dominant player in the commercial market. Carless warns that nuclear newcomer countries in search of technology may be especially vulnerable to debt-trap diplomacy by Russia and China. The introduction of advanced reactor designs from these new suppliers creates new safety and security risks in need of revised mitigation strategies.
Paradoxically, while pointing out safety vulnerabilities abroad, Carless proposes loosening rigorous regulatory requirements at home to ensure US competitiveness on the global nuclear energy market. However, the stringent safety requirements of the US Nuclear Regulatory Commission are arguably a selling point for the nation’s nuclear industry. Reputation is something difficult to earn, but easy to lose. Perhaps talented researchers such as Carless can help make progress in identifying which regulations should be reformed and how—or how regulation of new technologies can become more dynamic.
The risks that other nuclear suppliers pose to US nonproliferation goals may be overstated. The heralded “nuclear renaissance” has not come to pass, and emerging energy markets may balk at the cost of even a small nuclear reactor and at creating (or re-creating) lasting dependencies. The threat of Russia and China recolonizing the world with nuclear deals pales in comparison with the scale of their existing investments, such as China’s Belt and Road infrastructure initiative and the many “deals” that Russia’s state nuclear energy company, Rosatom, brags about (though many of those are still at a very early stage).
There is much the United States can do to further nonproliferation goals, including reinvesting in diplomacy, strengthening the International Atomic Energy Agency and the Non-Proliferation Treaty, and working with allies including South Korea to supply peaceful technology. There are other ways to pursue nonproliferation and nuclear deterrence goals besides nuclear exports.
From the perspective of the nuclear industry, the future is uncertain, but market signals—combined with investments in new research and engineering test facilities—may provide some guideposts. Will nuclear fusion prove feasible? Is there a commercial market for small, modular reactors? Will new reactor designs prove cost-effective at scale? There are no clear answers to these questions, but market signals can help guide investment over the medium to long term. There is a lively debate in venture capital circles over whether government investment should help firms survive the “valley of death” between initial idea and commercial production. Perhaps the nuclear industry needs more investment in the research and experimental phase before it can prove market viability. Fortunately, scholars such as Carless can help chart a path forward and identify new areas for the research and development funding that he recommends.
Associate Professor, School of Public and International Affairs
Associate Professor, Science and Technology Studies
As Travis Carless explains, US nuclear suppliers have become much less significant players in the global commercial nuclear marketplace. This decline in prominence, however, doesn’t necessarily mean that the United States will cease to contribute to setting the nonproliferation, security, and safety standards of civilian nuclear operations worldwide. The influence of the United States in these areas will depend far more on its prioritization of fair and reasonable standards and its use of diplomatic, political, and scientific tools to induce cooperation among nations on nuclear policy-making than on its domestic nuclear energy capacity and global market share.
Carless describes a situation where “emerging markets’ reliance on Russia and China for low-barrier, quick pathways to nuclear power can create several nuclear proliferation, safety, and strategic risks.” Although there are consequential differences in the conditions put in place by Russian and Chinese nuclear suppliers (compared with the United States), those nations still have considerable incentives to ensure that commercial nuclear deployments don’t lead to accidents or weapons proliferation, and they are party to most of the same nuclear nonproliferation and nuclear safety agreements as the United States.
If Carless’s argument is extended to its logical conclusion, a rejuvenated US domestic nuclear industry would be expected to lead to increased safety, security, and nonproliferation standards globally. But this would be far from a certain outcome. The United States has a long history of conditioning civilian nuclear cooperation on the adoption of and adherence to nonproliferation and safety standards. In many of these instances, though, US leverage was as much about broader security and political conditions as it was about the supply of commercial technologies. A far more consistent determinant of US leverage on nuclear nonproliferation, safety, and security policy-making has been the emphasis placed on these matters by US policy-makers, and the willingness of these policy-makers to insist on fair and reasonable standards, in light of the competing economic, security, scientific, and energy interests of partner countries. Where the United States has offered a range of inducements and been consistent in its requirements of partner countries, influence on nonproliferation and nuclear security has more assuredly followed.
Government support for US domestic nuclear energy development could be a worthwhile investment, but arguing that it is essential to the nation’s leadership on nuclear nonproliferation, security, and safety is misleading. These objectives can and should be pursued regardless of the state of the US domestic nuclear industry.
Associate Director, Center for International and Security Studies at Maryland
University of Maryland School of Public Policy
Regulating gene drives
In “Gene Drives: New and Improved” (Issues, Winter 2020), Robert M. Friedman, John M. Marshall, and Omar S. Akbari provide a comprehensive and accessible update on the field of gene drive research. By reviewing the diversity of strategies for gene drives, their primary envisioned applications, and the state of cutting-edge research, the authors offer an efficient way to understand key developments since the publication in 2016 of Gene Drives on the Horizon, by the National Academies of Sciences, Engineering, and Medicine. Although gene drives are arguably still on the horizon—none has been deployed beyond the laboratory even as part of a controlled field trial—advances described in the authors’ update make the time horizon for decision-making even more urgent.
The article makes the important point that a policy perspective should consider gene drives not as a broad regulatory yes-or-no exercise, but as a “design challenge.” The authors suggest that key design questions involve performance characteristics, outcomes to avoid, and desired social outcomes. This makes good sense, but how do we get there?
Several studies about public attitudes toward gene drives offer some insight. But should the gene drive research community simply accept these public attitude measures and prioritize the designs that received the most support? Unfortunately, I don’t think it’s that simple.
Gene Drives on the Horizon dedicated full chapters to the questions of human values and public, stakeholder, and community engagement around gene drives. I was a member of the study committee. We defined engagement as “seeking and facilitating the sharing and exchange of knowledge, perspectives, and preferences between or among groups who often have differences in expertise, power, and values,” and we went so far as to claim that “the outcomes of engagement may be as crucial as the scientific outcomes to decisions about whether to release a gene-drive modified organism into the environment.” This suggests that one-way measures of public attitudes are insufficient to inform the complex “design challenge” of gene drives.
Just as scientists are investigating and developing the kinds of novel technical options for gene drives described by Friedman, Marshall, and Akbari, social scientists are exploring methods to organize meaningful engagement to inform the design and governance of gene drives. As one example, I convened a diverse group of stakeholders in a workshop with the technical research team of the Genetic Biocontrol of Invasive Rodents project in March 2019. Our discussions about the various design decisions inherent in laboratory studies, contained trials, and potential future field trials yielded new insights that inform the ongoing challenge of designing and deploying a gene drive mouse for conservation purposes. (See our report at https://research.ncsu.edu/ges/2019/06/workshop-report-gene-drive-mice/).
Our workshop called attention to the benefits of integrating the design challenges—technical, regulatory, and engagement—that surround gene drive research. Funders should recognize this complementarity and prioritize the support of interdisciplinary teams that take all such design challenges seriously. The complexity, power, and potential of gene drives demand that we integrate engagement, governance, and technical innovation.
Jason A. Delborne
Associate Professor of Science, Policy, and Society
Genetic Engineering and Society Center
North Carolina State University
Melvin Kranzberg, one of the founders of the Society for the History of Technology, is well known for developing the first law of technology: “Technology is neither good nor bad; nor is it neutral.” This formulation applies perfectly to the current discussions surrounding the development of engineered gene drives and their potential application to conservation.
Robert M. Friedman, John M. Marshall, and Omar S. Akbari provide an excellent summary of the science of engineered gene drives currently under way. They stress that the technology is not one thing, but an ever-growing set of approaches with different objectives and constraints. They also stress the need for public consultation and the need for developers to demonstrate both safety and effectiveness.
Over the past seven years I have been involved in a global effort to write an evidence-based assessment of the intersection between conservation practice and synthetic biology—including gene drives. This effort is sponsored by the International Union for Conservation of Nature (IUCN), a global organization with more than 1,300 members from over 170 countries. My experience indicates that such an assessment alone will be far from enough to create a public space that supports informed, transparent work on gene drives. Though many people know nothing about gene drives—natural or engineered—they will base their opinions on many factors, only one of which will be the quality of the science itself. And the strong negative public reaction identified by the IUCN effort supports the concept of “confirmation bias”; as the Yale law professor Dan Kahan wrote in 2017 regarding a set of experiments, “The subjects are aggressively misinforming themselves by selectively crediting or discrediting evidence on what scientists believe in patterns that cohere with the positions associated with their group identities.” This is the reality of science today.
Science is important as a means of learning more about engineered gene drives and whether they have a future in society’s scientific toolbox. In the face of calls for a global moratorium on all work on gene drives, the point that Kranzberg makes about technology never being neutral is critically important. Unfortunately, there is precious little opportunity for people to find documents such as Friedman, Marshall, and Akbari’s and the IUCN assessment, or others that take a different position, and to educate themselves and form an opinion. All gene drive science is political: there is no neutral ground. We must learn to operate attentive to the constraints and opportunities of how people pose, and get answers to, their legitimate questions. The natural world is waiting for humanity to determine the right answer about the potential use of engineered drives.
Kent H. Redford
Friedman, Marshall, and Akbari argue that from a policy perspective, the evaluation of gene drives for use in pest management should be treated not as a yes-no vote on a given proposal, but as a “design challenge” in which the goal is to ensure that proposals align with desired societal outcomes. There are a variety of gene drive approaches in development, they point out, and different approaches may align with goals better in different cases.
This is a move in the right direction, but it could go farther, in two important ways. First, the authors seem to see the design challenge as a matter of maximizing benefits and minimizing potential harms, but that too is unduly narrow. Proposals to suppress, eliminate, or replace wild populations, perhaps entire species, raise questions that fall outside the quick math of cost-benefit calculations. For some people, at least, there are large questions here about the meaning of genomes, the integrity of species, and ideals for the human relationship to nature. Gene drive technologies are similar in this respect to other genetic technologies used in or proposed for agriculture, human health and enhancement, and environmental conservation. We tend to agree with the authors that some gene drive approaches might have “societally desired outcomes,” but we cannot talk meaningfully about those outcomes if we’re not taking on board a large range of questions about what outcomes we’re aiming for.
Second, to understand what the societally desired outcomes are, there needs to be a meaningful societal conversation about the outcomes. Friedman, Marshall, and Akbari call, somewhat obscurely, for “robust design dialogues among product developers, regulators, and other societal players.” The idea seems to be that experts and power brokers of various sorts will identify the societally desired outcomes. Ideally, we’d figure out what nonexperts and less powerful people think too. This probably means a formal public deliberation process of some sort—a process that creates conditions in which to learn what people think when they’re really thinking, as the theorist of public deliberation James Fishkin puts it. At the very least, it means ensuring that regulators are hearing from the public. Existing mechanisms for public consultation are probably inadequate for this task. We should consider the possibility of augmenting those processes with opportunities for formal public deliberation.
Gregory E. Kaebnick
Michael K. Gusmano
Ben Curran Wills
The Hastings Center
In their timely and accessible description of the current state of gene drive research, Friedman, Marshall, and Akbari importantly dispel the common misunderstanding that all gene drives share the property of potentially spreading throughout a species’ range from the release of a small number of individuals carrying the gene drive trait. Indeed, a greater variety of gene drives have been proposed with properties conducive to localized or temporary spread than those geared toward unrestricted spread. For a number of reasons, including ease of construction, products that could have unrestricted spread are closer to release than those expected to have only local spread. Although attention is rightfully focused on unrestricted drives, we should not lose sight of other gene drives that may be much more appropriate in specific cases.
As the authors indicated, most of the funding for gene drive research has come from philanthropies and governments, and most of the researchers in this field are university faculty, students, and postdocs. This is very different from the model for research and funding for crop and microbial products of genetic engineering. This is not surprising given that there is no typical business model for a product that self-perpetuates. Over a decade prior to the use of the CRISPR gene-editing technology, academic gene drive researchers were already meeting to discuss how to develop and release products in a socially responsible manner, and how to avoid the shadow cast on biotechnology by Big Ag. The presence at meetings of representatives from the one company, Oxitec, that was doing peripherally related work ratcheted up anxiety of researchers because of the public’s negative response to that company’s tactics.
With all the claims that followed the development of CRISPR-based gene drives in 2015, the field attracted much more attention from social scientists and ethicists, and their perspectives became clearly articulated in the important 2016 report Gene Drives on the Horizon, referred to by Friedman, Marshall, and Akbari. The Foundation for the National Institutes of Health, the Bill & Melinda Gates Foundation, and other funders then began a concerted effort to develop socially responsible policies for making yes-no decisions along the path toward testing specific gene drive products with unrestricted spread.
Target Malaria, primarily funded by the Gates Foundation and Open Philanthropy, developed a semiautonomous ethics committee (of which I am a member) and has ongoing engagement with local communities and with African government entities where gene drives may be tested and ultimately deployed. Even as some people criticize such efforts, we will need to ask what kind of engagement will be needed by future less-well-funded projects to attain ethically appropriate informed consent of communities, countries, and continents. There is certainly no magical percentage of people consenting that enables an obvious decision. There are tough deliberations ahead.
Given this situation, it is worth considering whether we would be on this same trajectory if locally limited gene drives were the first to be moved forward. In any event, it certainly behooves us to continue developing locally limited gene drives and to make sure that policy-makers and the public understand that they are an option.
William Neil Reynolds Professor of Agriculture
North Carolina State University
Land-grant system for the digital age
Mark Hagerott, the author of “Time for a Digital-Cyber Land Grant System” (Issues, Winter 2020), has been pioneering this disruptive and timely idea for some time. It is exciting to see that he is gaining traction with his peers.
Perhaps I am biased as a native of Vermont, which was home to Senator Justin Morrill, the founder of the land-grant college concept more than 150 years ago. But be that as it may, the Digital-Cyber Land Grant concept is a “big idea” whose time has come.
Morrill’s concept solved an emerging problem. It took abundant land and committed it to practical higher education: higher education that would strengthen the countryside, its people, and its core economic activity, farming, all the while supporting and generating intellectual and economic activity. Today, the United States has a different problem. Hagerott identified a supreme irony in the digital revolution: namely, that when it comes to higher education and economic development, the abundant information and technology available everywhere come with an intrinsic but powerful bias that favors cities and more highly populated areas.
Expressed more darkly, rural America is being left out in the cold when it comes to the “new economy.” The human, intellectual, and skill resources needed to support learning and work in the digital age are clustering in urban and suburban areas. And with the demand for these resources vastly exceeding supply, rural America and its universities are losing this tug-of-war and control over their economic future.
So although the circumstances and the drivers are dramatically different from those to which Morrill’s legislation responded, the need for a Digital-Cyber Land Grant initiative is as critical today as Morrill’s land grants were in 1862.
Especially exciting to me is the core opportunity that the Digital-Cyber Land Grant concept represents: dramatic change in the ways that higher education is delivered across the country and throughout life to all who want it. Morrill’s legislation reframed the nation’s vision of higher education away from the elites and toward the general population spread across the countryside. Today we stand at a similar crossroads. But questions remain:
- Are there innovative universities that are willing to band together and create this network of opportunity?
- Are there visionary legislators at the state and federal level willing to alter tax laws and spending priorities to encourage this kind of development?
- Are there innovative businesses and nonprofits willing to share in the work, the risk, and the reward of developing the digital-cyber concept without trying to dominate it?
- Will one size fit all? Or will multiple models connected by common characteristics be the order of the day?
If we seize this opportunity, the twenty-first-century land-grant movement will further level the opportunity playing field, making education and economic opportunity available to most people, throughout life and on their own terms, regardless of where they live.
Senior Vice President and Chief Academic Officer
University of Maryland Global Campus
Whither China’s science policy
Anything written by Richard P. Suttmeier, the author of “Chinese Science Policy at a Crossroads” (Issues, Winter 2020), is worth serious reading; if there is a dean in the field of China science and technology (S&T) studies, he is it.
His article appears at an important inflection point in our understanding of the Chinese research and development system and the nature of the relationship between the United States and China with respect to bilateral S&T cooperation. The notion that after multiple decades of expanding cooperation, the United States may be embarked on a path of disengagement is quite disturbing. Although the bilateral S&T relationship clearly has not been problem-free or without an array of critical limitations—political and otherwise—the reality is that even during the most difficult periods in Sino-US relations since 1979, transpacific interactions in science and technology have provided a solid foundation for helping to sustain overall bilateral engagement.
As Suttmeier points out, China is no longer a technological laggard. Now that the United States and China have moved from a relationship characterized largely by asymmetry in capabilities and knowledge to a situation of greater overall symmetry, the United States finally has a chance to leverage its past involvement to secure access to key areas where Chinese progress is now meaningful and internationally recognized. For the first time, the notion of mutual benefit in science and technology has a real chance to have substantive meaning. Moreover, the fact remains that there is no major global S&T-related problem—clean energy, climate change, health, water, and so on—whose long-term solution will not require some form of in-depth cooperation between the US and Chinese scientific communities.
Simply stated, access to China’s S&T system and associated resources is now more important than ever. Access to the Chinese high-end talent pool is critical in a world where progress in innovation is increasingly underpinned by participation in a range of globally oriented knowledge networks. There is no doubt that China is embarked on a path that will result in closing the science and technology gap with the United States in appreciable ways across many fields. In all likelihood, China will become a serious competitor across many key domains. Its overall presence in international S&T affairs is undoubtedly becoming more pronounced. Under such circumstances, does the United States benefit more from being engaged with China or by trying to marginalize its ties and diminish its interactions?
Suttmeier provides many things to think about as the United States ponders its overall foreign policy toward China. One thing to remember, however, is that unlike the situation in the 1980s and much of the ’90s, government-to-government ties in S&T are no longer the main game in town. Connections in S&T between American and Chinese companies, universities, research institutes, and think tanks far exceed in both depth and breadth the S&T ties associated with the recently renewed bilateral government-to-government agreement. Most of these organizations, while sensitive to warnings from the FBI and others about intellectual property violations and beyond, continue to remain much more enthusiastic and optimistic about cooperation than the American government.
Maybe they know something more about the future of research and innovation than the current flock of China critics who would like to see the United States walk away from this strategic relationship.
Executive Vice Chancellor
Duke Kunshan University
Kunshan, Jiangsu province, China
Richard P. Suttmeier provides valuable insight into China’s science and technology (S&T) policy and the larger S&T collaboration between China and the United States. As he noted, with the potential decoupling of China and the United States, Chinese S&T policy is at a crossroads: grow more global or more independent?
China has made huge progress in the past 40 years, which can be seen in the number of international papers, patents, and so on. As Suttmeier wrote, “China has made notable achievements in space technology and in civil engineering for large infrastructure projects, including the construction of an impressive high-speed rail network, and is forging ahead in the construction of world-class ‘big science’ research facilities.”
I think there are two important factors behind this kind of progress. First, Chinese S&T has become increasingly embedded in the global system, especially through ties with the United States, the European Union, and Japan. Second, ever since Deng Xiaoping rose to power, the Chinese government has assigned S&T the role of a productive force for economic development and a propeller of international competitiveness. The government has persistently supported S&T development since the 1980s. In 2019, research and development spending in China was about 2.1% of the nation’s gross domestic product, a level comparable with the European Union. China has built a very modern S&T infrastructure, with many megascience projects under way in Beijing, Shanghai, and other places.
But the modern road of Chinese S&T development is different from that of countries in the West. First, there has been deep intervention by the party and government—as Suttmeier put it, from long-range S&T programs, megaprojects, and various technology policies to strategy-making for indigenous innovation. Second, China lacks a culture of science, as Suttmeier also noted. For example, China rewards technological inventions not on the basis of intellectual property rights, but through titles and honors conferred by the government. So for progress to continue, China’s S&T system needs to value innovation more independently, and it needs to be even more deeply embedded in the global science system. Thus, for government officials and policy researchers, whether to couple with or decouple from the United States is always a choice; the questions become when, why, and how. Some officials and researchers in China may not support such coupling.
Could US efforts to decouple from China’s innovation system end up making China more independent and more capable? This is the key question that Suttmeier asks—but it is hard to answer. For one thing, if the United States does decouple, it won’t be good for efforts to deal with the many global challenges of today, such as climate change and disease, or for coordinated scientific development. For another, much will depend on how China reforms its existing institutions for S&T and innovation. China needs institution-innovation driven development as it approaches becoming a middle-income country.
Professor, School of Economics and Management
University of Chinese Academy of Sciences
Richard P. Suttmeier’s article comes at a moment when the United States is increasing its pressure on China not only in trade relations but also in science and technology. Even as scholars are proclaiming “The Collaborative Era in Science” (which also provided the title of a recent book by Caroline Wagner) and open international cooperation is needed more than ever for addressing global challenges, political leaders in the United States see China’s rise in science and technology (S&T) as a threat.
Suttmeier provides a welcome insight into key aspects of the situation, including China’s moves toward self-reliance, leapfrogging, and strengthening the research-industry links. There is an urgent need for carefully assessing the conflict and the possible consequences for China, the United States, and the rest of the world.
These tensions arise 40 years after the 1979 Deng-Carter agreement on S&T cooperation. Since then, US-China collaboration and exchange played an important role—not only for training Chinese researchers and facilitating their integration into global networks but also for strengthening the US S&T workforce and developing close US-China scientific cooperation and interdependencies in global production networks.
Since a majority of the best Chinese scientific talent has stayed in the United States, this largely unidirectional researcher mobility has meant a significant brain drain for China. Talent programs try to redress that situation by offering Chinese researchers working abroad attractive conditions for returning home.
For the United States, attracting and retaining excellent scientists from around the world means substantially benefitting from and depending on foreign S&T talent. Decoupling from China may substantially reduce the inflow of Chinese S&T talent into the United States, disrupt mutually beneficial US-China scientific collaboration, and endanger industrial ties in areas such as information technology, manufacturing, or low-carbon technologies.
Such radical intervention into the complex fabric of international S&T networks may have drastic consequences at the global scale.
US authorities launched their campaign against China’s S&T community because of some severe cases of misconduct to which an appropriate reaction was necessary. But general suspicion and mistrust should be avoided even as the S&T community works to safeguard common rules for international cooperation in areas such as common interests and mutual benefits, openness and fairness, reciprocity, scientific integrity and ethics, and codes of conduct for peer review and conflicts of interest.
The best way forward is for the scientific community to actively and autonomously reaffirm and strengthen the rules and values of S&T cooperation. A workshop on “The Future of Funding Research,” organized by the National Natural Science Foundation of China and UK Research and Innovation in Beijing in December 2019, was attended by 33 agencies from the United States, Europe, Israel, Russia, and Asia; this was a step in the right direction and an example of good practice in the present situation.
Honorary Professor, European and International Research and Technology Cooperation
Vienna University of Technology, Austria
Academic science losing its soul
In “How Academic Science Gave Its Soul to the Publishing Industry” (Issues, Winter 2020), Mark W. Neff analyzes how the academic publishing industry has distorted scientific research and academic practice. Using the example of Mexico, he shows how the research topics, the speed to obtain results, the methods used, the sectors that can access scientific information, and the degree and progress of the exploitation of academics as free labor have been profoundly modified by the corporate publishing system.
As a researcher in Mexico’s National System of Researchers (SNI), I can confirm that all of this is true. It is worth mentioning that the new administration of Mexico’s National Council of Science and Technology has, in just its first year, made important modifications, but it is not easy to change the existing evaluation and monitoring procedures. The result, according to Neff, is a dysfunctional market that greatly influences science policy. This might change, though not without struggle, given the advance of open access publishing—but even then it may prove difficult to avoid the problematic scoring system and other issues.
The problem with the dysfunctional market, however, arises not from the adjective but the noun. Markets are not fixed entities. They are dynamic and show two trends in any economic sector: concentration (increase) of capital, and centralization (mergers and acquisitions). Neff gives examples of both trends with amounts, names, and dates. Therefore, it is the proper functioning of the market, not its dysfunctionality, that is at stake.
Beyond Neff’s criticism of how corporate publishing houses can adversely affect autonomy and accountability in the sciences, there are other areas that need attention, such as the role of technology in science. Suppose scientists have freed themselves from corporate control of publishing houses. Does this mean they have freed themselves from corporate control of the physical and virtual equipment they use? I have in mind such things as computers and the internet. The question is pertinent.
Researchers work with computers running software that forces users to spend hundreds or even thousands of hours overcoming absurd updates, to say nothing of the associated costs. True, there is free software, but even that runs primarily on corporate computers.
Also, laboratory researchers are subject to protocols whose required indicators are built into the physical equipment itself, and results must meet those indicators to be accepted by regulators. All such equipment is controlled by corporations and dominant countries. For example, if a researcher discovers that a particular chemical element produces toxic responses in the second generation of mice, this information won’t be given proper attention if the PBPK (physiologically based pharmacokinetic) simulation system, or a similar one, accepted by government regulators discards that analysis, or if the accepted software looks only at toxic responses in the first generation.
Technology is not neutral, and the concentration and centralization of markets become ever more entrenched.
Professor, Universidad Autónoma de Zacatecas, Mexico