Carbon capture and sequestration technologies will play an essential role in climate change mitigation, so it is not surprising that California’s plan for achieving negative emissions, described by Roger D. Aines and George Peridas in “Getting to Zero—and Beyond” (Issues, Spring 2020), hinges on their successful deployment. One important part of the plan calls for capturing and sequestering carbon dioxide from bioenergy facilities that convert biomass into fuel or electricity.
Computer models that assess decarbonization scenarios adore systems that combine bioenergy with carbon capture and sequestration (Aines and Peridas refer to this suite of technologies, known as BECCS, as “the first energy system ever invented by a computer”). But doubts about BECCS’ commercial and environmental viability persist, largely because the technology faces many of the same complications and uncertainties that plague conventional bioenergy. In theory, processes that turn biomass into energy products can have a range of positive and negative impacts on climate. The climate benefits typically decrease as production scales up, because additional demand for biomass encourages landowners to convert natural ecosystems into plantations, often in ways that transfer terrestrial carbon to the atmosphere. In practice, very few commercial bioenergy projects deliver significant climate benefits.
Aines and Peridas correctly identify “waste biomass that can be responsibly and realistically sourced without violating ecological, environmental, logistical, and economic constraints” as the most appropriate feedstock for BECCS systems, but acquiring that biomass in massive quantities is a tall order. Sourcing significant amounts of appropriate waste biomass is not something the biomass industry is currently doing or knows how to do. Selecting, aggregating, and transporting waste biomass present a complex and expensive set of problems, and to date the biomass industry has spent more time denying those problems than addressing them. As a result, much of the biomass that is now sold as “waste” to energy producers comes from live trees that were actively absorbing carbon from the atmosphere prior to being harvested.
Lots of work needs to be done to resolve, if possible, these supply chain challenges before BECCS can be scaled up in a meaningful way. The Clean Air Task Force, a group of climate and energy experts, is working with policy-makers and other stakeholders to ensure that bioenergy-related carbon fluxes are accounted for in a scientifically rigorous manner, so that biomass harvesting targets the most climate-beneficial feedstocks.
In the meantime, there are hundreds of natural gas-fired power plants in the United States that need carbon dioxide emissions controls. They offer ample opportunity to deploy and refine carbon capture technologies, to build the connective infrastructure for transporting carbon dioxide, and to develop and improve injection and geologic storage capability. BECCS systems can eventually take advantage of this downstream carbon dioxide management network, provided that careful research and testing indicate that the supply-side challenges posed by BECCS can be overcome.
Clean Air Task Force
Mapping more detailed pathways that states and nations can use to help achieve global net-zero greenhouse gas emissions is critical. Roger Aines and George Peridas are to be congratulated for taking on the challenge and dreaming big.
But ambitious dreams can be dangerous too. There is an inevitable tension between describing the possibilities and the uncertainties. Some hype is needed to win political attention and funding for essential carbon removal. But too much hype may actually undermine efforts to reduce emissions from other important sources, including some that pose hard technological challenges.
And there is hype here, most clearly in the case of bioenergy with carbon capture and storage (BECCS), presented as if it were in Schrödinger’s box, simultaneously “available and ready to be scaled up” yet “nearly imaginary.” There are indeed some forms of BECCS in use—for example, in ethanol fermentation—but these don’t actually remove net carbon, while the forms proposed by Aines and Peridas remain untested and unproven, technically and commercially.
The environmental costs of locking in continued generation of biomass wastes for BECCS use, or the ongoing removal of biomass for fire management, must also be examined more carefully. Moreover, such models may not be transferable to other jurisdictions (especially given the widely varying availability and public acceptability of geologic carbon dioxide storage).
Yet carbon removal will be needed—as well as dramatically accelerated emissions reduction. To contribute fairly to climate justice, rich jurisdictions such as California need to achieve net-zero much sooner, and will need to go beyond even that goal, to be net-negative. So a program such as this one must recognize the risks of mitigation deterrence, and build in measures from the start to minimize the risk and ensure that both carbon removal and emissions reduction can be delivered.
This will mean, for instance, devising effective separate funding, not relying on offsets. And it will demand clear separate targets for maximizing emissions cuts in the short term. Then perhaps California’s carbon dreams can become reality.
Professor in Practice
Rethinking the Green Revolution
In “How to Improve the Social Benefits of Agricultural Research” (Issues, Spring 2020), Marci Baranski and Mary Ollenburger provide a timely and insightful analysis of the Green Revolution and its modern-day successors, such as agricultural research for development programs in Africa, that have excluded large numbers of smallholder farmers in their efforts to spearhead agricultural transformations. Inclusive innovative approaches, they argue, will benefit both people and research.
Their timing couldn’t be more appropriate, as they provide insights into four issues that hasten the call for smallholder-driven research.
First, the disruptions triggered by COVID-19 reveal that the global food production and distribution networks designed through decades of agricultural research and technology development are themselves a source of major vulnerability. The pandemic’s swift, damaging impacts offer dramatic new evidence of acute hunger and food and nutritional insecurity among poor smallholders and other precarious populations.
Second, the disruptions are likely over the long term to worsen preexisting inequalities in global food systems. Numbering as many as 2.2 billion globally, smallholders produce food on farms smaller than their national averages. Although highly heterogeneous worldwide, most of them are hardscrabble farmers, often representing diverse ethnic groups.
Smallholders must be viewed through a lens that is “un-Romantic, nonteleological, and anti-fetishistic,” as colleagues and I recently wrote in response to the widespread tendency to assume entwined beliefs of imminent smallholder demise and stereotypical traditionalism. Yet often precarious conditions do not detract from smallholder food-growing and resilience. In Peru, for example, smallholders furnish an estimated 80% of the country’s food and a large majority of the agricultural biodiversity that underpins critical food-system resilience. By comparison, proponents of urban agriculture, myself included, readily acknowledge this sector is likely to expand to a maximum of 5% to 10% of food production in most countries.
Third is the convincing portrayal of the global forces that exclude smallholders from agricultural research. Authoritarian geopolitics, such as those of the Cold War, have tended to drive agricultural research toward excluding smallholders or including them nominally, as occurred in the Green Revolution driven by the United States in India, the Philippines, Mexico, Indonesia, and elsewhere. Recent resurgence of authoritarian geopolitics therefore redoubles this article’s relevance.
Fourth, the article’s timing coincides with the increase of agricultural research in agroecology and agrobiodiversity. This research has generated important advances regarding environmental and social scientific aspects of sustainable farm resource management. Though often rooted in the practices of smallholders, the sustainability innovations of agroecology and agrobiodiversity do not automatically or necessarily deliver social benefits to poor farmers.
New analysis of the Green Revolution and its successors, together with the clarion call for thorough inclusion of smallholder-driven research, is deeply resonant with the present needs of people and societies worldwide as well as those of science, technology, and policy.
Department of Geography and Programs in Rural Sociology and Ecology
Marci Baranski and Mary Ollenburger argue that although the Green Revolution ought to be celebrated for what it did to feed the world’s growing population, its agents often ignored the complex issues of the regions they hoped to “improve,” increasing inequalities and failing to live up to the real promises of poverty alleviation. The authors are correct to point out that top-down initiatives tend to be problematic, but bottom-up, community approaches are no panacea.
Scholars of development have long pointed out that universalizing approaches to modernization and agricultural development have consistently failed to achieve their goals, often refreshing colonial tropes and deepening inequalities. James C. Scott’s “high modernist” framework provides a backbone to much of this discourse. Though Scott defined high modernism as a unique category of authoritarian and technocratic practices, the condescending move to simplify the supposed messiness of rural lives has had a central role in many of the development and modernizing attempts of the twentieth century, including the Green Revolution’s linear model.
But this literature often goes awry when trying to find the corrective to this universalizing method of development in its particularistic opposite. In fact, although historians and activists focused on contemporary community engagement are quick to point out the issues when “modernization comes to town,” examples of development practices centered around communitarianism have also failed, sometimes catastrophically.
The historian Daniel Immerwahr, in his book Thinking Small, argues that attempts to pursue development through communitarian strategies held up utopian conceptions of “community” and often exacerbated existing inequalities. For example, the mid-twentieth-century panchayati raj system in India—developed through collaboration between the United States and Jawaharlal Nehru’s government—sought to develop rural India through a decentralized organization of village councils, but ultimately led to what the Nobel laureate Gunnar Myrdal called “more, not less, inequality.”
Baranski and Ollenburger are correct to suggest that community participation must be a part of contemporary agricultural research for development, but it cannot be the only driver of positive change. To learn from both the successes and failures of the Green Revolution, it will be necessary to acknowledge the difficulties of both the linear model and the communitarian approach.
Research Analyst, Food & Agriculture
The Breakthrough Institute
Marci Baranski and Mary Ollenburger provide a good read for anyone concerned about the future of agriculture. Modern agriculture, which began in the mid-twentieth century, is the result of rapid advances in science and technology, investments in infrastructure, and policy support for agriculture. Commonly linked with the Green Revolution that began in the global South and Southeast Asia, modern agriculture is credited with successfully supplying a large volume of food to the global market.
The linear model of innovation, as the authors argue, continues to dominate this century’s vision of agriculture development. This is despite the fact that its success has come with the heavy cost of widespread degradation of land, water, and ecosystems, along with forcing many smallholder farmers off the land and into urban slums and shantytowns.
Characterized by the use of new crop strains and greater input of synthetic fertilizer, water, and pesticides, the Green Revolution was mutually reinforced by a complex of science and technology policies that included but were not limited to extension services, input subsidies, marketing of agricultural products, and farmers’ newfound enthusiasm for innovations. The authors make the case that the Green Revolution was not uniformly successful across geographic regions, and did not eradicate hunger in countries where it was introduced and practiced the most.
They also correctly point out that agricultural innovations must not only focus on improving the “existing farming practices” but also should compete with other “investment opportunities” to become an attractive feature of society.
In fact, by converting fields of traditional crops into monoculture expanses of rice and wheat, the Green Revolution shrank the genetic base on which the food supply of millions of smallholder farmers relied. Agrobiodiversity is the central tenet of smallholder agriculturalists across the world. It contributes to ecological stability, system resiliency, and overall productivity. The focus of the linear model, especially its emphasis on enhancing production of a few cereal crops, is ill-suited to tackling problems of food security and poverty alleviation.
For all its innovativeness and achievement, as the authors correctly argue, the linear model can respond to emerging challenges only through a new gestalt of concepts that demand different science and technology policies. The focus must be on ensuring that smallholder farmers can produce more food and other agricultural commodities sustainably under conditions of declining arable land, limited irrigation water, dwindling resources, and reduced labor supply, along with the stresses of climate change.
To expand on the points made by Baranski and Ollenburger, I would like to offer three considerations based on my own experience.
First, it is important to defend the gains in agriculture that have been made. This may involve integrating practices of both modern and traditional agricultural systems in science and technology policies. It also involves the promotion of policies that emphasize the sustainability of available natural resources and holistic approaches to agricultural development, as opposed to the commodity approach of the Green Revolution.
Second, we must extend our technological prowess in agriculture for the benefit of smallholder farmers around the world, whose varied and risk-prone crop-growing environment makes them not only vulnerable to the vagaries of climate but also less likely to invest in modern technology. The focus should be on providing smallholder farmers with location-specific technology, services, and policies to achieve the dual goals of enhanced food security and poverty alleviation.
Finally, science and technology should make new gains, especially for the rainfed (nonirrigated) and dryland agricultural systems that constitute more than 60% of the farm area in the developing world.
Overall, policies are needed that emphasize the knowledge base of farmers, promote genetic diversity of crops, and give farmers a basket of options to choose from.
School for the Future of Innovation in Society
Arizona State University
Risks and rewards of gene editing
In reading the interview with Jennifer Doudna (Issues, Spring 2020), I was struck by her comment regarding the importance of bioethicists in developing guidelines for the management of the CRISPR gene-editing technology. This echoed an article in the Fall 2019 issue, “Incorporating Ethics Into Technology Assessment,” by Zach Graves and Robert Cook-Deegan, which described the role of bioethicists in the assessment of new biotechnologies. For scientists, then, this appears to be the go-to approach for resolving the moral and ethical issues associated with such technologies, issues acknowledged to lie outside the purview of science.
But it is not clear to me a priori why this is so, as there are after all many other entities in society contending for authority on such issues. Why not, for example, submit them for analysis and arbitration by, say, the Southern Baptist Theological Seminary or the Islamic Seminary of America?
Who, I wonder, are these bioethicists and how are they selected? Given the outsize role they are being assigned, these questions merit careful consideration. I am going to surmise that they are largely drawn from the ranks of elite universities, which, as presumably with the religious institutions noted above, require vetting for intellectual orthodoxy before admission, and are grossly nonrepresentative of the demographics of the country as a whole. Are they in effect high priests of the belief system subscribed to by most scientists? That would explain the evident eagerness of scientists to invest them with ethical and moral authority.
Certainly I would encourage the bioethicists to try to use their power of moral suasion to make their case, but I would not deny this opportunity to others. In this regard, I find it disturbing that representatives of the nation’s faith communities seem to have been completely excluded from these deliberations.
There are now almost no children with Down syndrome being born in Iceland. This of course is because all fetuses identified with this condition are summarily aborted. I do not know if there is a consensus among bioethicists about the moral or ethical rectitude of this situation. But if there is, I do know that I would feel under no obligation to accept it as gospel. To my mind, any such consensus has no more legitimacy than, say, that achieved by a group of people with Down syndrome convoked to consider the selective abortion of bioethicists.
The discoveries of Doudna and her colleagues are amazing, even awe-inspiring. They appear to confer, dare I say it, a god-like power. But I feel a deep unease at the notion that the new world built with this power will rest on moral and ethical foundations laid by bioethicists alone.
In this interview as in past public comments, Jennifer Doudna opens the door to using the CRISPR platform she helped develop in the service of a hugely controversial enterprise: altering the genomes and traits of future children and subsequent generations. She does so under the banner of responsible science and policy. But as with similar comments by supporters of heritable genome manipulations, her responses shed little light on what criteria would constitute “responsible use,” how irresponsible uses could be avoided, and how this immensely consequential decision might be made in an open and democratically responsible way.
To be sure, Doudna notes that “the main challenge in embryo editing is not scientific … but rather ethical,” and raises important questions about the feasibility of consent by future generations, the difficulty of distinguishing between medical applications and enhancements, and the harm that eradicating genetic conditions might bring to people living with those conditions. But she gives no hint about how these challenges could be met. Tellingly, she fails to mention the broader social justice alarms about heritable genome editing: that the accumulation of individual choices about the traits of future children, shaped by cultural pressures and market forces, would exacerbate existing inequalities and discrimination, introducing a new form of high-tech eugenics.
With the stakes this high, meaningful public involvement in policy decisions about heritable genome editing is critical. But Doudna’s call for “a broad public conversation” about heritable genome editing is undercut by her assertion that scientists are the parties “equipped” to “guide” the conversation. It’s difficult to avoid concluding that in this view, public participation is acceptable only at the edges and after the fact: it may nibble at questions of how heritable genome editing is to be conducted, but must refrain from considering whether it should proceed at all.
Doudna gives only the flimsiest of reasons for rejecting calls for a strong, enforceable moratorium on heritable genome editing made by many prominent scientists, biotech industry figures, policy experts, public interest advocates, and others. She voices concern about maintaining public support for using CRISPR in basic research and in treating existing patients; a moratorium on heritable genome editing would in fact strengthen public trust.
A truly responsible approach to heritable genome editing requires holding off on questions about how its development and use should proceed. Instead, we need to ensure adequate time and resources for meaningful democratic debate about whether modifying our children’s genomes will help us build a just and inclusive society.
Center for Genetics and Society
Future of American higher education
I admire the educational egalitarianism that Michael M. Crow and William B. Dabars embrace in “The Emergence of the Fifth Wave in American Higher Education” (Issues, Spring 2020), and I support their call for a transition to universities where “broad accessibility and academic excellence are complementary and synergistic.”
They liken this to a “coupling within single institutions the research excellence of the University of California system with the accessibility offered by the California State University system.” Having spent 30 years in the classrooms of UC and having studied both systems, I think Crow and Dabars’s ends are good, but the financial means are not adequate to those ends.
The goal of Fifth Wave universities is to bring nonelite students—the authors say “the top quarter or third of all 18- to 24-year-olds”—to “internationally competitive levels of achievement.” It is also to produce more social value in research. How would the Fifth Wave accomplish these tasks?
The answer for education, Crow and Dabars say, lies in harnessing educational technology to teach students. The authors’ institution, Arizona State University, has the country’s most advanced online learning systems, with eAdvisor and related services making the school arguably the best-case working model of hybrid learning.
The answer for research is focusing on technology for the whole society, not just for big corporations. I will assume that the Fifth Wave differentiator is local technology for regular people. ASU has created a remarkable number of schools and research centers that address the concrete challenges of the overall social and environmental situation in its region.
The problem is that genuine versions of world-class learning and whole-society research cost enormous sums. Most of ASU’s growth has been in online programs: these can improve pathways for the less-prepared students ASU rightly wants to bring to college completion. But can online or hybrid instruction make these students internationally competitive intellectually? Almost certainly not. In the absence of transformative technology, less-prepared students need more attention from instructors and support staff than do elite students, which costs money.
Research poses a similar problem. Research in science, technology, engineering, and mathematics—the STEM fields—preferred by policy-makers and business people takes large revenue streams to do well. Extramural grants to universities require various kinds of support in excess of the “indirect costs” covered by sponsors. Public universities typically provide 24 cents of every dollar of research expenditure out of their own pockets; as a particularly ambitious campus, ASU puts in as much as 39 cents of every dollar.
In addition, ASU’s whole-society research naturally moves it toward building high-cost forms of public infrastructure. Does ASU need to build some or all of its local communities’ solar power grid? Even with good local partners the school doesn’t have the budget to realize Fifth Wave goals.
I particularly like Crow’s signature call for universities to judge themselves by whom they include rather than exclude. ASU is tirelessly self-inventive, attuned to multiple needs, and exciting to follow. But the Fifth Wave is a symptom of the retreat of government from society. It asks universities to provide public goods that governments have stopped properly paying for, and that they can’t pay for either. To use Fifth Wave terms, forming human and intellectual capital is a cost, not a revenue stream, and there’s no point in trying yet again to convince ourselves that public-good universities can be self-supporting by redesigning themselves and using more technology.
ASU is right to want to lead other universities toward making less-prepared students world-class, and rebuilding a sustainable local community and state—and even nation. But the government and its publics are going to have to pay ASU properly to do it. Decades of underfunding have scoured a valley that is far too wide for the Fifth Wave to cross. I’d like to see Crow and Dabars work explicitly on the government funding bridges to the other side.
University of California, Santa Barbara
The COVID-19 pandemic colors all critiques and predictions of the future for higher education. Nonetheless, Crow and Dabars, in calling for a Fifth Wave in higher education, provide a persuasive argument that highly selective public and private universities with strong research profiles should open their doors to a wider swath of the American population. My own university system has been extremely slow to broaden access by, for example, offering more online degree programs—the path that the multicampus Arizona State University under Crow’s leadership has taken with robust results, including generating additional income in the midst of cuts in state funding. In many ways, the early strategic focus on reorganizing ASU’s academic programs and integrating faculty into online degree programs, expanding the school’s total enrollment more than twofold, has become an even more powerful model in the COVID-19 era.
As Crow and Dabars state, virus or not, there is a great need to expand access to postsecondary education and to provide degree programs that meet future labor needs, enhance personal growth, and help mitigate growing socioeconomic inequality. But where I respectfully disagree is the seemingly sole focus on research-intensive universities as the primary vehicle for meeting this need. This is accompanied by a rather old-fashioned notion that teaching, and therefore degree programs, cannot be excellent and fit for purpose at various other institutional types, such as community colleges or teaching-intensive public universities and small colleges.
Why such a narrow argument? One of the hallmarks of America’s system of higher education is the diversity of institutional types. Although there are many poor performers, particularly among a large portion of for-profit schools, there are many that excel in their sphere of responsibility. Excellence in teaching and learning is not the exclusive realm of research universities. What is more, the current crop of elite public and private universities, with highly selective admissions, cannot sufficiently grow in enrollment and offer the array of programs needed to serve a changing labor market and promote socioeconomic mobility.
The world of work is rapidly changing. Middle-skill jobs that typically required a high school degree are in decline. Many jobs lost over the past two decades will not come back, a trend accelerated by the pandemic. This is only one part of the larger story of growing inequality and economic dislocation in the United States.
Higher education institutions of all types, but particularly public multicampus systems that enroll most students, need to adapt rapidly by rethinking their curriculum and incorporating, for example, short-term job-related badges and certificates into traditional bachelor’s degrees, and by making a sharper break from the old school model that focuses on time in a classroom versus the quality and nature of learning experiences. If not obvious already, online courses and degree programs are one major path for providing greater access and increasing educational attainment rates. But it should also be noted that attrition rates are high among the traditional college student age cohort, and there are ethical issues regarding the probable clustering of lower-income and minority students into the online world.
Elite universities need to be part of devising needed reforms, but efforts also need to include the larger landscape of American higher education. I don’t sense that this reality is lost on Crow and Dabars. Their passion is to influence a certain sector of elite universities toward a more progressive, inclusive, and innovative model. That is the compelling part of their Fifth Wave argument.
John Aubrey Douglass
Senior Research Fellow and Research Professor
University of California, Berkeley
Increasing diversity in STEM faculty
In “A New Model for Increasing Diversity in STEM Faculty” (Issues, Spring 2020), Arri Eisen described many of the attributes and successes of the FIRST program—Fellowships in Research and Science Teaching—at Emory University. As an alumna, I would like to emphasize the importance of FIRST for me because of its unique blend of training, mentorship, and cohort-building, and especially how the values of the program radiated out from me to affect the future of STEM. As a young Latina in a biomedical PhD program, I saw few female and no minority professors. I realized long ago the importance of seeing who you can be. I joined the first cohort of FIRST because I believed in its mission.
I remain involved with FIRST as a teaching mentor and liaison from Morehouse College, a historically black college/university for men, where I have been on the faculty in biology for 17 years. I teach and mentor with empathy and high expectations. I bring research students to conferences to show them who they are, and to show the world of STEM, which might otherwise ignore them, that my students are serious researchers with the talent, skill, and intelligence to make important contributions in exploring the natural world.
I am in a unique situation where I can be a mentor and promote the good work of other minority scientist colleagues and connect them with my students. My success is largely due to FIRST’s cohort-building and invested mentors who gave me the experience and confidence I needed. My mentors saw me for what I could offer and pushed me toward excellence.
To the question of how to increase the representation of underrepresented groups and women in the STEM professoriate, Eisen proposes the answer: FIRST. I agree. I have advised and taught approximately 2,500 African American men and several dozen African American women in biology, mentored nearly 50 students in research, and obtained over $4.5 million in scholarship, training, and research funding.
I have carried forward the principles with which I was trained in FIRST by mentoring 11 FIRST fellows, all but one of whom stayed in science. Imagine all the people these students and postdocs in STEM will reach. Now, multiply that by the 18 FIRST alumni who are professors in the Atlanta University Center, or the nearly 200 FIRST alumni across the country. We are making a difference.
Here’s to 20 more years!
Valerie K. Haftel
Associate Professor of Biology
(First class of FIRST, 2003)
Data from the National Center for Education Statistics indicate that the number of doctoral degrees conferred annually to African Americans, Hispanics, American Indians/Alaska Natives, and Pacific Islanders increased by at least 115% between 1977 and 2017. But this has not translated into similar increases in tenured or tenure-track academic faculty. This points to the pipeline issue: the path from graduate school to postdoctoral fellowship and on to junior faculty appointments at universities is a “leaky pipeline” that loses minority scholars at each stage.
Because each step in the transition process poses both unique and crosscutting challenges, addressing these gaps will require targeted solutions. The solutions that would enable underrepresented minorities to cross this chasm are not yet apparent; thus, new models for increasing diversity are needed.
As Arri Eisen describes, the Fellowships in Research and Science Teaching (FIRST) program has supported several hundred postdoctoral fellows, many of them underrepresented minorities, myself included. In my case, the program helped me transition from graduate student to postdoctoral fellow and finally to my career path, which has encompassed both government service and now academia. Others have chosen career paths that span the breadth of our diverse backgrounds in various sectors (e.g., government, academia, industry, and nontraditional fields).
As Eisen noted, FIRST provides postdoctoral scholars opportunities to explore potential career options as they transition from student to postdoctoral fellow and finally into their career. FIRST communicates that a worthwhile career does not require a person to be an academic, which is a significant yet subtle message of the program.
Still, many underrepresented minorities want to give back to their community and remain in science; this new model provides the avenue to explore these career paths.
The importance of the FIRST program for my career cannot be overstated. The mentorship, access to a community of fellows, support from program staff, and teaching and research experience helped me gain confidence in my own decision-making.
This is critically important, because confidence in your own decision-making as an underrepresented minority can be developed only in a safe environment, and this element cannot be taught. Having such confidence is necessary to move forward in your career, discern new opportunities and challenges, and ultimately reach a career destination best suited for your circumstances.
This is the “secret” ingredient, in my opinion, that has made FIRST successful. FIRST provides opportunities to explore, and significant support for many fellows such as myself to appreciate that career trajectories are nonlinear, and that it is possible to transition from one career path to another, as my own experience illustrates.
Emmanuel K. Peprah Jr.
Director, Implementation Science for Global Health
Assistant Professor, Global Health & Social and Behavioral Sciences
New York University
Rebuilding the ivory tower?
David Hart and Linda Silka say that when they began building an academic center for doing science that makes a difference, they found no comprehensive field guides. Thanks to their pioneering work, described in “Rebuilding the Ivory Tower” (Issues, Summer 2020), we now have one, and it provides useful signposts for creating institutes that connect research with societal needs.
Their lessons are also helpful because there are numerous obstacles to developing real-world, solutions-oriented research. I appreciate in particular how Hart and Silka reframe many of the challenges they encountered as opportunities. For example, rather than lament the difficulties in bringing together interdisciplinary researchers and community partners to work together, they embrace them and encourage perseverance that can lead to sustained collaboration. While noting how differences in values or perspectives among academics and others can lead to conflict, they describe these differences as a source of creativity and innovative problem solving.
Still, academic institutes that want to align research with societal needs must grapple with several key issues that the authors didn’t cover:
How do we effectively integrate diversity, equity, and inclusion throughout science that informs action? Issues of power, privilege, and who is at the table are central to connecting research and decision-making. Doing science in a more inclusive way makes it more relevant to a diverse set of stakeholders and more effective at addressing the social dimensions that underpin many sustainability problems.
What are the impacts? How do we know if strategies for aligning research with societal needs are working, or if they’re making a difference? To validate the claim that this approach to research produces more solutions, we need to deal with difficult-to-track outcomes, complex and messy decision-making processes, and the long-term nature of effectively managing sustainability problems.
Nevertheless, it’s inspiring to learn from Hart and Silka’s experiences. The institutional change they call for is underway. Academic centers around the world are mobilizing scholars and leaders to solve societal problems. For example, the Gund Institute for Environment at the University of Vermont, where I work, catalyzes collaborative research at the interface of four themes (climate solutions, health and well-being, sustainable agriculture, and resilient communities) and partners with leaders in government, business, and society to develop solutions to urgent global issues.
The ivory tower won’t be rebuilt in a day. To take this work to the next level, institutes need to work together more. We need to build a community of practice and collaborate strategically. Imagine if we replicated projects in our own states and then evaluated and presented them together, or formed an annual directors roundtable, or leveraged strengths and lessons across partnerships that span academic research and other sectors. Together, we can create a broader shared culture for interdisciplinary research that solves real-world problems.
Director of Policy
Gund Institute for Environment
University of Vermont
David Hart and Linda Silka’s blueprint for fundamentally changing their university is particularly telling given that they are based within a land-grant college steeped in the history of cooperative extension. The Smith-Lever Act of 1914 built the extension system on the foundation of the land-grant colleges with the express goal of linking university research to real-world solutions. In many respects, the act was hugely successful.
But as the authors argue, this success remains narrow and limited, and the promise of universities better aligning knowledge and action has faltered. Universities indeed produce “more and better science,” but they fail to effectively connect that science to decision-making in order to make a difference in the world outside the ivory tower.
This is where an even more radical transformation in the academy is required. The authors identify the problem, but do not take the next logical prescriptive step. They note, for example, that several members of the National Academy of Sciences who advised them “felt the risks to such junior faculty during the tenure review process would be too high, and warned that participation in a solutions-oriented interdisciplinary project focused on community stakeholders would adversely affect their publication rate, evaluation by disciplinary peers, and other traditional criteria in tenure review processes.” This commentary is at the heart of why the authors must work so creatively and with such determination—for they are countering a system that fundamentally works against their goals.
Until the academy fully rewards faculty through its hiring and promotion processes for the values inherent in the Hart and Silka experiment, these kinds of solutions-oriented interdisciplinary projects will have impact only at the margins. Their success in creating a dynamic and effective link between their university and stakeholders outside is a huge testament to their team’s and partners’ ingenuity and perseverance.
But if we really want to solve the authors’ “wicked problems” at the scale that is required, and we really want to leverage the expertise of the academy to “solve real-world problems and create a brighter future,” universities will have to transform the entire structure of rewards and incentives that drives faculty promotion. When tenure committees routinely look favorably on a junior faculty member’s “solutions-oriented interdisciplinary project focused on community stakeholders,” then we will know that the ivory tower has been truly rebuilt.
David W. Cash
John W. McCormack Graduate School of Policy and Global Studies
University of Massachusetts Boston
David Hart and Linda Silka revisit enduring principles with refreshing insights. The tortured relationship between science and society is no secret by now, nor is the dichotomous nature of universities as institutions that uphold longstanding traditions at odds with their own research. In this context, the authors’ reflections on perseverance are, to me, the most revealing and timely.
Perseverance is critical not simply because the scientific community is bogged down by stale norms, and not simply because credible science depends on data and results that can be proven over time. Perseverance is critical because sustainable solutions require conflict resolution. Hart and Silka embrace conflict as “raw material in crafting new ways of understanding and solving societal problems.” They are not simply turning a negative into a positive.
Their emphasis on perseverance is nuanced and sympathetic. They describe “stick-to-itiveness” as important to keep people at the table, not to endure at any cost. Right now, we are painfully aware of how fragile relationships can be. And we are challenged by how quickly the context for sustainability can change. People are increasingly led to believe that good work is easy and easily reproducible, but as the authors state early on, people are a fundamental part of any sustainability issue. People have attitudes, beliefs, and preferences.
Hart and Silka ask, Why would anyone’s beliefs, attitudes, norms, and preferences be easily won over? Over the course of their article, that nagging question gives way to another more hopeful take. Sure, doing work for public benefit is hard, but why should that make it any less rewarding?
Deputy Director of Climate and Science Risk Communication
City of New York
David Hart and Linda Silka’s article was both encouraging and troubling. It was encouraging, as their successes in doing good interdisciplinary and transdisciplinary research are mirrored around the world. Even from the vantage point of Tasmania, Australia, a reflective case study from Maine provides a valuable contribution.
What I found troubling were the big challenges that the authors evade, which have substantial implications for our societies. The authors frame necessity through the familiar realization that “pushing back the frontiers of knowledge” was not enough. They jump, pragmatically, to what they could do about this predicament.
But surely, the job of the academy is also to unpack root problems. Why was “doing their part” as scientists not (perhaps no longer) enough? There are many answers to this question, and significant scholarship that attends to them. In brief, this literature speaks to a deep disconnect between how societies make knowledge and how they govern. If it were just that our scientific forms of knowledge production were inadequate, researchers might simply get outside the ivory tower to learn with indigenous people, fishers, farmers, and others.
But this is not the only reason researchers are engaging with diverse stakeholders. We are also working to find and frame problems, and to build collectives that can address them. It is a long bow to call this research. It is governing, and it is necessary. We must do what we can.
My too-brief answer regarding necessity also reveals why the work is so difficult. Governing is hard at the best of times, but harder when you have no formal mandate to define and implement policy options. As researchers, we trade on good will, credibility, and short-term funding. We find ways forward. We build alliances with those who do have mandates, capital, and influence—and hopefully with those most affected by change.
We do all this because resources are being degraded and livelihoods being lost, among other problems. But is there an underlying reason that is rarely spoken of? Could there be a governance vacuum that must be filled? This is the argument some of my Australian colleagues make: that the winnowing of knowledge and capacity within governments is leading to situations in which outcomes are outsourced to universities and consultants. Grand challenges are hived off, piecemeal, as short-term projects.
Places such as Maine and Tasmania are at an advantage here because our thick networks and deep commitments to communities stand in, to some degree at least, for the institutional mandates, memories, and capabilities that are required to govern complex change processes over long periods. But these networks cannot replace good institutions for governing, and scientists need to say so, even when doing so is against our short-term interests.
University of Tasmania
The autonomy and integrity of science
One indicator of the quality of scholarship is its shelf life. Findings from good science are often not immediately used or useful, so realizing their value requires delayed gratification. My favorite example is the recommendation by the National Academy of Sciences, circa 1864, that the United States adopt the metric system: apparently we are still in the implementation phase.
In other cases, the intuitive worth of scientific knowledge can be abruptly undermined by exogenous events. Tight theories of the erosion of social capital, for example, however valid they seemed at the time, are unhinged by evidence from the COVID crisis: we may be physically distant, but we have found resourceful ways to stay socially close.
“The Changing Temptations of Science,” by Stephen P. Turner and Daryl E. Chubin (Issues, Spring 2020), a tour de force, eludes both those problems. (Disclosure: I’ve known Chubin for 35 years and have benefited mightily from his friendship, mentorship, and collegial criticism.) As it was written before the pandemic (the word doesn’t come up), one must marvel at its relevance in the light of current circumstances. And the essence of its arguments, even if not instrumentally applicable to short-term political or economic decisions, will guarantee its durability, especially if it becomes required reading, as I would heartily recommend, in courses on science policy and the policy sciences.
The timing of the article is, indeed, uncanny. As I write this, we have news of potential restructuring and increased funding of the National Science Foundation; reports of hundreds of millions of public and private dollars allocated worldwide toward finding a vaccine for COVID-19; demonstrations of ignorance, distrust, and disdain for factual evidence among key political leaders; and renewed calls for public accountability for investments in research and other public goods. Turner and Chubin’s key themes and arguments apply presciently to those issues. For example, they write that “big science … mean[s] big money, and big money mean[s] a need to justify the expenditure.” Finding a cure for COVID-19, a task likened to the Manhattan Project for its magnitude and urgency, clearly will call attention to the benefits and risks of the size, source, and effects of the funding it will require.
Related is the question of incentives faced by scientists in an age of increased competition for public and private support: will pressure on scientists to please current and prospective backers fuel the “temptation to … overpromise … to sacrifice the pursuit of intellectually promising lines of work to those that can be funded, to produce work that is … scientifically trivial?” That’s a scary prospect, perhaps overstated but definitely worthy of attention, especially as the externalities of shoddy research on vaccines could be catastrophic.
In hopes that their article will sustain public interest, I offer three suggestions.
First, it would be good to facilitate a more comprehensive deliberation on the origins, roles, and potentially negative consequences of productivity and accountability—cherished linchpins of the nation’s economy and democracy—as determinants of the structure and governance of science.
Second, in a society that long ago rejected “philosopher kings” and, by extension, “scientist kings,” the future of the scientific enterprise should be informed by, but not left to, scientists. Designing organizational arrangements to facilitate such a process, which relies on mutual respect, contestation, and collaboration, should benefit from decades of experience with strategies for bringing evidence to bear on policy.
Finally, now may be the time to invest in a new science of science policy, one that more fully integrates disciplinary knowledge from the political, economic, and organizational sciences. Of course, where the funding for such a science will come from, what incentives it might create, and how to evaluate intended and unintended consequences are questions for which we can be grateful to Turner and Chubin for providing an enduring framing resource.
Dean of the Graduate School of Education and Human Development
The George Washington University
Immediate past-president of the National Academy of Education
Nonresident senior fellow of the Brookings Institution
In their very interesting article, Stephen Turner and Daryl Chubin ask whether we have reached “the inevitable conclusion to the story of science.”
Their article came before COVID-19 transformed the world. Suddenly, scientists are featured on the morning news; a best-selling bobble-head features Anthony Fauci, a leading expert on infectious diseases; and the US government has created the controversial Operation Warp Speed to coordinate and facilitate urgent development of a coronavirus vaccine. No longer is the place of scientists in question. The issues become how much money, global cooperation, and time will be required to develop a reliable vaccine, therapeutic pharmaceutical, or hopefully both.
Although a few people envisioned a global pandemic, none predicted this specific virus with its capability to upend daily lives, devastate economies, and reveal stark political, racial, and social inequities. COVID-19 also underlines the scientific paradigm shift from the twentieth-century dominance of physics to the twenty-first-century focus on biology. For example, Turner and Chubin emphasized the role of physics and the rise of Big Science and teamwork critical for the Manhattan Project, and they noted that the later controversy over solar neutrinos carried “no urgency, no political or economic stakes.” In contrast, urgency, political and economic stakes, and social impact drive biology’s effort to solve COVID-19.
Biology is not the sole discipline critical to solving the pandemic. Computer scientists to sequence genomes, epidemiologists to model the predicted spread in the population, and social scientists to outline behavioral impacts will join bioscientists on interdisciplinary teams seeking solutions. Researchers from universities, industry, and government must pool resources and work in effective partnership. The lone scientist—unfunded, self-policing, and talking with only a few qualified friends—who characterized the theory of liberal science will not suffice.
Ironically, many of the aspects that the authors feared might foretell the death of science—such as competition, large teams, industry and governmental influence, and enormity of scale—characterize what is now needed. A May 8, 2020, editorial in Science called for a “COVID-19 Defense Research Committee to be empowered to coordinate and fund solutions to the pandemic.” This would parallel the National Defense Research Committee created in 1940 to pursue war-related innovations, which resulted in a new sense of social responsibility and the role of scientists. COVID-19 has already resulted in such a renewed sense for scientists, their role, and the impact of science.
Special Advisor for Research Development and External Partnerships for Academic and Student Affairs
California State University Office of the Chancellor
In Stephen Turner and Daryl Chubin’s article, there are two things at issue. One is curiosity-driven research versus commercially oriented research. The other is unconstrained research versus mission-oriented research. The lines are easily blurred, of course, but it is easy to see the difference.
Consider approaches to health—say, the problem of high blood pressure. Commercially oriented research looks for drug solutions, since that would lead to intellectual property rights and potentially big profits. Unconstrained research would also investigate the effects of diet and exercise, even though these would result in no royalties. Both approaches are mission-oriented in that they have the specific aim of tackling high blood pressure. Unconstrained research might start out investigating blood pressure but get sidetracked by uncovering interesting connections among, say, blood pressure, exercise, and cognition. Research might continue to the point where any interest in blood pressure drops completely out of the picture.
Turner and Chubin are worried about both these issues, though they do not put it in quite the same terms as I do. How can we overcome these problems? That is, how can we loosen the grip of commercially oriented research in favor of more that is curiosity driven, and how can we loosen the grip of mission-oriented research in favor of more that is unconstrained?
My preferred approach is to preach the doctrine of science for its own sake in tandem with science for the betterment of humanity. However, except for a few special cases this hasn’t had much success. Cosmology is such a case. Most people are fascinated and happy to fund expensive telescopes while not expecting anything useful to come of it. But most research is of interest only to specialists. We need a Plan B.
Here is my tentative (and, I admit, not well thought out) alternative for funding research. Many corporations adopt so-called 20% time programs. Google, for instance, assigns projects to its employees but allows them to use 20% of their time (in effect, one day a week) to work on something of their own choosing. Here is where curiosity-driven, unconstrained research could take place. It seems to have paid off handsomely for Google.
This is a modest proposal—after all, it is only 20%—but at least it addresses the very legitimate concerns that Turner and Chubin raise. There is no mandate for a complete overhaul of the funding system, so in the interim this might be the best we can hope for. And who knows? Maybe it will creep up to 30%, then 40%, and so on.
James Robert Brown
Department of Philosophy
University of Toronto
Science is under strain. The prevalence of fraud, retractions, scandals, and scrambling for grant money suggests an institution in crisis. Stephen Turner and Daryl Chubin diagnose this crisis as the result of changing culture and incentives in science, from an emphasis on discovery to productivity, from autonomy to accountability, from fame to funding. Although they avoid calling for a nostalgic return to the post-World War II culture of science, they clearly think more has been lost than gained in this shift. But is the picture of the past too rosy? And can we chart a more promising future with a clearer picture of the past?
The science of the mid-twentieth century in the United States was well-funded, as public largesse poured into science for the first time. That funding gave scientists unprecedented autonomy to pursue what they found interesting, and allowed them to train students at unprecedented rates. The ranks of scientists swelled, and funding competition inevitably increased. But science during that period was also deeply sexist and excluded researchers who were not white or male, contrary to the ideal of universalism offered by the sociologist Robert K. Merton.
Science of that time also had its missteps (e.g., eugenics and racist theories of IQ) and unreliable results (e.g., DES as a miscarriage drug). It is not clear that the promises made by Vannevar Bush’s Science, the Endless Frontier came to fruition, at least by the mechanisms Bush presented. As many observers have argued, the great breakthroughs of the post-war era were not made by the free pursuit of science by free intellects, but rather by cross-institutional collaborations that kept their eyes on grounded goals, and held each other accountable to those goals. The success of such science was shaped very much by accountability to those collaborators outside science.
The problem with today’s science is less about a loss of autonomy and more about to whom scientists are expected to be accountable. Producing knowledge that pleases the funders is not the kind of accountability we want in science, as bias is the obvious result. We should think carefully about what we want scientists to be accountable for, and to whom, and shape the norms of science accordingly.
For example, as Merton noted, scientists should have essential accountabilities to each other, as manifested through a healthy culture of criticism and response to criticism. Yet when scientists are accountable only to each other, they can get lost in rabbit holes that do not effectively answer questions (either for the public or for science), and just perpetually call for more research. Additional accountabilities should be properly external-facing, but tied to the public good rather than private interests. And these accountabilities need to be more than window dressing.
We should also keep in mind that responsibility and accountability are not the same. Responsible science demands more, and different, things than accountable science. Crucial are responsibilities to the broader society, to honor key moral constraints (e.g., in human and animal subject research) and to aid society in clarifying and pursuing its goals. The ideal of the autonomous scientist pursuing science for its own sake never did serve society or science very well. We need to forge a better ideal.
Department of Philosophy
Michigan State University
Stephen Turner and Daryl Chubin describe an evolution of science between circa 1930 and the present: from a field led by lone geniuses who functioned autonomously and policed themselves to one beholden to productivity, impact, and funding agencies. They claim that the pace of discovery has slowed under these new constraints. They emphasize a few major discoveries from those halcyon days but, with a few exceptions, ignore the many breakthroughs of the past several decades. The authors scoff at the value of advances in mathematical modeling, which in the COVID-19 era is proving lifesaving.
Although the authors’ argument is elegantly written and thought provoking, their romantic view of the past and cynical view of the present derive from selectively emphasizing elements of both. They ignore how the previous norms of science excluded women, males of color, and individuals with disabilities, and even used scientific arguments to justify this exclusion. They describe how the current system has led to the “surrender of individual autonomy” by scientists whose “special status as experts is compromised.” Absent from this description is how the previous era of science enabled both scientists and society to ignore the contributions of female scientists, such as Rosalind Franklin in the discovery of DNA (which the authors highlight as an example of the benefits of the past approach to science), and scientists of color, such as Percy Lavon Julian, who synthesized physostigmine, used to treat glaucoma, among other conditions.
Although the authors praise previous generations of scientists for being able to police themselves with their own social norms, they ignore the egregious deficiencies in this practice. Among notable examples of past failures in self-monitoring, in the name of scientific discovery scientists from Harvard University and the Massachusetts Institute of Technology fed radioactive iron in oatmeal to developmentally disabled children to study the absorption of calcium; and in a study that was deemed scientifically valuable black men with syphilis in Tuskegee, Alabama, were left untreated to study the natural course of the disease.
The authors state that “progress on fundamental issues has stalled,” but who defines fundamental? One positive outcome of the current approach to science is its invitation to include a broader array of scientific disciplines than in the past. This has opened the door for scientific scrutiny of issues fundamental to creativity and innovation, such as research on study design, data analysis, peer review, team dynamics, and translation of discovery into practice.
I applaud the authors for writing such an enjoyable and provocative paper. They remind us that science exists in a larger society, and that there are benefits and harms to past, present, and most assuredly future approaches to conducting, evaluating, funding, and valuing science and scientists.
Virginia Valian Professor
Departments of Medicine, Psychiatry, and Industrial & Systems Engineering
University of Wisconsin-Madison
In their deep reservations about the “impact agenda” in contemporary science, Stephen Turner and Daryl Chubin reveal a disconcerting feature about the ideal of scientific autonomy that they defend. As their own repeated references to Vannevar Bush and James Bryant Conant show, this ideal has been historically promoted by the academic establishment, typically in aid of elite science.
Yet it is far from clear that academia has been the home of scientific autonomy in its broadest sense; namely, the free selection of the ends and means of research. Instead, what has been upheld has been the much narrower and self-serving idea of academically oriented scientists being autonomous from any constraints imposed by the rest of society. This effectively grants the academic establishment exclusive rights to impose all the constraints it wants on those who would claim to do “science.” Thomas Kuhn’s totalitarian sense of paradigm, inspired by Conant, concedes this point by making it a condition of being considered a scientist that one self-presents as having undergone the appropriate forms of acculturation (aka indoctrination) into science; hence the significance attached to peer review.
Put another way: the defense of scientific autonomy has been really about the entitlement of a group of self-certifying academics to monopoly ownership over science, understood as the most highly valued form of knowledge in society. The terms on which the US National Science Foundation was established, masterminded by Bush and Conant, were a victory for this vision. Thus, Conant’s promotion of “no science before its time” should be understood as a form of soft power, whereby science domesticates the passions of a populace increasingly impressed by its achievements.
From this perspective, the “finalizers” of science discussed by Turner and Chubin are best seen as constructive critics. They were observing that scientists left to their own devices were likely to squander their entitlement by becoming increasingly self-involved in problems that removed them from larger societal concerns, even though they had already secured a body of usable knowledge. Today’s impact agenda in science policy is the heir to such constructive criticism.
However, the conditions under which the NSF was established—the success of the Manhattan Project—by no means guaranteed this academic power grab. Notwithstanding the elite scientists involved, the creation of the first atomic bomb was a task set by the government in the context of national security. Within those constraints, the scientists were given free rein to solve the problem, and their solution was validated outside academic peer review. In short, it was closer to the broad sense of scientific autonomy: no one really knew what an atomic bomb was until one had been successfully detonated.
In effect, the Manhattan Project scientists had more discretion over the product they delivered than academic scientists ever do. Indeed, the Defense Advanced Research Projects Agency would have been a more appropriate long-term US response to the Manhattan Project than the NSF to promote the potential power and benefit of scientific autonomy for the larger society. But creation of that agency required nearly another decade—and the launch of Sputnik.
Auguste Comte Chair in Social Epistemology
Department of Sociology
University of Warwick
Stephen Turner and Daryl Chubin question whether our values have changed along with the movement from individuals to groups and with the changing mandates and mechanisms used to support discovery. The authors highlight examples and patterns that point to a shift in the ethical underpinning of the academic enterprise as a whole.
I appreciate their reminders of scholarly philosophical definitions of science and of essays from leadership that influenced the change in how science is supported. Although I am not a scholar in either philosophy or the history of science, I do dwell on causality. In that role, I’m not ready to accept many of the causal connections that they propose.
Turner and Chubin ask whether “the disappearance of exploratory science (is) a result of science’s natural growth and evolution … or the result of the structures of institutional science themselves.” I favor the former explanation. A natural progression of our ability to record observations and ideas is that our collective understanding of the world, and even the universe, has grown with time. New methods and new strategies replace the old. Deeper problems require more than single disciplines; they require collaborations. In some cases, huge collaborations are needed to advance the cutting edge.
Individuals and institutions adapted to take advantage of opportunities when presented. Applied science, for-profit science, is now as respectable as basic science, and in some quarters is seen as even more desirable. As the knowledge became more complex, it became increasingly difficult for nonexperts to judge the quality or contribution of the work, thus making hiring, promotion, and funding decisions more challenging.
An unfortunate workaround has been the use of surrogates, such as numbers of papers or grants, or numbers of citations. Rather than examine the original work in depth, one might assume the respectability of the work merely from whether it was cited by others. The price we pay for not making the effort to dig deeper ourselves is that the use of surrogates leaves room for manipulation.
For most of us who joined the ranks of science professionals in the latter third of the twentieth century, the exciting areas to explore had been opened by the new tools developed by our predecessors. We led or followed the crowds, not because that was where we would find funding (as suggested by the authors), but because that was where the new methods and the revised cutting edge would be.
Turner and Chubin note that along with the emergence of large projects conducted by teams, the pattern of funding science changed, from people-oriented to project- and impact-oriented. The authors leave to our imagination the challenges and successes of the agency staff who were successful in their arguments to Congress for public funding of broad swaths of scientific inquiry. We imagine that their arguments were pragmatic and focused on projects and outcomes.
Indeed, the change in how projects are judged evolved from how the funding was justified. The agency staff was also concerned with bias in the review process, the “old boy network.” A resulting approach was to focus on the feasibility and potential value of the project rather than on the esteem of the person. In my opinion, serving the public good and reporting to Congress shaped the review process more than any shift in values of science as a method.
I believe that science, a broadly accepted method for understanding the what and the how, operates under different values as we become more enlightened. We can no longer rely on common sense and individual approaches to carry out our shared values of how research is conducted. As tools and methods of investigation evolved, we recognized other issues that went beyond the domain of individuals. Unfortunately, the bad behavior of individuals can influence the public impression of the methods and ethical values of scientists as a whole. We needed to assure the welfare of research subjects, whether animal or human. We needed to prevent exposure of lab workers or the public to hazardous materials. We needed to prevent or detect fraud.
In their thoughtful essay the authors have raised critical questions for those responsible for developing the next generation of scientists. I’d be delighted if this is chapter 1, recalling the good old days when individuals had more autonomy. I await a chapter 2 that might take up the challenge of telling us how to retain the noble nature of discovery and adapt it to the needs of our changing society.
Clifton A. Poodry
University of Oregon
University of California, Santa Cruz
Reinventing science fairs
“Reinventing Science Fairs,” by Frederick Grinnell (Issues, Spring 2020), is a timely contribution. The work that he and his colleagues have done on science fairs is in sync with the growing recognition in the field of science education that a lot of science learning goes on outside formal classrooms.
Grinnell is to be commended for exploring the diversity of science fairs in several dimensions, among them compulsory vs. optional and competitive vs. noncompetitive.
Noncompetitive science fairs were found to afford more opportunities for students to learn about the process and the nature of science than did competitive ones. It has become a central mantra of science pedagogy that actually doing science is better than memorizing “science facts.” Thus, noncompetitive science fairs are evidently well positioned to give students a real-life experience of the difficulties and triumphs of scientific inquiry.
Not surprisingly, voluntary science fairs were found to be more likely than compulsory ones to increase participants’ interest in science. Surprisingly, even half the students who were required to participate reported an increase in interest. Having done research in science education for a while, I have observed that most of the effects usually detected are small, so this effect of science fairs among nonvoluntary participants looks quite impressive to me.
The underlying dichotomy driving the various forms of science fairs (competitive vs. noncompetitive; voluntary vs. compulsory) is a tension between two fundamental goals of science education—to ensure a science-literate citizenry necessary for a flourishing democracy, and to populate the hierarchical system of science with highly competent scientists. The science profession is competitive and often requires a tremendous amount of stamina and tolerance for delayed gratification from its practitioners. The most competitive science fairs function as a feeder system into that science profession.
On the one hand, there is a competitive focus on excelling and accruing advantages for a successful pathway into college and a career. On the other hand, there is the noncompetitive focus on learning or engaging in something that is beneficial for the individual as well as the citizenry as a whole.
The good news is that science fairs do not really need “reinventing.” The very precondition for the research has been the existing variability among science fairs. Hence, all that might be needed is recalibrating this existing diverse ecosystem of science fairs by placing more weight on the voluntary and noncompetitive kind, while not getting rid of the other kind. Herein may lie cause for optimism.
Education Specialist / Lecturer on Astronomy
As a student in New York City, I found that participation in science fairs was fairly commonplace in grades 4–12. Most of the time, projects were required, but occasionally they were voluntary for extra credit. Science fairs were held at the school level, and the best projects went to a citywide fair. My feelings about the science fairs varied, and really depended on how I felt about the teacher of my class and the specific requirements.
As a biology teacher, I have had my students participate in science fairs, when a school-wide structure was in place. Sometimes it was required of students; other times it was not. I have also served as a judge for numerous high-stakes, and not-so-high-stakes, fairs. After many decades of experience with science fairs in a variety of roles, I am a supporter of science fairs, but only tentatively so.
Frederick Grinnell’s article provides a good balance of the pros and cons of having voluntary or involuntary science fairs and those that are competitive or noncompetitive. Indeed, the positive and negative ways that students interact with voluntary/involuntary and competitive/noncompetitive educational situations have been well established by the research on self-concept, need for achievement, and locus of control. In short, what Grinnell has clearly explained extends well beyond science fairs to educational experiences in general.
The same is true with the resources a student may have at his or her disposal. These may come in the form of both cognitive and physical resources. A student may have a partnership with an active scientist, have a highly engaged parent, or have a parent who can give them access to an electron microscope. Again, the great educational divide between students of different socioeconomic groups has long explained what can be noted in competitive science fairs.
In general, there is not much interest in science fairs within the science education community, as Grinnell noted, because the research at best is ambiguous with respect to the effect that science fairs of any format have on students’ learning of science or their attitudes toward science. Students who participate in voluntary science fairs tend to already have positive attitudes toward science, as well as stronger science backgrounds.
In terms of redesigning science fairs, there needs to be a much more concerted effort on connecting the focus of projects to the science curriculum and what students are expected to learn. Science fairs do not need to be a “break” from learning what many learned individuals have decided is important to learn.
The most important question that I would like to raise is, should we focus on consistently promoting authentic research for individuals and groups in our science classes as opposed to spending time and resources on difficult-to-implement science fairs? This is why I am uncertain whether science fairs are really needed. Regardless of their structure, they may very well be counterproductive to the goal of scientific literacy.
Norman G. Lederman
Illinois Institute of Technology
Why COVID-19 needs to be political
In dealing with the COVID-19 pandemic, a repeated message is that we ought not to politicize it. At the same time, social media are replete with stories about how the coronavirus will change the world because it has cruelly laid bare the frailty of our health care systems and underscores the need to build more resilient societies in anticipation of future shocks. What a predicament.
There are reasons to be hopeful for our post-pandemic world. As Daniel Sarewitz contends in “Pandemic Science and Politics” (online exclusive, March 25, 2020), we are converging around a shared concern of valuing life. We have witnessed acts of heart-warming solidarity, and many countries are enacting social-support policies that had been unthinkable just a short time ago.
Yet as discussions about how to emerge from lockdown are beginning to circulate, we hear the same old disagreements across the political spectrum about the substance, pace, and direction of post-COVID change. Environmentalists insist that we must give up flying and other excessive luxuries to enact deep ecological transformation. Conservatives are promoting sovereignty of nations over international cooperation. Free-marketeers demand contracts (and bailouts) for the private sector, while, equally predictably, anarchists proclaim the end of global capitalism.
Hopes of durable structural change are further tempered by our past experiences with worldwide systemic crises. In the wake of the 2008 global economic meltdown, financial institutions including the World Bank called for profound economic transformation. But these changes are as elusive today as they were then. The financial system largely returned to business as usual, and policy-makers made status quo choices at the expense of social considerations.
This should not make us cynical about the prospects for transformation, but it is important to bring these political and economic realities into the present debates about our post-COVID future. Developing a truly convincing response to the present crisis will depend on many factors, some of which are still unknown: the availability and cost of a vaccine; who benefits from the crisis in the short and long run; what we remember and what we intentionally, inadvertently, or carelessly choose to forget. Most important, it will depend on whether and how people fight for the changes they want to see, be it health care for all, universal basic income, more coordinated risk-mitigation strategies, a stronger role for government, or social and environmental protections.
We also must not overlook the politically directed changes already heading in a different direction than many people would like to see: the United States is rolling back environmental protection under the guise of getting the economy going again, Hungary has moved to dictatorship, and surveillance is likely to increase with greater contact tracing, whether accidentally or by design. Those who want to see durable change for the better need to act now by developing strategies and robust political alliances for sensible, equitable responses that can adapt to present and future uncertainties.
In our forthcoming book, Responsibility Beyond Growth, we argue for mobilizing these alliances around the notion of responsibility and adopting new thinking about how to organize the global economy “beyond the market.” By focusing on what different parties mean by responsibility and by innovating beyond the usual measures of growth (typically the gross domestic product, which is said to represent the total value of goods and services produced by the economy), we turn to politics in the purest sense of the term, the art of contesting and constituting power for social decision-making, not the argy-bargy of whose ego is most massaged.
Without this dedicated and sustained political engagement, the COVID pandemic will pass as just another crisis—an unfortunate calamity that has hit the world hard, but will not unleash the social and political change needed to build more responsible and resilient societies.
Michiel Van Oudheusden
Stevienna de Saille
The authors are social scientists based in the United Kingdom and New Zealand. They study the relationships between the political economy, innovation policy, and concepts of responsibility.