Semiconductors and Environmental Justice

In “Sustainability for Semiconductors” (Issues, Fall 2022), Elise Harrington and colleagues persuasively argue that the CHIPS and Science Act of 2022 offers the United States a unique chance to advance national interests while decoupling the semiconductor industry from supply chain vulnerabilities, public health problems, and environmental hazards. The path the authors suggest involves improving industry sustainability by, among other actions, circularizing supply chains—that is, by designing with a focus on material reuse and regeneration.

But any industrial effort to circularize the supply chain will face an uphill battle unless competition policy concerns are addressed. Today’s large companies and investors seem to consider it natural to seek illegal monopoly power, regardless of its toxic effects on society. Some companies may defend consolidation as a cost of US technological sovereignty, but consolidation is actually a threat to national security. Achieving a circular supply chain will require innovative policies for competition and for coordinating pre-competitive interests across the use, repair, reuse, refurbishment, and recycling of semiconductors and destination devices. Securing downstream product interoperability, rights to repair, and waste and recycling requirements would be a promising start.

Further, building a strong and sustainable semiconductor industry should not come at the expense of public and environmental health. Attention to environmental justice must be front and center. The European Commission is advancing a novel approach to regulating economic activities for sustainability through the principle of “do no significant harm.” However, the Commission, like Harrington et al., fixates on negotiating quantitative, absolute environmental targets to arbitrate the harm of an industrial activity. Harm and its significance are subjective and contingent on the parties involved (and on the stage of the industrial lifecycle). Too often, research, development, and industrial policies amount to governments doling out permission to harm individuals and communities for the benefit of a few for-profit companies. Silicon Valley, with 23 Superfund sites, the highest concentration in the country, has a lot to answer for on this front.

Building a strong and sustainable semiconductor industry should not come at the expense of public and environmental health. Attention to environmental justice must be front and center.

Finally, the United States should avoid a “race to the bottom” in which state and local governments undercut each other to secure regional technology hubs. Too often, relocation handouts siphon tax dollars from communities and schools to attract corporations that prove quick to loot the public purse and move on to the next doting location. For example, the Mexican American environmental justice movement has documented how major semiconductor companies in New Mexico and Arizona regularly secured state subsidies yet provided few quality jobs and little community reinvestment, burdened communities with environmental wastes, and drained scarce water resources. To center environmental justice in semiconductor sustainability efforts, much can be learned from the movement’s highly effective good neighbor agreements. Respecting small and medium-size industries, centering environmental justice, and fairly distributing the benefits of a semiconductor renaissance around the country would be not only good policy but also good politics, as other industrial policy efforts have shown. A sustainable semiconductor industry pursuing these strategies would be more likely to win political and public support and would stand a better chance of genuinely benefiting the nation and its people.

Center for Innovation Systems & Policy

AIT Austrian Institute of Technology

Technology-Based Economic Development

In “Manufacturing and Workforce” (Issues, Fall 2022), Sujai Shivakumar provides a timely and important review of the CHIPS and Science Act. This landmark legislation aims at strengthening domestic semiconductor research, development, design, and manufacturing, and advancing technology transfer in such fields as quantum computing, artificial intelligence, clean energy, and nanotechnology. It also establishes new regional high-tech hubs and looks to foster a larger and more inclusive workforce in science, technology, engineering, and mathematics—the STEM fields. In a recent article in Annals of Science and Technology Policy, I noted that the act focuses tightly on general-purpose technologies, emanating from technology transfer at universities and federal laboratories. Shivakumar correctly notes that public/private investment in technology-based economic development (TBED) in manufacturing must be accompanied by workforce development to match the human capital needs of producers and suppliers.

I have two recommendations relating to workforce development, in the context of technology transfer. The first is based on evidence presented in a 2021 report by the National Academies of Sciences, Engineering, and Medicine titled Advancing Commercialization of Digital Products from Federal Laboratories. (In full disclosure, I cochaired that committee with Ruth Okediji of Harvard University.) The report concluded that accelerating commercialization of research requires that we achieve a better understanding of workplace and managerial practices relating to technology transfer, including individual and organizational factors that may inhibit or enhance the ability of scientists to engage in commercialization of their research. These factors include the role of pecuniary and nonpecuniary incentives, organizational justice (i.e., workplace fairness and equity), championing, leadership, work-life balance, equity, diversity and inclusion, and organizational culture. Understanding such issues will help identify and eliminate roadblocks encountered by scientists at federal labs as well as universities who wish to pursue technology transfer. It would also allow us to assess how “better performance” in technology transfer is achieved.

Accelerating commercialization of research requires that we achieve a better understanding of workplace and managerial practices relating to technology transfer.

A second recommendation concerns a major gap that needs to be filled, in terms of developing a more inclusive STEM workforce to implement these technologies. This gap centers on tribal communities, which are largely ignored in TBED initiatives and technology transfer. Unfortunately, economic development efforts for tribal communities have predominantly focused on building and managing casinos and developing tourism. Results have been mixed, with limited prospects for steady employment and career advancement.

Opportunities for TBED strategies to aid tribal communities might include the development of new investment instruments, the strategic use of incentives to attract production facilities in such locations, and the promotion of entrepreneurship to build out supply chains. This would require adapting tools for TBED to be better suited to the needs and values of the communities. That means developing a TBED/technology transfer strategy that simultaneously protects unique, Indigenous cultures and is responsive to community needs.

In sum, I agree with Shivakumar that workforce development is key to the success of the CHIPS and Science Act. Two complementary factors that will help achieve its laudable goals are improving our understanding of how to better manage technology transfer at universities, federal labs, and corporations, and involving tribal communities in technology-based economic development initiatives and technology transfer.

Foundation Professor of Public Policy and Management

Co-Executive Director, Global Center for Technology Transfer

Arizona State University

Sujai Shivakumar stresses the importance of building regional innovation capacity to bolster manufacturing innovation in the United States. He rightly notes that this needs to be a long-term cooperative effort, one requiring sustained funding and ongoing policy attention.

One of Shivakumar’s key points is the necessity of complementary state and local initiatives to leverage federal and private investments through the use of public-private partnerships. As he notes, the success of the nano cluster centered in Albany, New York, was initially based on collaboration with IBM and the state, especially through the College of Nanoscale Science and Engineering, but it reflected a 20-year commitment by a succession of governors, both Republican and Democratic, to developing a regional semiconductor industry.

We need to capitalize on current centers of excellence, even as we seek to create new ones in a spoke-and-hub model, using their proven strengths to reinforce this ambitious national undertaking.

Working with Thomas R. Howell, we documented the long-term nature of this effort in a recent study titled Regional Renaissance: How the New York Capital Region Became a Nanotechnology Powerhouse. We describe in some detail (because the details matter) how regional organizations, such as the Center for Economic Growth and the Saratoga Development Commission, steadily supported by leaders of the state assembly, worked to find a site, obtain the necessary permits, build out infrastructure, and win public support. The state also encouraged training programs in semiconductor manufacturing at institutions such as Hudson Valley Community College. Moreover, the semiconductor company AMD (now GlobalFoundries) was attracted by the resources and proximity of the College of Nanoscale Science and Engineering, building and expanding a semiconductor fabrication plant—or “fab,” in the field’s parlance—that has led to many thousands of well-paying jobs.

This model is especially relevant to meeting the needs of the growing number of semiconductor fabs encouraged under the CHIPS and Science Act. Indeed, in addressing the need for applied research, student training, and collaborative research among semiconductor companies, the Albany facility stands out, not least for its proven track record and its exceptional capabilities, which rest on a commercial-scale fab that is ideal for testing equipment and designs but unusual for a university.

This facility can and should be a central node in the semiconductor ecosystem that the nation seeks to strengthen. If we are to avoid an excessive dispersal of funds and the long lead times of new construction and staffing, we will need to draw on existing facilities that are already operational and can be reinforced by the resources of the CHIPS and Science Act.

In short, we need to capitalize on current centers of excellence, even as we seek to create new ones in a spoke-and-hub model, using their proven strengths to reinforce this ambitious national undertaking. Time is not our friend; existing assets are.

Adjunct Professor

Global Innovation Policy

Science, Technology, and International Affairs

School of Foreign Service

Georgetown University

R&D for Local Needs

In “Place-Based Economic Development” (Issues, Fall 2022), Maryann Feldman observes that the CHIPS and Science Act marks “an abrupt pivot in the nation’s innovation policy” away from the laissez-faire system of the past and toward a policy focused on addressing regional economic development. Central to this new course is the act’s directive for the National Science Foundation (NSF) to “support use-inspired and translational research” through its new Technology, Innovation, and Partnerships (TIP) directorate.

Yet nowhere within the statute are these terms defined or described. The phrase “use-inspired research” was coined in 1997 by the political scientist Donald Stokes in his seminal work, Pasteur’s Quadrant, in which he sought to break down the artificial distinctions between scientific understanding and wider use while rejecting overly limiting terms such as basic and applied research. For Stokes, research focused on real-world societal problems—such as French chemist Louis Pasteur’s work on anthrax, cholera, and rabies—can spark both new fundamental knowledge and applied breakthroughs.

But what potential uses will inspire the next generation of innovators? If we look to the text of the CHIPS and Science Act, the legislation outlines 10 technology focus areas and five areas of societal need to guide use-inspired research overseen by the TIP directorate. Beyond these lists, however, there is another source of inspiration that is strongly implied by the legislative language: regional societal and economic needs—specifically, the needs of places where scientists live and work.

What potential uses will inspire the next generation of innovators?

While this observation may sound simple, implementation is not. Indeed, researchers at the University of Maine previously described in Issues the intricate challenges of crafting a regional use-inspired research agenda, creating community partnerships, engaging stakeholders, breaking through institutional and cultural barriers, and transcending publish-or-perish incentives to help produce real-world solutions.

The CHIPS and Science Act has launched such an endeavor on a national scale with NSF as the driver. It is a new place-based research enterprise that finds inspiration from the needs of diverse geographic regions across the United States. The statute is an important step, though many bumps in the road lie ahead, including securing the necessary appropriations. However, by focusing more on the needs of geographic regions through use-inspired research, NSF can better meet the mandate of CHIPS and Science to address societal, national, and geostrategic challenges for the benefit of all Americans.

President, Arch Street

Former professional staff member, US House Committee on Science, Space, and Technology

The recent CHIPS and Science Act ensures that the invisible hand in the capitalist US economy is now far from invisible. With nearly $280 billion in planned federal investments in domestic research, development, and manufacturing of semiconductors, officials are hoping this support will lead to a technological paradigm shift for future generations. Officials contend this sizeable investment will decrease the nation’s dependence on countries such as South Korea, Taiwan, and China, which have dominated the semiconductor industry for the past two decades. Through this dominance, these countries effectively control the US supply chain and thus threaten the nation’s current and future security. Credit is due to federal elected officials for realizing the need for such critical investment in the US economy and semiconductor industry. How that funding and support are distributed, however, remains a critical component of the legislation.

In her essay, Maryann Feldman argues that “the United States needs a bold strategic effort to create prosperity.” One could argue that the CHIPS and Science Act is the bold public policy needed to ensure the nation’s technological independence and innovation. From an economic and public policy perspective, however, I would argue that the most critical components of the act will be source contract award processes directly connected to defined requirements; strong agency oversight; engagement throughout the award timeline; early definition of, and commitment to, commercialization pathways; award support for research personnel in flyover states; and a commitment to assess program results against the requirements that generated each award.

With nearly $280 billion in planned investments by the federal government for domestic research, development, and manufacturing of semiconductors, officials are hoping this support will lead to a technological paradigm shift for future generations.

Additionally, contemporary research from TechLink, a US Department of Defense partnership intermediary, indicates that identifying the most effective research personnel in flyover states—those individuals who can carry their technology innovations past the well-known “valley of death” to commercialization—will be critical for developing innovative, successfully commercialized technology and for place-based economic impacts and outcomes.

The CHIPS and Science Act needs to ensure that when technology decisions and appropriations occur at the practical level, funding flows to a diverse set of independent entrepreneurs, innovators, nonprofit organizations, universities, small businesses, local government partners, and educated citizens in research parks—those whose sole purpose is developing the most advanced technology conceivable. COVID-19 changed how researchers can coordinate remotely with leading technology experts in the field, regardless of physical location. Technological innovation and commercialization must take precedence over quid-pro-quo politics in Washington, DC, or the nation’s attempt to become the global leader in the semiconductor industry will have started and ended with federal public policy officials and bureaucrats. If Feldman’s recommendations regarding place-based economics, public policy implementation, and economic development are implemented, I’m confident the United States will surpass China, Taiwan, and South Korea in a semiconductor paradigm shift that will last for decades to come.

Department Head, Economic Impacts

TechLink

Maryann Feldman in her excellent article makes a strong case that one of the most important strategic requirements for future growth in high-income jobs is expanding what the regional economic growth policy arena calls “innovation hubs.”

Feldman states that “place-based policy recognizes that when firms conducting related activities are located near each other, this proximity to suppliers and customers and access to workers and ideas yield significant dynamic efficiency gains. These clusters may become self-reinforcing, leading to greater productivity and enhanced innovation.”

There are a lot of economic rationales and policy implications packed into this summary statement. From an economist’s perspective, innovation hubs are an essential industrial structure for future economic development, first and foremost because they enable the realization of “economies of scope.” This distinguishes them from the industries of the Industrial Revolution, in which “economies of scale” dominated.

More specifically, scale is the dominant driver when product technology is relatively simple and product differentiation is therefore limited. In such cases, the emphasis is largely on reducing unit cost; that is, price is the basis for competing. In contrast, modern technology platforms offer a much wider “scope” of potential product applications, which requires more sophisticated process technologies in terms of both quality and attribute flexibility. The increasingly differentiated needs of modern high-tech supply chains mean that economies of scope with respect to emerging technology platforms are now the major policy driver.

More technically demanding product and process technology development and use require higher and diversified labor skills. As the complex set of labor inputs changes continuously with the evolution of technology development, responsive educational institutions are essential to update and refocus workers’ skills. The resulting diverse local (and hence mobile) labor pool is essential to meeting rapidly changing skill requirements across firms in a regional innovation cluster.

The increasingly differentiated needs of modern high-tech supply chains mean that economies of scope with respect to emerging technology platforms are now the major policy driver.

Further, the potential for economies of scope provides many opportunities for small firms to form and pursue niche markets. But doing so requires the availability of a local start-up infrastructure embodying such institutional entities as “accelerators” and “incubators” to facilitate evolution of optimal industry structures.

The extremely dynamic character of technology-based competition, determined to a significant extent by the economies of scope inherent in modern technology platforms, means considerable shifting of skilled labor among competing firms as new application areas are developed and grow. Co-location of a large skilled labor pool and a supporting educational infrastructure is therefore essential. Similarly, the extreme dynamism of the high-tech economy, which affords opportunities for new firms to form and prosper, works well only if a significant venture capital infrastructure is present.

These factors—facilitation of economies of scope in research and development; a diverse and skilled local labor pool; start-up firm formation; risk financing; and technical infrastructure—collectively promote the innovation hub concept. As Feldman states, “For too long, the conventional policy approach has been for government to invest in projects and training rather than in places.”

In summary, the complexity of modern technology development and commercialization demands a four-element growth model comprising technology, fixed capital (hardware and software), and skilled labor, all of which depend on a fourth, supporting element: technical, educational, and business infrastructure. All four assets must be co-located to achieve economies of scope and hence broad-based technology development and commercialization.

Research Fellow, Economic Policy Research Center

University of Washington

The Problem With Subsidies

The United States government is waging a “chip” war with China. The war is fought on two fronts: one is preventing China from accessing the latest artificial intelligence chips and manufacturing tools, and the other is subsidizing large firms to bring manufacturing back to the United States. But as Yu Zhou points out in “Competing with China” (Issues, Fall 2022), “whether [the CHIPS and Science Act] will improve US global competitiveness and prevent the rise of China is uncertain.”

A race to subsidies as America’s solution is uncertain and problematic because it is based on a misunderstanding of how innovative Chinese firms advanced their technologies. Contrary to the popular belief that China built its technology industry through massive subsidies, China’s record of state-sponsored technology investment is spotty. The prime example is the semiconductor industry: despite billions of dollars invested by the state over the last three decades, the industry’s advancement has consistently fallen short of government targets. The real secret of the Chinese high-tech industry is indigenous innovation—that is, innovative Chinese firms sense unfulfilled domestic demands, innovate to generate localized products at a lower cost, build on access to the vast Chinese market to scale up, and eventually accumulate problem-solving capabilities that approach the technological frontier. Ironically, the US government’s chip war is creating a space for indigenous innovation by Chinese semiconductor companies, a space that was absent when China relied on American chips.

A race to subsidies as America’s solution is uncertain and problematic because it is based on a misunderstanding of how innovative Chinese firms advanced their technologies.

Taking the wrong lessons from China could have unintended consequences for the US industry. Since leading American high-tech firms have spent some $633 billion on stock buybacks over the past decade, their lack of enthusiasm for investing in semiconductor manufacturing can hardly be attributed to a lack of cash. But showering money on business, as China’s experience shows, would certainly lead to unhealthy state-business relations. Already, the CHIPS and Science Act has created perverse incentives to lobby for more and more subsidies.

Instead of competing on subsidies, the United States should compete with China in areas where it excels, namely innovation in emerging technologies. Historically, the United States has had few successes in reviving mature industries through subsidies, whether it was steel, automobiles, or memory chips. But it has consistently led the world in new technological revolutions over the past century. In a world facing a climate crisis, the United States should compete with China to bring innovations to solve the nation’s existential problems and win the ultimate prize of technological leadership that benefits all humankind. After all, the subsidy war benefits few but corporate bottom lines.

Assistant Professor of Innovation Policy

School of International Relations and Public Affairs

Fudan University

Shanghai, China

Author of China’s Drive to the Technological Frontier (Routledge, 2022)

Time to Reform Academic Publishing

In “Public Access to Advance Equity” (Issues, Fall 2022), Alondra Nelson, Christopher Marcum, and Jedidah Isler touch on the many reasons why open access to federal research is critical and highlight some of the challenges to come. We wholeheartedly agree with their sentiment—“A research ecosystem where everyone can participate and contribute their skills and expertise must be built”—and we applaud both the Biden administration and the White House Office of Science and Technology Policy (OSTP), where the authors work, in their commitment to make federally funded research open to the public.

In particular, as graduate, professional, and medical students, we have been shaped by the relics of an inequitable publishing model that was created before the age of the internet. Our everyday work—from designing and running experiments to diagnosing and treating patients—relies on the results of taxpayer-funded research. Having these resources freely available will help to accelerate innovation and level the playing field for smaller and less well-funded research groups and institutions. With this goal of creating an equitable research ecosystem in mind, we want to highlight the importance of creating one that is equitable in whole.

In the same way that open access to reading publications keeps the public and institutions informed, open access to publishing is equally important, because it allows institutions to make their work known. With free access to federally funded research, this effect will be even greater. Access to publishing must remain open so that institutions can contribute to the public knowledge as well as learn from it.

As graduate, professional, and medical students, we have been shaped by the relics of an inequitable publishing model that was created before the age of the internet. 

But today the incentives for institutions, and within institutions, do not always align with goals of equity, and change will be necessary to support a more equitable system. This is especially true for early-career researchers, who may struggle to comply with new open-access guidelines if they must pay a high article-publishing fee to make their research open in a journal valued by their institutions’ promotion and tenure guidelines.

To these ends, it is imperative that the process for communicating research results to the public and other researchers not shift from a “pay-to-read” model to a “pay-to-publish” model. That is, we should not use taxpayer dollars to pay publishers to make research available, nor should we simply pass these costs on to researchers. This approach would be unsustainable over the long term and would go against the equity goals of the new OSTP policy. Instead, we hope that funders, professional societies, and institutions will join us in imagining and supporting innovative ways of communicating science that are more equitable and better for research.

As the research community works to implement the new OSTP policy intended to make scientific results publicly accessible, it will be critical for the next generation of researchers that the federally funded research system be made open in a way that is equitable and inclusive of those early in their careers.

President, National Association of Graduate-Professional Students

National President, American Medical Student Association

A New Role for Policy Analysts

In “Government and the Evolving Research Profession” (Issues, Fall 2022), Candice Wright highlights the increasing pressure on researchers to make fundamental advancements in their fields, enable technology transfer to help solve pressing problems, contribute insights to policy, and appropriately manage national security risks and research security. Navigating such challenges requires a mix of skills that are hard to find in any single person. Instead, these challenges call for collaboratively produced technical and policy-relevant analysis that can then be applied in both public and private spheres.

It is imperative to consider how scientific and technical experts can best contribute to a productive, evidence-based scientific and policymaking process. How researchers answer this call can have a profound impact on their career, their professional standing, and occasionally their public reputation. The problem is that for professional academics and researchers with an educational and knowledge-generating mission, such policy, tech transfer, and national security work is difficult and often requires time and resources that can be hard to justify within existing incentive systems. How do you best retain your honest-broker status amid the risk of entering the political fray or distracting from your research agenda with a multitude of nonresearch engagements and travel? Policy analysts who are well-trained and make a career at the intersection of policy, national security, and emerging technologies can help fill this gap with specialized skills that augment those of researchers.

Policy analysts who are well-trained and make a career at the intersection of policy, national security, and emerging technologies can help fill this gap with specialized skills that augment those of researchers.

Policy analysts (including those at the Government Accountability Office) can help translate scientific and technical content for nontechnical decisionmakers. This is a viable career for both technically and policy-trained individuals. Working with good analysts who have a strong contextual understanding of policy and enough scientific and technical expertise to understand the core issues in play is transformative. As analysts communicate and translate information from technical experts, bench scientists will increasingly recognize the analysts’ value, opening up new ways to work together.

Finding people who can work both sides of the technical and policy equation is difficult. There is a history of training policy fellows (e.g., at the American Association for the Advancement of Science and the National Academies), and some new fellowships are now drawing attention to this role (TechCongress, Open Philanthropy, Federation of American Scientists). This is heartening, but the total number of highly skilled emerging-technology policy analysts is still relatively small, and their long-term career viability is uncertain. The scientific research and academic communities need to create ramps and pathways from traditional fields to policy analysis roles, with formal training options in these hybrid areas. Technical experts need to be encouraged to take these paths and to find home organizations where they can develop and excel.

Those who choose to stay within research careers can cultivate alliances with colleagues at policy analysis institutions, and I offer the one I lead, the Center for Security and Emerging Technology, as an example. As this becomes more common, universities may choose to create more independent centers devoted to policy analysis or incentivize sabbaticals for those who can coproduce relevant policy analysis. Scientists and policy analysts are natural partners and have a vested interest in each operating at the top of their game, which can help fill the gap that Wright keenly observed.

Director

Center for Security and Emerging Technology

Georgetown University

The US research environment is at an inflection point. While the Vannevar Bush model expressed in Science, the Endless Frontier has evolved somewhat over time, it has served the nation well for decades. Yet in the face of formidable competitive challenges, there are growing concerns that this model will not enable the country to maintain its international leadership position in the future. As an example, the Special Competitive Studies Project recently wrote, “China is creating spheres of influence without any clear limits, underpinned by physical and technological infrastructure, cemented with commercial ties and technology platform dependence, deepened by authoritarian affinities, and enforced by growing military capabilities.”

The current US model for scientific advancement is enabled by federally sponsored basic research, the results of which are leveraged by the private sector to produce new capabilities. There has been little strategically planned connectivity across sectors or through the innovation process, however, and the government’s ability to drive activities has diminished over time. Yet this approach is now competing with China’s holistic strategy for international leadership in science and technology (S&T) arenas important to national and economic security. To better compete, MITRE has called for a 

new federal effort to build innovation-fostering partnerships: a voluntary coordination of government, industry, and academic activities to holistically address our nation’s most-critical S&T priorities. It must integrate such diverse players into a collaborative network to share information about opportunities and solutions, and to coordinate shared, complementary efforts across sectors, institutions, and disciplines … to help catalyze solutions to the biggest technology-related challenges.

The Special Competitive Studies Project is now working to develop such a new model that could drive collaboration of America’s “five power centers” on critically important S&T topics. It is against this background and toward this future model that researchers will likely have to work, and about which Candice Wright provides useful insights.

In the face of formidable competitive challenges, there are growing concerns that this model will not enable the country to maintain its international leadership position in the future.

Wright tackles three key issues: collaboration, balancing idea exchange with security, and rigor and transparency. All are important. Research collaboration (between government, industry, academia, and international partners) will soon become as important as cross-discipline collaboration has been. But while it’s easy to discuss the virtues of open research environments, they can be difficult to implement due to security and intellectual property concerns. Scientific integrity is also paramount, and Wright recommends actions to help at project initiation.

Additional components that must be considered and incorporated within the future research paradigm include determining the roles, requirements, and collaboration approaches of different innovation sectors (including national laboratories and federally funded research and development centers) so that each succeeds and the nation’s capabilities advance; enhancing federal government S&T coordination; communicating to nonscientists; and ensuring that needed technical advancements occur while maintaining important national ideals such as privacy, equity, and free enterprise.

Our research future will be collaboration-based and strategically driven. Let’s begin now, together and with a consistent vision.

Senior Principal, Science & Technology Policy Analyst

MITRE

Gene Therapy for All

In “Making Gene Therapy Accessible” (Issues, Fall 2022), Kevin Doxzen and Amy Lockwood highlight contentious issues around gene therapy, even as the treatment shows good results and has raised hopes for many people with incurable diseases. The authors rightly point out that unless leaders carve out appropriate policies to develop gene therapy through a collaborative process, this novel therapy will be accessed by less than 1% of the world’s population.

Treatment that is unavailable to patients in need has no value at all. Policies will need to focus on research promotion, clinical and regulatory infrastructural development, capacity-building, training, and development of an approval pathway and community adoption for success and sustainable affordability. And as the authors suggest, rather than concentrating on single diseases, efforts should be focused on establishing a platform that would be applicable to multiple diseases. This approach could help researchers working to develop therapies not only for infectious diseases such as HIV and for hemoglobinopathies such as sickle cell disease and thalassemia, but also for “rare” diseases that may in fact be common in low- and middle-income countries (LMICs).

In our opinion, one of the biggest roadblocks in this regard is intellectual property rights. The successful application of patent pools in the development of antiretroviral drugs in LMICs provides a tried and tested strategy for bringing down the cost of gene therapy by sharing these rights. Moreover, lack of competition affects the cost of gene therapies, as only a very small number of companies are developing such therapies. Bringing in more players may bring down the costs markedly.

Treatment that is unavailable to patients in need has no value at all. 

Apart from encouraging research and development, the authors also rightly underline the significance of regulatory guidelines and laws to ensure execution of safe and ethical research. To begin, global clinical trials need to be encouraged and facilitated with the participation of various patient populations from countries associated with high disease burdens. There need to be proper guidance documents for development of indigenous platforms—utilizing the current capabilities and intellectual property of researchers and clinicians of various countries—to establish self-reliant assets for LMICs. This is also necessary for local gene therapy methods and products developed through technology transfer. To encourage the best practices globally, there should be twinning programs to provide appropriate hands-on training on the platforms established elsewhere and to generate a well-trained workforce for these resource-intensive and innovative technologies. Data sharing across the globe for drafting evidence-based recommendations for treating diseases with these modalities should also be encouraged so that stakeholders may learn from each other’s experiences.

Also important are organizations such as the Global Gene Therapy Initiative, which, as the authors highlight, play a pivotal role in the development of gene therapies in LMICs. With the participation of multidisciplinary experts, such initiatives can go a long way toward preparing LMICs to maximize the impact of gene therapy.

Finally, policymakers and other authorities in government need to develop funding mechanisms and policies that prioritize long-term success and stronger health systems with respect to gene therapy, realizing the transformative potential of these technologies in improving millions of lives. Also, as emphasized above, regulatory convergence should be pursued to resolve existing bottlenecks and build the ecosystem for gene therapy methods and products in LMICs.

Indian Council of Medical Research Headquarters

New Delhi, India

Development, acceptance, and sustainability of successful health interventions require both good planning and sound policies, as well as partnerships among many stakeholders. This is particularly the case when dealing with complex and sensitive interventions such as the introduction of and equitable access to gene therapy in low- and middle-income countries (LMICs), as Kevin Doxzen and Amy Lockwood highlight. The authors point out the familiar long delays between the time that new interventions become routine in high-income countries and their accessibility in LMICs. This must change.

To accelerate this change, Doxzen and Lockwood advocate for intersectoral, cross-cutting programs rather than single-disease vertical programs. Besides the examples the authors cite, this approach has proven very effective in other programs such as the European and Developing Countries Clinical Trials Partnership (EDCTP) and the World Health Organization-TDR, which support interventions against diseases of the poor, especially in LMICs, and have broader mandates that include capacity development, networking, and fostering co-ownership of their programs.

Capacity development, including building environments for conducting quality health research and health service delivery, has far-reaching outcomes beyond the intended primary focus, as the authors cite in the case of the President’s Emergency Plan for AIDS Relief, or PEPFAR, and its contribution to the COVID-19 response. The same can be said about the EDCTP and the World Health Organization-TDR programs, which have shown that it is most cost-effective to support cross-cutting issues that can be used generally in different settings.

Capacity development, including building environments for conducting quality health research and health service delivery, has far-reaching outcomes beyond the intended primary focus.

However, as Doxzen and Lockwood point out, for ambitious programs such as equitable global accessibility of gene therapy, it is paramount to have in place good policies, sound strategic delivery plans, and coordination of activities. The strategy should consider inputs from all major stakeholders, including health authorities, regulatory agencies, civil society, international health organizations, the scientific community, development partners, industry, and the affected communities.

Of particular importance should be the involvement of health authorities in LMICs right from the start to inculcate a sense of co-ownership of the program. This will foster acceptance, active participation, self-determination, and program sustainability. Failure to do so may lead to public resistance, as evidenced in, for example, the vaccination campaign to eradicate polio in Africa and vaccination efforts against COVID-19 and Ebola. Polio vaccination programs were falsely accused of imposing birth control in Nigeria, and COVID-19 immunization programs using mRNA vaccines were widely associated with negative misinformation about interference with human genes. Such involvement is particularly important in dealing with a sensitive issue such as gene therapy, which is prone to misinterpretation and misinformation. The involvement of local authorities and communities will also point out areas of weakness and capacity that need strengthening, including regulatory, laboratory, and clinical services.

Since it requires many years for such services to be readily available and accessible, this planning should take place now. There is no time to waste.

Retired Vice Chancellor

Hubert Kairuki Memorial University

Dar es Salaam, Tanzania

The Future of Global Science Relations

China’s scientific rise provokes strong global reactions. Research collaboration between North American and European partners with counterparts in China is now increasingly seen in light of securitization, asymmetrical dependencies, predatory practices, and general rivalry. In the middle of these heated debates, Joy Y. Zhang, Sonia Ben Ouagrham-Gormley, and Kathleen M. Vogel are doing something remarkable: holding trilateral experimental online meetings that focus on ethical and regulatory issues in the biosciences, and reporting about it. In “Creating Common Ground With Chinese Researchers” (Issues, Summer 2022), the authors not only describe fascinating, often-overlooked nuances of the Chinese science system; they also are admirably honest about the problems they encountered in establishing the dialogue and about the learning processes and necessary adjustments. This sort of open, flexible, and daring approach has become increasingly rare, as moral arguments and calls for taking a stance in and on bilateral relations grow louder.

Of course, there are heavy challenges involved, and the authors describe many of them. For instance, they address the often-questioned individuality and agency of Chinese scientists and argue that the question is too simple (and sometimes racist) and creates unproductive boundaries. Being sensitive to the increasing political pressure on and control over Chinese scientists could, however, be an important resource in the dialogue. After all, it could serve as a starting point for mutually discussing experiences of societal demands on scientific work, and more.

This sort of open, flexible, and daring approach has become increasingly rare, as moral arguments and calls for taking a stance in and on bilateral relations grow louder.

Yet the most important message the authors convey is that the decision to enter such a dialogue, and how it would work, is above all our Chinese colleagues’ choice. We in the West cannot and should not try to force that decision on them, or unilaterally exclude them from participating in the dialogue, no matter our reasons and however well-meaning we consider them. Moreover, unlike big platforms that blend science, politics, and science policy together, intimate and more specific dialogues between scientists and scholars are usually harder to instrumentalize politically. And should that still happen, it can be called out and discussed there, or contested on the basis of intellectual arguments.

Another matter is that COVID-19 still largely halts travel to China and thus hinders personal encounters, making it difficult to add new people to the conversation. Current online forums favor long-established and heavily trusted ties. Fortunately, expanding the conversation by snowballing beyond these ties seems to still work. My personal experience with online exchanges confirms that more than ever there is a tangible interest, including among colleagues in China, in keeping up the conversation, continuing to cooperate and learn, and trying not to let external pressure and interference blur our views of each other. Giving up on this opportunity now would be detrimental. But the online dialogues can go only so far. It will be a task for different scientific associations and academies to put them on a regular and broader (and offline) basis in the future.

Finally, while intense exchanges about meta-topics such as research ethics and work structures are extremely valuable, in the end what will count most is that researchers actually work together (again) in postpandemic times. Especially in my field, the social sciences, research and collaboration opportunities were already significantly limited for several years before the pandemic, due to the increasing levels of control in China over access, data, and publications. In addition, we must now address the heightened sensitivities in North American and European societies about the legitimacy and value of our collaborations. We scientists and scholars on all three continents must find honest and compelling ways to fight for the future of science relations.

Head, Lise Meitner Research Group: China in the Global System of Science

Max Planck Institute for the History of Science

Berlin, Germany

Investing in European Innovation

It’s not (just) the economy, stupid!

When Daniel Spichtinger writes, we pay attention. But after reading “From Strength to Strength?” (Issues, Summer 2022), we wondered whether his analysis of the problem was right. The article’s premise is that more money for research will ensure that Europe remains a science research superpower, which is why we chose to rewrite the famous political slogan from the Clinton era as our opening sentence. We think it is more complicated than just money. People are the most important resource in research, and as long as universities struggle to manage that resource responsibly, they are not getting any more of our hard-earned tax money.

Governments should be investing in well-functioning systems, but is academia such a system? Most people in academia who are not white, able-bodied, heterosexual men wouldn’t think so. Spichtinger refers to the renewal of the European Research Area (ERA). It’s interesting to notice that in this new ERA there is an increased focus on both diversity and culture. Why? Because research institutions are losing talent as they struggle to create inclusive research cultures. It damages more than the reputation of universities when people of color, LGBTQ+ people, and people with disabilities experience fewer career opportunities and more harassment, and often leave academia. When research institutions don’t attract and retain the best talent, they don’t produce the best research. The responsibility of creating a research culture that is attractive to a diverse array of people lies with the universities and doesn’t require more money. The ERA plan just shows that too many universities don’t take action until pushed. Academia has known about this problem for years, and nothing has been done. The sector’s habit of pretending all is good until being forced to change is immature. This is not a responsible actor that should be sent more money, no matter how hard it lobbies.

People are the most important resource in research, and as long as universities struggle to manage that resource responsibly, they are not getting any more of our hard-earned tax money.

And Spichtinger suggests more lobbying for more money for actors in research. But lobbying requires a coherent policy position that goes beyond “send more money.” It requires a well-defined place in society, where one takes responsibility, but Spichtinger’s article shows how large parts of academia appear to exist in a vacuum. Due to the legal aftermath of Brexit, the United Kingdom is not part of Horizon Europe, the European Union’s key funding program for research and innovation. Neither is Switzerland, due to a wider diplomatic fallout with the EU. Despite these severe legal disagreements, many universities in Europe have supported a campaign to include the research environments in the two countries in Horizon Europe. Why is research so precious that in this specific area there can be no consequences for Switzerland and the United Kingdom? Should nations be able to pick and choose which rules to follow?

We want to agree with Spichtinger. Europe should invest more in research. We just think that academia should do some serious soul-searching and develop mature and coherent policy positions internally and toward the world that it is part of. Scientific excellence begins with thriving researchers, and science is never conducted in a vacuum. Act accordingly, and we would be happy to see the research community receive the 3% investment of national gross domestic product in research that it seeks.

Director of DIVERSIunity

Codirector of Cloud Chamber

Together they host the Diversity in Research Podcast

Daniel Spichtinger touches upon a number of important issues in scrutinizing the European Union’s position of strength. With the overall geopolitical disruptions, the EU is facing two “fronts” in becoming a research and innovation powerhouse: one relates to the internal challenges of closing the research and innovation gap within the EU, while the other relates to its global role as a promoter of research cooperation.

Internally, the uneven capacities and investment intensities of its member states are posing a risk to accelerated development and the attractiveness of the EU as a science location. A special report from the European Court of Auditors confirms the general suitability of the widening measures implemented in Horizon 2020, the EU’s major research and innovation program. Challenges remain, however, such as the timing of complementary funding, sustainability of financing, recruitment of international staff, exploitation of results, and imminent disparities among the EU’s 13 newest member states.

Within Horizon Europe (the successor to Horizon 2020), actions taken under the widening participation rubric include the development of new instruments such as Excellence Hubs and the Hop-On Facility, along with a stronger focus on synergies with the Cohesion Funds (e.g., via the Seal of Excellence and the transfer of funds from the European Regional Development Fund to Horizon Europe or the European Partnerships). Although these instruments are in place, it is the main responsibility of the EU member states to deliver on the proclaimed consensus to make research an investment priority, especially with the new inflation challenge and strain on national budgets already looming. The value ascribed to science is contested, and communication measures to strengthen trust in science may be as important as fiscal measures.

From the international research cooperation perspective, a certain ambiguity seems evident. Europe’s expressed strategy for a global approach to research and innovation tries to strike a balance between reaffirming openness and stressing the importance of a level playing field, reciprocity, technological sovereignty, and respect for fundamental values in research and innovation such as academic freedom, ethics, integrity, and open science.

It is the main responsibility of the EU member states to deliver on the proclaimed consensus to make research an investment priority, especially with the new inflation challenge and strain on national budgets already looming.

These values and principles are currently being discussed in a multilateral dialogue with international partner countries to foster a common understanding and their promotion in future cooperation settings. However, the consequences of being a “like-minded country” respecting those values and principles are not yet obvious, nor is the opposite. Currently, 14 of the 16 associated countries in Horizon 2020 have already been associated to Horizon Europe; further negotiations with Morocco, Canada, and New Zealand will take place in fall 2022, as will exploratory talks with South Korea and Japan. So far, the values and principles have played only a marginal role at best in these negotiations. The enormous pressure geopolitical developments can put on international research and innovation cooperation has, however, become clear in light of recent events. Here, Spichtinger rightly raises the question of whether it is helpful to “use science as a stick.” And indeed, one has to assess if all issues the EU faces with different countries—from Russia and Belarus to China, the United Kingdom, and Switzerland—are of the same nature and warrant the current measures taken.

If the European Union wants to become a respected promoter of international cooperation in research and innovation, it is vital that partner countries and their research, higher education, and innovation organizations perceive the EU as being ambitious, fair, and impartial in advocating mutual benefits in jointly facing global challenges. Noncooperation could otherwise prove quite costly for the EU research and innovation community.

Department of International Cooperation and Science Diplomacy

Austrian Federal Ministry of Education, Science and Research

Remembering the Harrisons

Helen and Newton Harrison, You Can See that Here the Confluence is Pretty… From the Fourth Lagoon, The Lagoon Cycle, 1974–1984. Paper on canvas, acrylic gouache, collage, photographic print with ink, print, and pencil.

We are saddened by the news that pioneering eco-artist Newton Harrison passed away on September 4, 2022. Born in 1932, Newton graduated from Yale in 1965 with both a bachelor’s and a master’s degree in fine art. He secured his first faculty position as assistant professor at the University of New Mexico (UNM) before moving to La Jolla, California, in 1967 to cofound the Visual Arts Department at the University of California, San Diego (UCSD). Helen Mayer Harrison (1927–2018), who was known for her activism and research-based work in literature at UNM, chose to dedicate herself to the Harrison collaboration when they made a map of endangered species around the world for the Fur and Feathers exhibit at the Museum of Crafts in New York City in 1969. The Harrisons, as they became known, then collectively made the decision to do no work that did not benefit ecosystems. Their collaboration lasted nearly fifty years and led to the first husband-and-wife shared professorship at UCSD.

As part of the Getty Foundation’s Pacific Standard Time: 2024 initiative, the La Jolla Historical Society in San Diego, in collaboration with three other venues, will present Helen and Newton Harrison: California Work, a groundbreaking four-part exhibition about this pioneering couple, offering a critical reappraisal of their California-based works. The exhibition will highlight the Harrisons’ extraordinary art and science collaboration, which ignited the field of ecological art and fostered it for decades. Many artists will continue to be inspired by them, as artist and environmentalist Lillian Ball affirms: “They were the forces of nature whose ongoing influence will be felt throughout generations.” 

Tatiana Sizonenko, art historian and curator

Episode 21: To Solve Societal Problems, Unite the Humanities With Science

How can music composition help students learn how to code? How can creative writing help medical practitioners improve care for their patients? Science and engineering have long been siloed from the humanities, arts, and social sciences, but uniting these disciplines could help leaders better understand and address problems like educational disparities, socioeconomic inequity, and decreasing national wellbeing. 

On this episode, host Josh Trapani speaks to Kaye Husbands Fealing, dean of the Ivan Allen College of Liberal Arts at Georgia Tech, about her efforts to integrate humanities and social sciences with science and engineering. They also discuss her pivotal role in establishing the National Science Foundation’s Science of Science and Innovation Policy program, and why an integrative approach is crucial to solving societal problems.  

Transcript

Josh Trapani: Welcome to The Ongoing Transformation, a podcast from Issues in Science and Technology. Issues is a quarterly journal published by the National Academies of Sciences, Engineering, and Medicine and Arizona State University. I’m Josh Trapani, senior editor at Issues. I’m truly excited to be joined by Kaye Husbands Fealing, who is something of a living legend in the science policy community. Kaye is the dean of the Ivan Allen College of Liberal Arts at Georgia Tech. She previously taught for 20 years at Williams College and served in several positions at the National Science Foundation, including playing a pivotal role in creating the Science of Science and Innovation Policy, or SciSIP, program. On this episode, I’ll talk with Kaye about her work at Georgia Tech on integrating science and technology with humanities, arts, and social sciences, referred to as HASS. We’ll also talk about her career, and of course, I cannot pass up the opportunity to get her insights on the science of science policy. Kaye, thank you so much for being here.

Kaye Husbands Fealing: Thank you, Josh. It’s really great to be here with you today.

Trapani: I’m really delighted to have a chance to speak with you, because even though our paths first crossed directly only recently, I’ve heard your name numerous times in virtually every position I’ve held in Washington, DC, over the last 17 years. And your work particularly on science and science policy, as well as on science and innovation indicators, looms large over my career and those of many people who work in science and technology policy. And I’d like to ask you about some of that work. But let’s start with the piece that you and coauthors Aubrey DeVeny Incorvaia and Richard Utz have just published in Issues. In the piece, you argue that science and technology education must be better integrated with the humanities and social sciences, and you describe some of the work you’ve been doing to make this happen. One thing you mentioned that really struck me is that more than 75 years ago, Vannevar Bush, in Science, the Endless Frontier, warned against this separation. And we listened to Bush on so many things, but not on this. Why do you think this challenge has been so longstanding, and what is science missing by not doing it better?

Husbands Fealing: Great question, Josh. And I wanted to take that question in two parts. First, talk about the challenge that has been so longstanding that Aubrey and Richard and I have been working on, and then I’d like to turn it and talk a little bit about what’s missing or what we can do better. So along the lines of the challenges, the premise of our article is that creative possibilities lie at the intersection of science, engineering, art, humanities, and social sciences, and that investment has not been pulled together in those areas the way it could be for a terrific return. So Vannevar Bush wrote that to set up a program under which research in the natural sciences and medicine was expanded at the cost of the social sciences, humanities, and other studies that are so essential to national wellbeing, that to set up programs that way, we would be missing something.

He also said science cannot live by and unto itself. So I just want to expand on that a little bit, because that was really what drew me into thinking about writing about this issue regarding science policy. Richard is a humanist, and Aubrey is a terrific social scientist. So we wanted to combine those areas to really explore this idea of humanities, arts, and social sciences integrated with STEM (science, technology, engineering, and mathematics). So for example, if you think about, and you go back and look at science advisors, go back, let’s just not go back that far. Let’s just go back to Holdren and look at the priorities that were written by him for OSTP for fiscal years 2010 to 2017. Here is what you see. He calls out these priorities: needs of the poor, clean water and integrity of the oceans, healthy lives, clean energy future while protecting the environment, safe and secure America and weapon-free world, economic growth, and jobs. Added to that, in the same set of priorities: STEM education, high performance computing, advanced manufacturing, and neuroscience.

So you see the difference. Some are big topics, big global issues, where clearly HASS and STEM coming together can really address issues of the human condition. So go forward to Lander and Nelson. The most recent priority memo was written by Alondra Nelson. And there we see pandemic readiness, Cancer Moonshot, climate change, security, economic resilience, STEM education, but also innovation for equity, open science and community engaged R&D. So then you see that scale back to something that is a larger context where the humanities and social sciences and even the arts come together with STEM and R&D to try to move us forward as a country.

So my observation is that there could be an increasing laser focus on competitiveness, and there’s nothing wrong with that. But with that, you see the increased focus on very specific areas in science and engineering. But these big topics (needs of the poor, clean water, safety, security, economic growth, jobs) certainly do require this kinship between HASS and STEM.

So for me, that sort of disciplinary fragmentation is the challenge and something that we can actually try to work through better as a group of science agencies. So let me address the what-is-missing part. What is missing by not doing it better was your question. And as we wrote in the paper, STEM and HASS domains intersect in the challenges and threats that people face every day. So we’re trying to get back to those issues of the human condition where the humanistic lens is needed to elucidate problems, imagine solutions, and craft interventions. And these lenses allow us to think not only downstream about communicating science, communicating to senators, congressmen, the populace, international leaders. We’re not just talking about the communication part of it, but we’re thinking upstream about also trying better to have that understanding of what the problem is, the discovery process.

And we think that it is important to have this discovery, design, solutions and communication process integrated into this combination at the intersection of HASS and STEM. Now, let me say just one more thing, and that is, it sounds as though we’re saying that this is easy, it’s not. It sounds as though we’re saying without it that we’re failing. We’re not, we don’t want to give that impression. In fact, accolades for our scientific progress surely are very well founded. So we’re not saying that that’s not the case or that arts and humanities need the sciences to buttress them. We’re not saying that either. What we’re saying is that there is a possible adaptivity that can accelerate progress in STEM, in science, technology, engineering, mathematics, and also in the arts and humanities and social sciences if we could work together. And the other issue is that it’s also not easy because we have to develop a common lexicon. We have to develop trust across the sciences and the humanities to allow the benefits that we foresee to come about.

So we need a way of creating learning pathways, experimental pathways to see this happen, to see this take off. And I think it’s worth our attention to see how we could get about many of the discoveries and then solutions to issues that continue to plague us.

Trapani: Well, thank you so much for that great answer. That was really, really clear. And it just shows the importance of taking the holistic approach. The end of your answer actually teed up my next question perfectly, because I was wondering why it’s been so hard to develop and scale up integrative approaches to building these things together in education networks. And also, because you’ve been leading the way at Georgia Tech, what are some of the things that you and your colleagues have been doing to bridge the gaps?

Husbands Fealing: I want to answer your question by talking through a few things that we’re doing here at Tech and then really address this issue of the difficulties of developing these and also the scalability. So some examples of what we’re doing at Georgia Tech. For one, a two-semester junior capstone sequence that is co-taught between computer science and technical writing faculty. What’s interesting about this is that it’s an arrangement that not only sharpens students’ communication skills, but also inspires them to situate their scientific work in a larger context, for example, by considering how it will be received in a field rife with gender or racial bias. And so having the writing experts and scholars working directly with folks in computing allows both to advance, right? Because these writing scholars are also technical writers, which we all know we need at NSF or at the National Academies or places like that.

So having that flow between the two, HASS and STEM, STEM and HASS, that’s an example. Another example is EarSketch, which is now used by more than a million students worldwide. EarSketch integrates coding education with music composition, using music as a pathway to get students worldwide to learn to code. And so it’s really fantastic here to see that interdisciplinarity between the College of Liberal Arts and the College of Design, together with the College of Computing, and more than a million students worldwide are using something in the arts, music, to learn to code. So it’s really important that students are trained to think across a range of disciplines and to leverage their exposure to diverse methodologies to better understand and tackle complex problems. So why is this so difficult, and how can it be scaled? I think the difficulty goes back to something I said a little earlier, which is, we do need to develop a common lexicon and we do need to develop a sense of trust across these disciplines.

Even if you’re working just within HASS, the social scientists, economists, sociologists, and political scientists are not all coming from the same place, and they now are working with computer engineers or biomedical engineers or across different avenues. Another area of difficulty, which we can work on because there are ways of dealing with it: how do you assess the return on investment from this complex combination of humanities and science, or arts and engineering? Typically we’re looking at number of papers, number of patents, number of grants, how much funding comes from those grants. But those are not necessarily the ways in which we should be assessing the breakthroughs that come at this intersection. And there are ways of quantitatively but also qualitatively measuring those breakthroughs. And I will put on the table that one important product is this talent pool, an amazing talent pool.

And it’s not just the first job that they get and then we measure, think about, well, how much did they earn. But it’s really five years, 10 years down the road, sometimes even longer, where you see the amazing results of resilience and agility of the students that are coming out of these programs.

The second part of your question here about scale up, I think that we miss opportunities by focusing only on the private sector in terms of the outputs of R&D and that there are many ways in which innovation benefits the nonprofit sector and the government sector. There is innovation in government administration and there are ways of using some of these outcomes and some of these products to really have innovation in sectors other than just the private sector. Although the private sector, obviously, industry is really one of the main recipients of our investments in R&D, and it should be. There’s no reason to argue that. But I’m just trying to say here that we could expand on that a little bit.

A second part is that, I’m an economist, so I have to say, when I think of scale, I think of economies of scale and economies of scope. And it’s one thing to say scale up the same, and it’s another thing to say, well, look for the different use cases, things that are combined, how can they be used in the environmental area, or in the health area, or in security, all the things that we talked about at the beginning, including getting to zero poverty, things that are really primarily top of mind to the ordinary citizen. And so thinking of not only how these combinations can be used to advance science and also to advance the social sciences, the arts, and humanities, but also what are those use cases? Those are the things that are salient, those are the things that sing. Those are the things that really make sense to the ordinary citizen, and therefore support for these investments, I hope, can be better articulated when we’re able to do those types of combinations and actually do that kind of communication.

Trapani: Your background as an economist came through loud and clear in that answer, and I wanted to turn to that next. So beyond your distinguished academic career, you’ve also played important roles outside the academy, including some key ones in science policy. In particular, you played a seminal role in developing and leading the National Science Foundation’s Science of Science and Innovation Policy program, as well as leading the Science of Science Policy Interagency Task Group. Now, before I came to Issues, I also served briefly as an NSF program director, and I can say based on my experience that most program directors don’t get to start new programs, lead interagency groups, and work directly with the director of the White House Office of Science and Technology Policy, or OSTP, as you did with Jack Marburger. So I’ve been curious to talk to you for a long time and to ask you if you could talk a bit about that time and how you first got involved, and what you and others who were working on it were hoping to achieve.

Husbands Fealing: Thank you. That was a great time. I got to NSF, the National Science Foundation, in 2005 as a program director in the economics program, one of three program directors. Eleven months in, so fast forward to 2006, I was asked by David Lightfoot, then the assistant director for the Social, Behavioral and Economic Sciences directorate. He said Jack Marburger gave a talk at AAAS in 2005 where he called out the social sciences and said, “You need to stand up and be really part of this process of trying to get the evidentiary basis for funding science,” and that we needed to stand up and take that responsibility to do so. David Lightfoot, Mark Weiss, and Wanda Ward were all in SBE at the time, and they said, we have social, behavioral, and economic scientists, so the behavioral sciences also are part of this.

And we also have an arm. At the time it was SRS, now it’s the National Center for Science and Engineering Statistics. So we also had this quantitative part of us as a directorate. So they said, well, what can you do to draft something that would give us the platform to start something called science metrics. That’s what they called it, science metrics. But yet they want the sociologists and the behavioral scientists and others to be part of it. So it couldn’t just be metrics. So we knew it had to be science of science, which was something that existed, which means basically, what are the frameworks, the models, the tools, the data sets that are needed to make good decisions on funding science, or to make good decisions on how teams should be assembled to do science and so on. That’s the scientific foundation for science. So Science of Science and Innovation policy made sense, because at the end of the day, we want to have the evidentiary basis for policymaking. And that’s precisely what Dr. Marburger said he wanted.

So by the fall of 2006, I had finished writing this prospectus, with a lot of input from a lot of folks who were at NSF at the time, and showed it to Dr. Marburger. Obviously, David Lightfoot did that; I was a program director. And they came back and said, it’s a go. So we wrote the solicitation that fall. We were on continuing resolution. In February 2007 the continuing resolution lifted. The solicitation went out, and by that summer we ran the panel, funded a number of proposals, and had our first round. So from the summer of 2006 to the summer of 2007: prospectus, solicitation, proposals came in, proposals vetted, funded. It was a quick clock. I won’t give you all the details, but here are the categories that we funded in the first round. One was human capital development and the collaborative enterprise related to science, technology, and innovation outcomes.

So we did a lot there, including some work on the US, Mexico, and Brazil. Biomedical, nano, hydrology: it’s all that foundational work behind funding those types of sciences. Another was returns to international knowledge flows, and one test case was biofuels. A third, creativity and innovation. This is really interesting. This came out of the behavioral sciences: cognitive models of scientific discovery and innovation. Chris Schunn from Pitt was doing work where he would observe how engineers did work in labs and what cognitive processes were going on, so that we can understand ingenuity. So not just the commercialization, but all the way back to the ingenuity and that process. We funded that project. Another set of projects was knowledge production systems, looking at big systems, risk and rewards, low-carbon energy technologies, and things like that. And the last category was science policy implications.

And at the end of the day, everyone always wanted to know: not only did you find the evidence behind how to fund or arrange activities in science better, but how did it affect science policy? And I’d say that we had the foundations of that even in the first round in 2007 in the SciSIP program. Very pleased about that. Dr. Marburger was very pleased about that. And fast forward to when Julia Lane took over after I left as program director: she, Stephanie Shipp, Dr. Marburger, and I, the four of us, wrote the preface of a book and then had many collaborators give contributions to the Science of Science handbook. And we finished that; I think it was published in 2012. But it was fun working on that with Dr. Marburger. So that gives a little bit of background on Science of Science and Innovation Policy. Dr. Marburger really did give the charge for this, but it was fun. And yes, program directors at NSF do get to do a lot of other things. So it was good for us.

Trapani: Well, that’s really remarkable. Thanks for telling that story. I don’t know that I’d ever heard it quite so succinctly and concisely, the very early days. So I guess it’s been 10 years or more though since then and I was wondering from your perspective, how has the landscape for the Science of Science policy evolved since those days, and how far do you think we’ve come in meeting some of these challenges and what remains to be done?

Husbands Fealing: In terms of the advances that have been made, I think we have better models and frameworks that integrate across economics and sociology especially. I think the original setup of this program envisioned having more domain scientists working with the social scientists especially, and behavioral scientists, and I think we’re making advances there. We’ve made many advances on the data side. The part where we could do more is the behavioral space. I don’t think that we pulled the behavioral piece into SciSIP, which has now been renamed Science of Science, as much as we wanted to at the beginning. In the prospectus, there was a real emphasis on creating a community of practice, and that would not only be academics, but it would also be individuals in a variety of agencies. The Interagency Task Group had representatives from 17 agencies that were part of the NSTC in the subgroup on social, behavioral, and economic sciences.

And the idea was to try to get more of the agencies to take on this Science of Science approach, but it would need funding, it would need to be a priority, it would need leadership. So I think that that’s something that’s still ongoing. I think the biggest question we get often is, well, how has this affected policy? And I don’t think that we’ve done the work to show that mapping distinctly between the science of science and policy changes. It’s hard to do. But I think that that is something where we still have a way to go. And the last thing I would add: Kellina Craig-Henderson is now the AD for Social, Behavioral and Economic Sciences. She and I rotated to NSF the same year, 2005, and she’s been there for a long time. And back then she really was working hard and diligently on the science of broadening participation in STEM.

And it is something that to this day we’re still thinking about and talking about. Dr. Panchanathan, the current director of NSF, is very focused on this. The NSB is very focused on the missing millions. And they just created a new program called GRANTED to give R-2s and other universities the infrastructure so that they can apply effectively to NSF and get the grants to perform science and engineering activities at their institutions. And so I think the Science of Science, or SciSIP, depending on what you want to term it, has an area to contribute on the science of broadening participation. And this is the time, because this is something that Pancha is talking about all the time, along with the National Science Board and others. And it is in the priorities from OSTP: innovation for equity. So I think we have an opportunity to keep moving along this line of Science of Science, or Science of Science and Innovation Policy, especially at this time.

Trapani: Well, while I was there, I briefly ran the Science of Science program. We put out a special call that we called BP Innovate, and it was about building an understanding of the science behind what leads people to enter entrepreneurial activities or not, the incentives and disincentives that are there, and how that varies across people’s race and gender and geographic background. And that was a one-time thing, but I think it’s something that they are planning to repeat. I’ll just add that you mentioned a lot of names and places that I know. And I would just mention that Julia Lane just had a piece in Issues in Science and Technology that lays out a vision for the Evidence-Based Policymaking Act.

And one more Issues plug before I move on—you mentioned the National Science Board, and last year we had a piece by the chair and co-chair of the National Science Board that was partly focused on the need to broaden participation. So this is very much the conversation that’s going on today. My sense is that this field, the Science of Science policy, or Science of Science, is multidisciplinary, as you described, led by the quantitative social sciences. But to get back to your piece, you called for more than just that. You wanted more multidisciplinarity, including the humanities, to be built into science policy. And I wonder if you could speak a bit to what that would look like and what benefits you think it would bring.

Husbands Fealing: I’d like to see, for example, use cases where we can actually see the advancement of science using these activities. And also, I’d love to be able to see, and this I got from talking to our executive vice president for research, who read the article and came back and said, we also need more science in art. And so consider that. Consider that. I want the listeners to think about what that means. So, the art in science: there are ways of visualizing; there’s something called medical humanities. There are ways of using those activities within the arts and music to improve not only outcomes in medicine for individuals, but also, hopefully, to really get at the kernel of issues in an interesting way using maybe art or visualization techniques that come from the humanities, arts, and social science side. But the other challenge here was also, well, what if someone who is using materials or different types of paint understood the chemical processes or the composition in a way that actually enhances the product on the art side?

So that’s the vision; it may be out there a bit. It’s off the beaten path, but one of the things we’re trying to do here at Georgia Tech is create an area called Art Square. Now, Art Square is somewhat on the periphery of campus, but they’re building Science Square not far away. So imagine if we could really collaborate across those. And we’re also not far from the new Lewis Center, so the DEI aspect, or DEIA aspect, of this could come into play as well. So it could be a fulcrum, it could be a hub, it could be an area where we can really see advances that we hadn’t thought about by doing this. And for me, it’s an experiment. It’s something that’s worth investing in. We are doing much of it here at Tech, and I don’t want to make it sound as though we’re not doing this, but there is so much more that we can do to see this type of integration.

Another area is that we want to be able to understand how this intersection of HASS and STEM will improve policy. So it goes back to your previous question about Science of Science and Innovation Policy, that foundational element behind policy. Well, could policy be crisper, more nuanced, more connected to communities if it includes the humanities part of it? And there is the National Academies report Branches from the Same Tree. What is important for us to remember is this cleaving, this disproportionate investment over these many decades. We really have to go back to the fact that maybe that didn’t need to happen, and we can do something that corrects that split and see better integration and investment in that integration. So that’s the vision.

Trapani: This has been such a wide-ranging conversation. I really appreciate your time and your insights, but I have one more question before we go. I was wondering if you have any advice or perhaps lessons learned from your experience for younger people who are interested in or getting started with careers related to science and technology policy who want to have a positive impact?

Husbands Fealing: Sure. I like that question very much because I’ve been in the business, so to speak, for more than 33 years. So I’ve been a professor for a long time and students are a top priority and it’s really important for us to have some takeaways that students can dig into.

I have three things I want to put on the table. Broaden your networks. And we’re not just talking HASS and STEM now. We’re talking the networks that students can really utilize, not just to get access to economic and social mobility, but also to find pathways, career pathways, and those networks will really allow that to happen. The second piece is that a focus on the humanities or social sciences, if you’re an engineer, is not a distraction. It’s actually an enhancement of your area of expertise in the sciences and engineering and computing. So I’d like to just put that on the table, because oftentimes you may be chided: “Well, why are you doing that? Just spend 10 more hours in the lab and it’ll be better.” I want to say no. There’s a lot of benefit from having these other lenses to really do the exploratory work.

So the humanities or social sciences are not a distraction; they could really be additive. And the third thing I’d like to say, because I had to really think as you asked the question, what else would I want to put here? I have to say, I have a math degree and an econ degree, and I was not a writer. I was not a person who did a lot of writing. I crunched equations. I loved QED at the end, especially when I knew I was right. And when I was a math major, the task was to solve the proof in as few steps as possible. I loved that. But I will tell you, good writing, great communication, telling the story: there’s nothing more salient than that to put all of that hard work into people’s minds so that they understand what you’re talking about.

It’s even important if you want to be an entrepreneur, it’s important if you want to set policy, it’s important if you want to let other students understand what you’re working on, in terms of these peer effects that I talked about before. So please write, figure that out. It’s not always easy, but it’s so incredibly important.

Trapani: As an Issues editor, I’m going to transcribe the part of your answer about writing. We’re going to put it on our homepage and I’m probably going to put it on a t-shirt too and wear it everywhere. Thank you. Kaye, it’s been delightful to talk with you. Thank you so much for being here.

Husbands Fealing: Thank you. This was a pleasure.

Trapani: This has been a wonderful conversation. And thank you to our listeners for joining us. As Kaye notes in her piece, Yo-Yo Ma once said, “Culture turns the other into us.” Science and technology have for so long seen the humanities and arts as other, and it’s time we turn them into us.

To learn more about how we can achieve that, read Kaye Husbands Fealing, Aubrey DeVeny Incorvaia, and Richard Utz’s piece in Issues, entitled “Humanizing Science and Engineering for the Twenty-First Century.” Find a link to this piece and the others we mentioned in our show notes. Subscribe to The Ongoing Transformation wherever you get your podcasts. You can email us at [email protected] with any comments or suggestions. Thanks to our podcast producer, Kimberly Quach, and audio engineer, Shannon Lynch. I’m Josh Trapani, senior editor of Issues in Science and Technology. See you next time.

Is Open Source a Closed Network?

In “Architectures of Participation” (Issues, Summer 2022), Gerald Berk and Annalee Saxenian present a compelling question: “Given the complexity and divergent trajectories of today’s innovation systems, how should public policy foster innovation and openness, and support the process of making data more accessible?” In the 2000s, the authors note, open-source ecosystems, or “networks of networks,” propelled Silicon Valley’s first wave of internet innovations. Since the 2010s, however, as computational capabilities and software systems became larger and more complex, a handful of dominant platforms—Amazon, Google, Microsoft—appear to have abandoned the openness of the previous era.

These giants have done so in a number of ways, including by restricting access to their application programming interfaces (or APIs, which simplify software development and innovation by enabling third parties to exchange data and functionality easily and securely), by acquiring start-ups, and by developing their own proprietary (closed) systems. For example, in order to maintain and enhance their market share, these firms’ cloud platforms impose “anti-forking” requirements that restrict access to data and block software developers from building out new applications on their platforms. Also, as Berk and Saxenian write, critics say Amazon Web Services has improperly “copied the open-source code for a pioneering search engine named Elasticsearch and integrated it into its proprietary cloud services offerings,” thus making it harder for smaller companies to use the search engine to market their products. In response, the authors note, “at least eight open-source database companies, including Elastic, modified their licenses, making them so restrictive that they are no longer considered open-source by the community.”

Berk and Saxenian offer several prescriptions for policymakers, regulators, and jurists to consider. Antitrust law, or competition policy, offers one tool to restore and foster an open-source collaborative ecosystem. In opposition to today’s centralizing tendencies, the authors argue that interoperability and “the democratization of the use of data” are key to high-quality, fast-paced innovation. As they highlight, Google Cloud already collaborates with the Linux Foundation and shares revenue with its smaller partners. But, of course, the worry—at least in my mind—is that Google will not maintain the partnership when it sees an opportunity to build out proprietary products or services, particularly in areas complementary to its existing dominant position in internet search or mobile phone operating systems. As a Department of Justice complaint against Google argues, the firm appears to have used restrictive licensing and distribution agreements with hardware producers for its Android operating system, requiring producers to place Google Search along with a bundle of other Google applications in prominent places on the screen. These cannot be deleted by users, and these contracts contained anti-forking requirements as well.

The DOJ has a strong case against Google, but what remedy should the agency seek? Here, Berk and Saxenian cut to the heart of the matter. Neither breaking up Google nor simply prohibiting the distribution agreements will restore the dynamic, innovative engine of open-source collaboration. We need to think beyond the dichotomy of market competition versus monopolistic or oligopolistic firms in order to identify creative, forward-looking solutions. Here, we might draw from recent examples, such as the 2001 Microsoft consent decree that required interoperability for competing browsers on Microsoft’s operating system. Moving forward, courts, regulators, and legislators should consider the procompetitive benefits of interoperability and information pooling, which may answer these authors’ call for both fostering collaborative competition and maintaining economies of scale.

Associate Professor

University of Georgia School of Law

Gerald Berk and Annalee Saxenian focus on the information technology industry in Silicon Valley and the relationship between open-source and proprietary-platform companies. They base their article on interviews with a range of software developers and managers, but significantly all from inside the industry. They argue that the open-source segment of the industry is especially innovative, that the proprietary segment can and does work with open-source companies, a partnership that further strengthens the innovative capacity of the industry as a whole, and for that reason ought to be promoted by public policy. They focus on antitrust policy, but the logic would apply to other domains of policy, most particularly the immigration of high-skilled workers from abroad.

I find their argument unpersuasive and ultimately disturbing. My main concern is the way it hinges on “innovation” as if that were the singular goal of public policy and could be pursued independently of other policy issues and debates. In effect, the authors seem to be arguing that society cannot have too much innovation.

Two specific issues at the forefront of current policy debates call into question the value of unlimited innovation. One is the debate about the impact of artificial intelligence and robotics on jobs, and the fear that workers are being displaced more rapidly than they can be absorbed into other parts of the economy. The other centers on the distribution of income and social mobility; in particular, Silicon Valley is a bifurcated economy, with one sector of highly paid software developers and other professionals and another of low-paid service workers catering to the professional class. I would be much more persuaded of the value of open source if the authors could show that it was not only more open to ideas but also more open to a demographically diverse workforce.

The failure to recognize these issues reflects in part the limits of the analytical lens through which the authors are working. That lens is, as the authors recognize, the network structure of the industry. Network theory in the social sciences has basically been concerned with the interaction of members within a defined network. It has not been especially concerned with the boundaries of the network, how its members are recruited, trained, and socialized to the norms and standards that govern their relationships. Indeed, it does not generally recognize that social networks are typically closed to outsiders.

A limited ethnographic literature looking at what software developers actually do on the job suggests that a better lens through which to view the industry is probably craft work, where skill and proficiency are acquired through experience on the job and close interactions with more senior developers. However, because experienced workers in such a system have to work closely with “apprentices” in order to train them, they tend to resist admission of new members from backgrounds very different from their own. The classic example is the skilled trades in the construction industry, which are notorious for resisting government pressure to admit workers from underrepresented demographic groups, particularly women and ethnic and racial minorities.

An important difference between the authors’ focus on innovation and the craft analogy emerges in the debate about expanding visas for highly skilled foreign workers. In the authors’ perspective, expansion is promoted as an impetus to innovation. In the perspective that recognizes other social goals, such expansion is also a way of avoiding pressures for the upward mobility of low-skilled immigrants and their children.

David W. Skinner Professor of Political Economy, Emeritus

Department of Economics

Massachusetts Institute of Technology

Authoritarian Environmentalism

In “China Planet: Ecological Civilization and Global Climate Governance” (Issues, Summer 2022), Yifei Li and Judith Shapiro seek to explain not only whether China can uphold its climate promises, but also whether the costs of achieving these promises would be worth it for the democratic world. They point out that answering those questions would “require transparency, accountability, and social equality—all of which are in short supply in the Middle Kingdom.”

The article makes a significant contribution by challenging and deconstructing China’s green image in the era of President Xi Jinping’s “Ecological Civilization.” Since 2012, China’s leadership has been using a green “China Dream” discourse that connects domestic environmental actions to global leadership on climate change and the “glorious revival” of the Chinese nation. This discourse often speaks of green policies in glowing terms, such as “green mountains are in fact gold mountains, silver mountains.” However, whether, and at what cost, the Chinese leader’s lofty rhetoric has been translated into environmental outcomes in practice is a question that is much harder to answer.

One key point the authors emphasize is the many nonenvironmental consequences of coercive, state-led “authoritarian environmentalism” over the course of China’s making and remaking of international climate politics. They believe that instead of serving to achieve sustainability, China’s proclaimed emphasis on ecological civilization is actually a means to strengthen the Communist Party within and outside China. Therefore, a more accurate term than authoritarian environmentalism would be “environmental authoritarianism,” which resonates with the comparative environmental politics literature.

The rise of environmental authoritarianism reflects the long debate about the relationship between China’s regime type and its government’s environmental performance. Considering the climate challenges that liberal democratic systems have faced, critics have questioned the performance of liberal democracies, and especially their capacity to lead global climate change governance. The concept of environmental authoritarianism brings together these doubts about democracy as a favorable and capable model for environmental decisionmaking and governance, and China is widely regarded as a preferable example. Supporters of environmental authoritarianism assume that a centralized, undemocratic state may prove essential for mounting major responses to growing, complex, and global environmental challenges.

Li and Shapiro deeply engage with the ongoing debate and provide an insightful answer. In their opinion, “Although China has seen some success in realizing its ambitious climate goals, the country’s achievements have come at a social and political cost that few democracies could—or should—tolerate.”

Their observations might inspire anyone who is curious to critically explore the following two questions:

First, why did the Chinese government intentionally select the ecological civilization green discourse as the “clothing” of authoritarianism? Could it be a double-edged sword for China’s governing party to maintain its legitimacy? In the formerly communist Eastern European countries of Ukraine and Poland, environmental crises led to national social movements that presented significant challenges to their governments’ political legitimacy. Do the political elites in China have the same concern that using environmentalism as a cover might in the end turn out to be “lifting a stone only to drop it on your own feet,” as an old Chinese proverb predicted?

Second, why does the western liberal world still want to cooperate with China on climate change governance if China is not genuinely interested in green values and is using environmentalism only to maximize its power in the international community and increase control of its own institutions and citizens?

Associate Professor of Environmental Politics

Renmin University of China

To tackle the climate crisis, it is necessary for China, the world’s largest emitter of greenhouse gases, to take decisive actions to cut emissions. Almost paradoxically, China at the same time dominates the supply chains of technologies that are needed for the world to transition to renewable energy.

Meanwhile, in democracies, inaction and messy fights among government bodies and interest groups have left many people frustrated with the democratic process’s capacity to address climate change. This, and the fact that the climate fight hinges so much on China, has created a willingness by some governments to overlook the problematic ways China approaches climate change.

Yifei Li and Judith Shapiro, in their essay as well as in their 2020 book, China Goes Green, caution about the perils of China’s top-down, numbers-based, and tech-driven environmental governance model. As a human rights researcher who over the years has witnessed and documented the tremendous human rights cost in the Chinese government’s pursuit of grand development goals, I am relieved to see scholars in the environment field sound the alarm about the “China model.”

As Li and Shapiro discuss, the burden of making the 2022 Beijing Winter Olympics green “fell primarily on China’s most vulnerable and politically disenfranchised.” Similarly, to reduce coal consumption, some local authorities have banned the burning of coal, including for home heating, without consulting the affected communities, forcing people who couldn’t afford alternative energy sources to freeze in the winter, and fining those who secretly burned coal.

China’s supply chains for renewable energy are also riddled with human rights violations. Almost half of the world’s supply of polysilicon, a key component of solar panels, is produced in Xinjiang, a region where government abuses against the 13 million minority Uyghur Muslims “may constitute … crimes against humanity,” according to a recent United Nations report. In Guinea, to mine bauxite, a primary source of aluminum, which is a key component of electric vehicles, Human Rights Watch documented a joint venture linked to a Chinese company that pushed farmers off their ancestral land and destroyed their water sources. The dust produced by the mining also caused respiratory illnesses in villagers.

Assessments of the Chinese government as a climate model should take into account that it bans independent media, stringently controls the internet, and routinely jails government critics. The human rights abuses that are publicly known are only the tip of the iceberg.

Li and Shapiro also call into question the sustainability of the Chinese government’s rights-trampling climate fixes, arguing that they “cause people to become confused, angry, and even hostile to the climate cause,” and that “better outcomes are achieved when grassroots, citizen-driven environmental initiatives and projects become trusted partners with the state.” This corresponds with Human Rights Watch research on climate and human rights globally. Robust and rights-respecting climate action requires the full and meaningful participation of all stakeholders, including governments, activists, civil society groups, and populations most vulnerable to the harm of climate change. Doing away with human rights to address the climate crisis is not only ethically unacceptable, but also fundamentally ineffective.

Senior China Researcher

Human Rights Watch

Yifei Li and Judith Shapiro raise many good points about China’s dominance in global infrastructure construction and development.

To add to this discussion, I’d note that it is one thing to build the infrastructure, but quite another to manage it reliably. China’s safety cultures may be far less exportable—despite the country’s expansive outreach through its Belt and Road Initiative. Indeed, we need to first know considerably more about China’s track records in high reliability management of infrastructures. I have in mind particularly the real-time management of the nation’s high-speed rail system and coal-fueled power plants, and of the backbone transmission and distribution of electricity and water supplies in large metropolitan areas.

We know that infrastructure data are in short supply from China, but it is important, I think, that data gaps be differentiated going forward by both types of infrastructure and their management cultures. How to fill these gaps? I know of no real substitute for Chinese scholars willing, even if currently unable, to analyze and research these major topics further.

Senior Research Associate

Center for Catastrophic Risk Management

University of California, Berkeley

Imagining a Better Internet

The future of technology is too often imagined and engineered by corporations and venture capitalists, foreclosing more radical possibilities. Today, it is Meta’s iteration of the Metaverse that dominates headlines, a riff on an old theme: the monetization of networked social life. Apparently the future includes stilted, legless avatars in VR versions of Microsoft Teams meetings. After the launch in 2016 of Facebook Live, Facebook founder Mark Zuckerberg called for the formation of a twenty-first century “global community” through technology, harking back to Marshall McLuhan’s “global village” of the 1960s. But who and what is a community for? As critics such as Safiya Noble, Virginia Eubanks, Ruha Benjamin, Sarah T. Roberts, and Siva Vaidhyanathan have long argued, the democratic ideal of “everybody” connecting to the internet through social media platforms is undermined by the narrow visions of elite technologists.

It doesn’t have to be this way. Kevin Driscoll’s “A Prehistory of Social Media” (Issues, Summer 2022) helps us reimagine internet futures by looking to the many nets of the past. Rather than drawing on a singular narrative—the straight line from ARPANET to the World Wide Web and, eventually, platform supremacy—Driscoll emphasizes the grassroots, locally situated networks that emerged from the growth of the personal computer. Individual enthusiasts started bulletin board systems, and rather than relying on opaque terms of service and precarious content moderators to manage people’s relationships to the network, you could contact the volunteer owner directly or perhaps even meet in her living room.

Web 2.0-era social media platforms are a departure from early community networks, a diverse ecology with subcultures that matched their location and participants: Amsterdam’s DDS, the Boulder Community Network, Montana’s Big Sky Telegraph, the Blacksburg Electronic Village, the Seattle Community Network, and Berkeley’s Community Memory project. Such community networks were tied to specific places, not anonymous cyberculture. Even for electronic communities that were mostly associated with online interactions, there were some in-person encounters. Members of the Whole Earth ’Lectronic Link, known more popularly as The WELL, for example, met at potluck dinners around the Bay Area, even if the early virtual community was open to anyone regardless of location.

As Driscoll notes, while a plethora of networks for queer and trans people, Black people, and others from marginalized communities flourished, even grassroots networks are plagued by the ills of society. From the earliest days of cyberculture, critics pointed out that race, gender, sexuality, and embodiment cannot be left behind in cyberspace. Rather, racism and sexism structure people’s experiences in virtual environments as they do IRL.

While the past wasn’t perfect, disenchantment with digital advertising and surveillance models has catalyzed nostalgia for earlier internets. GeoCities, founded in 1994 as “Beverly Hills Internet,” fostered community through web-based neighborhoods and user-generated content. GeoCities closed in 2009. Neocities, launched in 2013, is an unrelated homage website that calls for bringing back the independent, creative web. Similarly, SpaceHey, a reimagined version of MySpace, is intended to revive the original site’s ability to teach coding skills to young people. Folk histories of the internet provide an entry point for using many pasts to envision a multiplicity of futures beyond Big Tech.

Director of Developer Engagement

Intel

The indefinite article in Kevin Driscoll’s title is its most important part: “a” prehistory, not “the.” Some arguments narrow to closure, arriving at “the point.” In the course of convincing us, Driscoll’s argument instead opens out, welcoming us into the big country—literally and figuratively. He makes the case for modem culture and bulletin board systems as salient antecedents of contemporary online culture, and uses that point to bust right through the simple, received narrative of how “the internet” came about. In opening up the past, he opens up the future too: the history of networking computers together is a reservoir of alternatives. His history offers other technologies, other communities, other applications, other ways of being online—many of them better, for various values of better, than what we’ve got.

The engineers in Cambridge and Palo Alto created much of the fundamental infrastructure, but the way it is used can be better understood by starting with the modem on the kitchen table or the garage workshop in Baltimore or Grand Rapids.

Other places too: the big country. Notice the geography of Driscoll’s prehistory, rattling off place names like a Johnny Cash song. Chicago, Atlanta, Northern Virginia, “Alaska to Bermuda, Puerto Rico to Saskatchewan.” Notice how much of it happens in people’s homes in cities and towns across the North American continent. You can count the locations of the popular narrative of the internet on one hand: the research powerhouses SRI International, the Massachusetts Institute of Technology, and the University of California Los Angeles, and in the corporate world maybe Bolt Beranek & Newman (now BBN Technologies) and Xerox. It does not detract from that history to point out that it was only one of many ways that people were networking computers together—and a highly specialized, idiosyncratic one at that, reflective of the agendas of big science, Cold War R&D, and the nascent tech industry. Driscoll reveals how people outside this domain were connecting their computers for their own purposes in ways that prefigure the internet’s broad adoption much more accurately than electrical engineers with security clearance. The engineers in Cambridge and Palo Alto created much of the fundamental infrastructure, but the way it is used can be better understood by starting with the modem on the kitchen table or the garage workshop in Baltimore or Grand Rapids: the laboratory of digital social media that came before the internet.

Driscoll’s story reminds us that internet is a verb as well as a noun: internetworking the many and varied networks of computers together. Networking and internetworking comprise the labor of making BBSs, launching AOL, getting onto Usenet, rolling out ATMs in banks and convenience stores, tying satellites and radio telescopes into a computation grid, or setting up servers. How the networks work is an expression of agendas, ideologies, and expectations, and the “modem world” clarifies how different such agendas could be from our era of platform dominance and vertical integration of industry. There is no “history of the internet,” in other words, only histories of all the ways computers were and are networked and internetworked. Histories, and possible futures such as the one Driscoll gives us here: social media that are local, personal, inventive, messy, communal, and do-it-yourself.

Professor

Science and Technology Studies

University of California, Davis

Kevin Driscoll offers a fascinating account of the rise and fall of the bulletin board system (BBS), in the process highlighting an important path not taken in the history of digital communication and the internet. Unlike government-funded networks of the era, BBSs were spaces for experimentation and innovation driven not by funders’ goals but users’ own idiosyncratic interests and desires. In the process, they found new ways to work within the limitations of the existing phone infrastructure to create truly international connections.

As Driscoll recounts, whatever their reason for joining the “modem world,” users of a local BBS were able to make social connections and build community with other users well beyond their immediate social circles. For some users, pseudonymity allowed them to explore aspects of their identities, such as sexuality and gender identity, that they didn’t feel safe discussing anywhere else. In their governance and design, individual BBSs and associated software reflected their system operators’—or sysops’—own personal and political investments. The time that Tom Jennings spent in queer punk spaces shaped his focus on repurposing existing infrastructure to develop DIY solutions—a foundational aspect of his “Fido” software that, as Driscoll notes, became an open standard for exchanging files and messages between BBSs. Sister Mary Elizabeth Clark, who founded AEGIS to carry information about living with HIV and AIDS, brought what she’d learned about information dissemination during her years as a transgender advocate and activist to her work at AEGIS.

Unlike government-funded networks of the era, BBSs were spaces for experimentation and innovation driven not by funders’ goals but users’ own idiosyncratic interests and desires.

Beyond how it shifts our focus away from ARPANET and other state-sponsored internet infrastructure, Driscoll’s essay also provides fertile new ground for reimagining what the internet could be. As he shows, most BBSs operated on a far smaller scale than current platform monopolies. With that smaller scale came a distinctly different sense of sociality. For many people, socializing online via a BBS became a conduit for building community offline. Unlike on current platforms, community moderation disputes were settled not by a faceless corporate entity, but by an identifiable member of the community invested in its continued success. As she recounts in her book Cyberville, Stacy Horn, the sysop of EchoNYC, encouraged users, at different points, to be active participants in her decisionmaking process regarding board governance and moderation.

This smaller scale and sense of community investment can be particularly potent for individuals who are often the targets of harassment and abuse online. What would communities created by and specifically for these individuals look like? How could their design use key features of the BBS, like pseudonymity, locality, and accountable governance, in ways that not only meet these users’ specific needs, but also ensure that they feel comfortable communicating online? Moreover, the history of the BBS includes a variety of models for monetary support not based on harvesting and reselling user data. Focusing on sustainable models of maintenance, as opposed to growth at all costs, opens up room for the kinds of experimentation and play needed to imagine a more equitable future online.

Founder, Queer Digital History Project

Lecturer, Women’s and Gender Studies, Gonzaga University

As the social media landscape of the internet becomes increasingly embedded in day-to-day life, many contemporary thinkers and critics have lamented that the internet is broken. When Twitter and Facebook posts fuel widespread misinformation campaigns or inspire tumultuous market conditions, it might be difficult to recall the deeply intimate and personal roots of internet technologies.

Kevin Driscoll paints a vivid picture of those electric early days of networked computing, when a modem was a luxurious add-on and PC enthusiasts convened in homebrew clubs to discuss the latest microprocessor. Driscoll explains how the advent of the internet was really a collage of computer networking efforts rather than one seminal development by Department of Defense military researchers or the standardization of the internet protocol suite commonly known as TCP/IP. Most important to Driscoll’s internet history is the BBS, the bulletin board systems that facilitated widespread communication among tech hobbyists and amateurs alike. It was this fervor for BBSs, Driscoll explains, that allowed the “modem world” to flourish.

The 1970s and ’80s saw an increase in the adoption of personal computers, a shift from business to pleasure. What was previously found only in university research labs or government buildings was now available to citizen-consumers. Off-the-shelf computer kits made it easier than ever before to build an impressive piece of computing technology right in a person’s sitting room. Add to that the dropping price of modems and the increased exposure to computer networking through timeshare programs or hyperlocal BBS terminals, and the novelty of electronic communication became commonplace for those with the inclination and the means to pay for it. Driscoll shows how these developments snowballed through the late ’70s and early ’80s, paving the way for the BBS.

BBSs were an important, and distinct, precursor to the commercial internet and World Wide Web of the 1990s. Calling one computer to another had the added draw of being a one-to-one connection, an intimate sensation of dialing right into someone’s home. Even multiline systems, reliant on multiple phone lines, lent the cozy feeling of a cocktail party. In most narratives about the development of the internet, there’s a neat line from the ARPANET (the Advanced Research Projects Agency Network developed by the Defense Department) to the World Wide Web. These stories neglect the individual roots, and the personal touches, of a communally built public network like the modem world. Overlooking the history of the BBS presents the false notion that the reins of the internet have always been out of reach for the average computer-owner, better left to the Zuckerbergs and Bezoses of the world. In reality, computer networking technologies are historically a people’s technology.

Today, ubiquitous internet access is simply expected. Conditions brought on by COVID-19 pandemic lockdowns highlighted the need for high-speed at-home connections to facilitate schooling, work, and community connection. At the same time, the lockdowns brought inequities in the digital divide to the fore. As the internet morphs into increasingly partitioned spaces, funneling users between the same five mega-websites, it has become more urgent than ever to reexamine the stakes of internet ownership. When these extant structures seem inevitable, it’s helpful to remember how things got started—for people, by people.

PhD Candidate, University of California, Irvine

Kevin Driscoll insightfully notes that to reenvision the possibilities of the internet today, we need to recast its past. For Driscoll, that involves looking not to the mythologized narrative of ARPANET as a Cold War-era, US-built communications infrastructure for surviving nuclear war, created by eccentric computer “wizards.” Rather, we might look to the networked links of bulletin board systems (BBSs) connected by telephone lines that were at their peak popularity from the 1970s through the 1990s.

“Why haven’t our histories kept up?” Driscoll asks. It’s an important question. As a scholar curious about the narratives told about technology, I might phrase it differently: What are those mythical tales of the internet doing? What hangs in the balance when they are repeated? When public and scholarly discourse leaves out other narratives of the networked past, what’s at stake? In other words, what do our histories of the internet do?

Driscoll rightfully notes that internet histories such as the ARPANET mythology have effects: these stories represent and reentrench values and are used in turn to advance arguments (for better or worse) from public policy to corporate conduct. Origin stories like these often act as a sort of founding document, taken as a blueprint for how the rest of the story should unfold.

Dominant narratives also inevitably perpetuate notions of who belongs—in this case, who belongs in the realms of high technology. Popular images of computing’s foundations are largely mapped to subjects typically white, male, and American. But looking to the world of BBSs instead supports a vision of the internet that is less commercial, more community-based, and more representative of the variety of people who created the sociotechnical basis for the online world as it is experienced today.

Along with the question of what current histories of the internet do, there’s also the question of what they can do, especially when conceptualized outside of typical paradigms. Driscoll suggests that stories such as those of BBSs can help tell a more accurate backstory for the internet—and give a foundation for imagining a better future. In my own research, I have gotten lost in attempting to draw this line directly between the recast past and a better future.

Instead, I want to suggest that stories that look outside the ARPANET mythology can help us view more clearly the social and technical entanglements of the internet as they stand today. The “internet,” after all, is a useful if inexact shorthand for the social and technical, the virtual and physical, the governmental and corporate and grassroots layers of networked computing. That is to say, what the internet is, what its problems are, and how to go about solving those problems are notoriously tricky to grasp. It might be one reason we rely on the well-worn ARPANET story that focuses on a few machines and a few people.

Understanding the internet of today is partly why I have researched its history myself: the past can offer a less volatile scenario while highlighting contemporary aspects that otherwise appear natural, that fade into the background and thus appear to be unchangeable. Looking to the rise and fall of BBSs’ popularity, for instance, accentuates the commercial consolidation of the internet that has resulted in conglomerate platforms such as Facebook or Google or Amazon, companies that in turn exercise and reentrench their overwhelming political, cultural, and economic power. Recognizing these realities is maybe one of the best things internet histories can do, and perhaps the first step in drawing that line between the internet’s past and some possibilities of a more hopeful future.

Postdoctoral Fellow, Center on Digital Culture and Society

University of Pennsylvania

Innovation in Mentorship

“Academic Mentorship Needs a More Scientific Approach,” by Beronda L. Montgomery, Fátima Sancheznieto, and Maria Lund Dahlberg (Issues, Summer 2022), brings to light how well-intentioned but neglectful mentorship severely undermines how the United States provides training in science, technology, engineering, mathematics, and medicine—the STEMM fields. Importantly, the authors point out that “mentorship is largely an ad hoc activity, with institutions delegating responsibility to graduate training programs and the PIs of individual research groups. This entrenched, informal system revolves around each scientist’s individual commitment to mentorship and personal experience with past mentors.”

Indeed, this ad hoc, do-it-yourself approach serves neither faculty members nor students well; it perpetuates inconsistent and outright bad experiences, and it ultimately hurts the research enterprise altogether by undermining the well-being and creativity of the humans who drive it. Frankly, it amounts to institutional neglect.

However, we keep doing the same thing: creating mentorship training programs that, no matter their quality—and many are great—can be onerous to commit to. Absent institutional support, one’s ability to prioritize mentorship training inevitably competes with formal metrics of early-career success. In this sense, we operate in an academic ecosystem that inherently deprioritizes and disincentivizes mentorship.

We need to stop thinking about STEMM mentorship training as “if you build it, they will come.” Considering all the other pressures students, postdoctoral researchers, and early-career faculty members face, we need to build it, and then bring it to them—with institutional support that prioritizes, incentivizes, and rewards excellence in mentorship.

A typically overlooked focus is on teams. Thoughtful programming for research teams at universities would pay dividends. At Stanford, with generous support from the Shanahan Family Foundation, we are testing ideas—drawing from successful teams across diverse sectors (business, academia, and even sports)—to implement the practice of mentorship in a research team context.

Whole-team participation by STEMM labs can engender greater and more meaningful engagement with mentorship training and practice, while also lending itself to scale-up via institutional support, with team-focused programs serving researchers at different career stages simultaneously. For a faculty member or student to learn with one’s team is to amplify the opportunities for healthy mentorship alliances to blossom across a team at all levels. And it takes what the business consultant Patrick Lencioni, in The Five Dysfunctions of a Team, calls “vulnerable trust” for faculty and students to engage honestly and effectively.

Currently, academic researchers who become faculty members—or who go into industry—are trained on research. Period. But the hardest part of science isn’t the science; it’s the people stuff. It’s tragic that virtually no one in STEMM explicitly receives training either on how to lead a research group or how to be an effective trainee within a group. Leading a research team requires one to manage others; communicate effectively with diverse people; mediate conflicts; do budgets; set operational, research, and cultural expectations; and implement inclusive training practices. Being an effective trainee requires one to understand roles and responsibilities; grasp timelines and programmatic obligations; pursue good grades in classes; publish and present one’s research; hone effective oral and written communication skills, including self-advocacy; and give credit and encouragement to others.

The unique focus on the team—enrolling whole teams inclusive of faculty, students, postdocs, and staff—provides an opportunity for innovation in the approach to mentorship education for STEMM researchers. It enables reciprocal mentor-mentee relationships to develop through healthier team environments in which mentorship alliances throughout an organization can thrive.

Director, Strategic Program Development & Engagement

Associate Director, Center for STEMM Mentorship

Stanford University

Senior Research Scientist, Lab Manager

Associate Director, Center for STEMM Mentorship

Stanford University

Sanjiv Sam Gambhir Professor of Translational Medicine and Chemical Engineering

Faculty Director, Center for STEMM Mentorship

Stanford University

Mentoring has been a cornerstone of the scientific enterprise, as knowledge is transferred from one generation to the next. But as Beronda L. Montgomery, Fátima Sancheznieto, and Maria Lund Dahlberg describe, the traditional ad-hoc form of mentoring has limitations and consequences, and it is time to reimagine how mentoring is practiced.

Mentors who benefited from being mentored early in their careers usually adopt a similar, if not identical, approach; those who endured a toxic relationship vow to do the opposite. Either way, the approach is limiting because it lacks perspective, insight, and accountability. Mentors might try to transform their mentees into “mini-mes,” but this, as the authors describe, will only widen the diversity gap in science. Further, it discourages mentors from learning from each other—something especially needed when facing unfamiliar or extreme situations, such as mentoring during a pandemic.

The halls of science are littered with bad mentors who, as researchers have noted, often served up numerous forms of inadequate mentoring, both active and passive. Mentors might not even realize the negative impact of their practices: unanswered emails, mentees’ manuscripts left unreviewed for months, and mentees prevented from collaborating with others.

The authors advocate for a scientific approach to mentoring, underscored by collaboration, a hallmark of modern science. We can learn from other industries (after all, checklists in the operating room were copied from those that pilots use before takeoff). It is common in nearly every industry to utilize expertise from other sectors. Science should take a similar approach to mentoring.

The authors also recommend institutionalizing mentoring, making it a part of the promotion, tenure, and hiring practice. But some people may give the recommendation only lip service, and there is no means of holding them accountable for what they claim. Asking faculty to include a mentoring plan is a one-dimensional approach they can copy and paste among multiple applications.

Instead, consider asking faculty: What is the most challenging mentoring issue you faced this year? or What is the greatest achievement of one of your mentees, and what was your role in it? These questions would force mentors to think more deeply about their actions and impact, far more than a generic plan can accomplish. Furthermore, mentees’ needs vary, and this broader approach prevents a one-size-fits-all model.

To move mentoring beyond a haphazard, ad hoc practice with questionable impact and scalability, academic institutions should consider a more collaborative approach:

Mentoring teams. As the authors explain, one mentor cannot have all the answers to all questions. Having a diverse group of mentors from various generations and fields is pivotal for offering the needed array of career guidance and psychosocial support.

Community of practice. Mentors need the chance to learn from each other. Being one of many will enable them to share ideas, ask for guidance on how to deal with difficult situations, and find an expert in a particular field.

Recognize the bidirectional effect. While traditionally mentors have been senior faculty, mentoring is now recognized as working in all directions. Senior faculty can learn from junior members in the lab, and peers can learn from each other.

Just as science has evolved, so must our views and approaches to mentoring. To compete, be inclusive, and plug the leaky pipeline that is plaguing science, we must take a more scientific approach to mentoring.

Assistant Professor and Chief Learning Officer

Department of Anesthesiology

Weill Cornell Medicine

Author of The Success Factor

Reading Beronda L. Montgomery, Fátima Sancheznieto, and Maria Lund Dahlberg’s essay made me want to stand up and cheer. These authors address a critical issue that threatens aspiring scientists and the very future of science, namely the lack of systematic, data-driven approaches that place mentoring the next generation of scientists on an equal footing with other required activities such as publishing and obtaining research funding.

The authors are accurate in their assertion that with the current ad hoc approach, despite the critical importance of mentoring, this activity is neither rewarded nor incentivized. In fact, performing the labor of mentoring is not only not valued, but can, and often does, have a negative impact on the typical metrics used to measure “success” for scientists. Time and effort devoted to mentoring are inherently time and effort not spent writing a manuscript, crafting a grant proposal, or brainstorming about a new research idea. The situation is even more dire in that mentoring is absolutely essential for the continued success of the scientific enterprise, particularly to build a diverse and inclusive scientific community, but the burden of mentoring often falls disparately on a subset of individuals, typically those from the very groups that the community claims to want to lift up. Without systematic training in mentoring, even well-intentioned individuals who are willing to put in the work may not have the ability to maximize the potential and impact of their effort.

As the authors point out, there is ample research that defines mentoring best practices, but these substantial data have not been used to develop and implement mentoring programs at the institutional level, nor have they been adopted by granting agencies. In the interest of provoking action, the authors charge scientific leaders to institutionalize mentoring. For example, creating an institutional office of mentoring that provides training and measures compliance in a manner equivalent to that required for environmental health and safety would place mentoring on an even platform with other mandated activities. A major challenge is how to build in accountability. As the authors note, while mentor training can be implemented, required, or both, oversight and regulation pose a major challenge.

One of the major threats associated with poor or damaging mentoring is the continued failure to diversify the scientific research community. The authors raise the important point that scientists from groups that are historically excluded and underrepresented in STEM fields are more likely to be impacted by negative mentoring than those in majority groups. Thus, a lack of evidence-based and systematic approaches to mentoring works against a major stated goal in STEM, namely building an inclusive and diverse scientific community that is best poised to use creative approaches to tackle future scientific challenges.

Scientists apply innovative, evidence-based approaches to their research questions. These approaches need to be used in a similar manner to develop, implement, evaluate, and reward mentoring. Kudos to these authors for continuing this important conversation and suggesting actionable approaches to address this very real threat to the future of science.

Samuel C. Dobbs Professor of Biology and Senior Associate Dean for Research

Emory College of Arts and Sciences

A New Model for Philanthropy?

Two respected former leaders of the Defense Advanced Research Projects Agency, Regina Dugan and Kaigham J. Gabriel, have teamed up to lead a new philanthropy-funded, DARPA-like entity—Wellcome Leap. It is supported by the United Kingdom’s Wellcome Trust, an independent charity focused on health science. In “Changing the Business of Breakthroughs” (Issues, Summer 2022), Dugan and Gabriel propose this as a model for other DARPA-like entities to be funded by philanthropy.

Science policy theorists have long studied two innovation factors: research and development levels and directions, and the talent base behind that R&D. The first focus stems from work by the economist and Nobel laureate Robert Solow, who argued that the dominant factor in economic growth was “technological and related innovation.” The second stems from work by the economist and Nobel laureate Paul Romer, who argued that “human capital engaged in research” was the vital factor behind the R&D for technological advance. These can be considered two direct factors behind innovation (as opposed to a multitude of indirect factors). However, a third direct factor, innovation organization, is less understood and has received less scrutiny. Dugan and Gabriel are, in effect, arguing for its centrality, pressing a new approach upon us.

They suggest that an era of complex technologies makes innovation organization a necessity. The lone innovator in the garage never happened; complex innovation (as opposed to examples of discovery or invention) requires a collaborative process involving a mix of skills and disciplines, putting innovation organization at a premium. This is not a new reality; it has been true since Thomas Edison’s group at Menlo Park developed the incandescent lightbulb and proposed the electrical system behind it. But getting to the right innovation organization is a minefield, littered with many inadequate models.

Dugan and Gabriel focus on DARPA, a famously successful model they know well. It has focused on taking high risks for high rewards and on breakthroughs rather than incremental advances, and it relies on empowered program managers to find the best research groups to implement new technology visions. They cite former DARPA program manager Dan Wattendorf’s vision that led to a critical DARPA effort a decade ago to advance mRNA vaccine technology. The model has proven successful enough to spawn clones: ARPA-E (for energy technologies) and IARPA (for intelligence technologies). A new clone, ARPA-Health, is now in the offing.

However, governments have faced challenges in creating ARPAs. Within the Department of Homeland Security, HSARPA was well staffed at the outset by former DARPA program managers, but its departmental overseers never allowed it to operate independently, limiting its freedom of action. Other countries that have attempted an ARPA model have faced problems of locating it within an established agency, which can limit the needed entrepreneurial culture; of controlling the level of risk it can undertake; and of finding ways to link the ARPA to the scale-up efforts that must follow initial prototyping. Governments always have trouble with failure—with spending taxpayer dollars on high-risk ventures, whatever the potential rewards.

Could philanthropy be an alternative? Dugan and Gabriel suggest that it could face fewer of these restraints, citing their own Wellcome Leap effort. They argue that while governments must innovate within national borders, many technology answers, particularly in health, will be found by creating networks across borders, and philanthropy can operate internationally.

A potential problem for philanthropy is mustering the scale of funding needed. DARPA is a $3.8 billion a year agency. But how much funding do you need to make a difference? ARPA-E has shown that you can have a tenth of that funding level and spur important progress.

Also, philanthropy has been teaming up lately. Cooperation across leading foundations working on climate technologies is now widespread. Fast Grants has brought together some of Silicon Valley’s most successful—the Chan Zuckerberg Initiative, the Collinson brothers, Elon Musk, Schmidt Futures, Reid Hoffman, and others—collaborating to pool funding for projects such as a universal coronavirus vaccine.

Could there be too many DARPAs? In the 1940s IBM chairman and CEO Thomas Watson allegedly said there was a world market for about five computers. We’re now at about 2 billion and counting. Science has truly turned out to be an endless frontier that keeps building on itself; the more innovation there is, the more opportunities it creates. The DARPA innovation model has proven an unusually viable one; there seems no good reason not to bring on the clones.

Maybe we should encourage this.

Lecturer, Massachusetts Institute of Technology

Coauthor of five books on science and technology policy, including The DARPA Model for Transformative Technologies (Open Book Publishers, 2020)

As a former CEO and senior tech executive at companies such as Xerox PARC, Sun Microsystems, and Google, I have been a direct beneficiary of the DARPA model that Regina Dugan and Kaigham J. Gabriel describe. The Defense Advanced Research Projects Agency’s critical role in creating the internet is widely appreciated, but it also helped to enable many other technological revolutions, including design software for computer chips, handheld GPS receivers, speech recognition, and intelligent assistants. Google itself grew out of a DARPA-funded project at Stanford University on digital libraries. So I am a big believer in the DARPA approach of recruiting world-class technical program managers with a compelling vision, setting ambitious but measurable goals, and backing multidisciplinary teams to achieve those goals.

I am also delighted to see the growing support for the DARPA model, including the United Kingdom’s planned launch of the Advanced Research and Invention Agency (ARIA), funding from the US Congress for an ARPA for Health, and Wellcome Leap, led by Dugan and Gabriel. 

I’d like to pose three questions that, if addressed, could increase the impact of these and other future ARPAs.

What takes the place of Defense Department procurement for other ARPAs?

The original DARPA has benefited from the fact that Defense Department procurement will often create markets for the technology developed by DARPA’s R&D investments. What’s the equivalent of that for other ARPAs? Will market forces be sufficient to commercialize the results of ARPA-funded research programs, or will they get stuck in the “valley of death” between the lab and the market? One possibility to explore is what economists call “demand pull” (as opposed to “technology push”) approaches. For example, DARPA’s investment in the development of mRNA vaccines was complemented by Operation Warp Speed’s advance market commitment to purchase hundreds of millions of doses of a COVID-19 vaccine from Pfizer and Moderna.

What other pressing problems would benefit from a public or private ARPA?

For example, President Obama proposed creating an ARPA for Education, and the US House of Representatives recently provided funding for a National Center for Advanced Development in Education. What other economic, societal, and scientific challenges would benefit from an ARPA? What goals might these ARPAs set, and what are examples of projects they might support to achieve those goals?

What can we learn from the original DARPA, and what experiments should new ARPAs consider?

The original DARPA has been operating for more than six decades, and I think there is more we can learn by studying the different strategies used by DARPA program managers. For example, one highly successful DARPA program was called MOSIS, which provided shared access to semiconductor fabrication services to academic researchers and start-up companies. This accelerated the pace of innovation in microelectronics by providing access to an expensive resource, allowing more people to get involved in semiconductor design. There are dozens of these DARPA strategies that new ARPA program managers should learn from. New ARPAs should also take advantage of their ability to experiment with new models, such as Wellcome Leap’s Health Breakthrough Network.

A STEM Workforce Debate

As a labor economist and director of a research institute, I am often asked to make forecasts about economic conditions. To be honest, I often demur because to forecast the economy means that nine times out of ten you will be wrong. Forecasting occupation demand is even more fraught because a dynamic economy will end up creating and destroying jobs at such a rapid pace as to be highly unpredictable. With that background, I read Ron Hira’s “Is There Really a STEM Workforce Shortage?” (Issues, Summer 2022) with great interest. While I mostly agree with Hira’s approach, there are aspects of this question that deserve a more nuanced discussion.

I wholeheartedly agree with Hira’s critique of Bureau of Labor Statistics employment projections, and his conclusion that “technological disruptions, and their effects on employment, are notoriously difficult to predict.” Exhibit A is the impact of COVID-19 on the labor market. In February 2020, the United States had 152.5 million people employed, and by April 2020 over 20 million were out of work. It was only in August 2022 that employment finally exceeded February 2020 levels. These kinds of employment shocks are not anticipated and are difficult to incorporate in models. Like Hira, I recommend that people take employment projections with a grain of salt.

That said, the aftermath of the COVID-19 pandemic has created an unprecedented labor shortage. According to the Bureau of Labor Statistics, as of July 2022 there are two job openings for every unemployed worker. Recent data from the Indeed Hiring Lab suggest that the shortage of STEM workers is more acute than in other fields. Using Indeed’s dataset, which calculates the percentage change in job openings by selected occupations since the labor market peak of February 2020, job postings in STEM fields were up as of July 26, 2022, including for software engineers (93.7%), medical technicians (78.8%), and electrical engineers (81.1%). In contrast, jobs in business fields were up half as much—in insurance (60.9%), marketing (46.9%), and management (42.3%).

Finally, while the current shortage (or lack thereof) of STEM workers may be debatable, the lack of diversity in STEM occupations is not. According to data from the National Center for Science and Engineering Statistics, in 2019 only 29% of employed scientists and engineers were female and 15% were from historically underrepresented groups. The National Science Board’s Vision 2030 plan rightly focuses on the lack of diversity in STEM education and employment.

In the long run, unless all children receive access to a high-quality K-12 education, including sufficient coursework in mathematics and science that will prepare them to participate in STEM careers, demographic trends suggest that there will be fewer STEM workers. This lack of diversity may lead to a lack of discovery. A new study in the journal PNAS shows that gender diversity in research teams generates higher impact and more novel scientific discoveries, and the same has been found for ethnic diversity. STEM education is an investment in the nation’s economic future and should be available to all students regardless of race and gender.

Roy A. Roberts & Regents Distinguished Professor of Economics

Director, Institute for Policy & Social Research

University of Kansas

Research Associate, National Bureau of Economic Research

Ron Hira’s article is an exercise in the giving of good advice. He is correct to advise us to demand better data that paint a fuller picture of the STEM labor market’s nuanced realities. He is also correct that we should make more honest and responsible use of the data we already have. But the advice that strikes me most strongly is the exhortation Hira leaves unwritten. Like most good advice, it can be summarized succinctly: follow the money.

Demanding better data about the STEM workforce raises the question of why we don’t have better data already. Aspiring to more truthful STEM workforce debates leads one to ask who has an interest in keeping the debates mired exactly where they are. Hira argues that we are not suffering the national STEM worker shortage our national discourse assumes; the bulk of his text is a point-by-point dismantling of the misuses of data that sustain this mistaken view. But the question of who benefits from the prevailing view is the heart of his argument, the moral undercurrent supporting his data deep-dive.

Hira’s own answer to that question is clear. He suggests that official statistics on how many STEM jobs are offshored every year would be useful, for example—and then reminds us that both the National Academy of Engineering and Congress have sought exactly this data from federal agencies, only to be thwarted by business interests. Hira recounts Microsoft president Brad Smith’s misuse of unemployment data to suggest a worker shortage in computer-related occupations, when there was in fact a surplus. He unpacks wage data to show that contrary to the higher wages a true shortage would prompt, STEM wages have been largely stagnant for years as employers increasingly meet their STEM labor needs through lower-paid contractors and the abuse of guest-worker programs, rather than through higher pay, a more diverse talent pool, and better professional development.

The larger story, then, is about commercial interests “controlling the narrative” on the meaning and role of labor, to the detriment of workers. The STEM workforce debate is just one instance of this systemic problem. For decades, the US policymaking apparatus has given itself over to an economic orthodoxy that treats labor as merely one factor of production among many, which capital is free to reshuffle, discard, downsize, lay off, or underpay as may be required to juice the bottom line. My organization, American Compass, argues that we would do better to recognize that workers are cocreators of economic value rather than merely commodities to be purchased. Hira’s argument points in a similar direction, and reminds us that more informed and constructive STEM workforce discussions will require honesty about whose interests are being served.

Policy Director

American Compass

Long out of fashion, industrial policy has come back into vogue, amid bipartisan concerns over economic and military vulnerabilities in an intensifying sphere of global competition. Underlying much of this discussion is the fear that America lacks sufficient STEM talent to carry forward its legacy of technological innovation and to maintain its lead over China. In his article, Ron Hira raises important questions about whether such concerns are supported by the facts. 

Hira acknowledges that the limited availability of detailed data constrains more effective analysis of imbalances between the supply and demand of STEM talent. As he points out, traditional public data allow for analysis only at the aggregate level, and typically only through a sectoral lens. Just as in any field, STEM roles differ in the skills they require and, correspondingly, in the availability of needed talent, as illustrated in Hira’s article by the contrast between life scientists and software engineers. In the same way that a sectoral lens is insufficient to analyze labor shortages for specific STEM roles, looking only at categories of STEM roles is insufficient to analyze the availability of specific skills in demand in the market. There is no single “skills gap” in the market, but rather different gaps for different skills. Overall, conferred degrees and employer demand in what the Bureau of Labor Statistics refers to as “computing and mathematics” occupations may be in balance, but the pipeline for specific talent can still be severely anemic at the level of specific roles.

This is even more the case when we consider the question of whether existing programs of study are aligned to industry demand at the skill level. For example, while universities may be conferring more than enough STEM degrees to meet demand at the categorical level, these university programs may not be teaching enough of the specific skills that are required by industry, whether those be technical skills such as cloud architecture or soft skills like teamwork and collaboration. Significant gaps between skills taught and skills sought can be as problematic as broader imbalances—but less perceptible.

The assertion that supply and demand are in balance (or that the market is possibly even glutted) also depends on the notion that supply follows demand and not the other way around. There is an argument to be made that jobs follow talent in the knowledge economy. Rather than simply filling demand for STEM roles by entering the workforce, STEM graduates can also launch enterprises, create new products, or drive innovations that ultimately create greater demand for STEM skills. Although demand is never infinitely elastic, growing the strength of the STEM talent base is likely to stimulate demand correspondingly. Simply put, if America can reassert itself as a STEM talent hub, its innovation economy will grow, spurring further demand growth.

STEM is also a field with particularly high attrition—a phenomenon the economists David J. Deming and Kadeem L. Noray study in a recent analysis on “STEM Careers and the Changing Skill Requirements of Work.” According to their article, upon graduation, applied science majors enjoy a salary premium of 44% over their non-STEM peers. Ten years out, that shrinks to 14%. Because of the speed of skill replacement in STEM, STEM workers are less likely to enjoy an experience premium. By the time they have acquired significant on-the-job experience, many of the skills they acquired during their education are no longer seen as relevant. This high rate of skill replacement leads to a loss of the skill premium evident immediately after graduation. Accordingly, many ultimately leave STEM roles in order to continue their career progression. Given these defections, a straight demand-graduate analysis could understate gaps in the market, as assumptions about the number of new graduates needed to meet market demand must consider higher attrition of existing workers and not only new jobs created. 

Hira is correct that there is a need to revisit old assumptions. New, more granular, more timely data sources will afford decisionmakers a more precise awareness of the nature of current and emerging talent gaps and provide a more effective basis for action.

President, The Burning Glass Institute

Chairman, Lightcast

Visiting Fellow, Project of Work at the Harvard Kennedy School

Ron Hira asks whether there is really a STEM workforce shortage and, while noting differences by field, largely answers no.

I largely disagree, but also think that “shortage” is the wrong way to think about whether the United States has enough scientists and engineers. Markets tend to clear. There is neither a fixed number of positions for scientists and engineers in the labor force nor a fixed number of ways to use people with that training.

The better policy issues are whether we would benefit from more scientists and engineers and whether people receiving that training have rewarding careers. With a few exceptions, there is overwhelming evidence that the answer is yes to both. Drawing on data from the National Survey of College Graduates, the National Science Foundation Science and Engineering Indicators, and the US Bureau of Labor Statistics, we find:

  • 85% of recent science and engineering (S&E) graduates say their jobs are related to their degrees: 80% at the BS level and 97% at the PhD level, measured one to five years after degree.
  • Employment in S&E occupations by individuals with bachelor’s degrees and above grew by 39% between 2010 and 2019, more than five times the growth rate of the labor force.
  • Degree production in S&E fields grew slightly slower than growth in S&E occupational employment—by 38% at the bachelor’s degree level and 30% at the PhD level.
  • Unemployment rates in S&E are low. Average unemployment in 2021 was 2.4% for computer and mathematical occupations; 3.3% for architectural and engineering occupations; and 2.2% for life, physical, and social science occupations.
  • Pay is high for recent graduates in most S&E fields—and rising. For bachelor’s degree recipients one to five years after their degree, average salary in private industry was $61,242 in 2019. This ranged from $44,910 for physical science to $76,368 for computer and mathematical science. For recent PhD holders, average salary was $115,000.

Differences in our conclusions come from different treatment of occupation data. Counts of jobs in STEM occupations should not be compared with headcounts of degrees. Many people with bachelor’s-level STEM degrees pursue careers in other fields, such as law and medicine. Many new PhDs have student visas and may not want or be able to stay. Also, many S&E graduates who report that they are doing work related to their degree are not in formal S&E occupations.

I also disagree about the meaning of changes in occupational wages as an indicator of the labor market value of skills. If the average wage for PhD geoscientists in industry were to fall from $184,000, would that mean the skill is not in high demand? Would society be better off with fewer people with that skill?

Changes in occupational wages are not even a good measure of changes in the demand for skills—fast-growing occupations grow fast by bringing in people with less direct training, less education, and less experience. For this reason, the average wage rate often falls in fast-growing occupations.

There are many career-path issues worthy of policy concern, such as whether researchers in a particular field are too old when they receive their first independent grant, or whether older programmers have problems finding new jobs. But limiting the supply of talent, either by immigration rules or education policy, is a blunt policy tool that may have little effect on such issues. And it is probably not a good deal for the scientists and engineers who do remain. Rather than enhancing careers by reducing “competition,” such limits would likely push much R&D activity to leave the United States or not take place at all.

Senior Fellow

National Foundation for American Policy

Ron Hira’s article presents a valid challenge to the long-standing argument that there is a STEM workforce shortage in the United States and causes us to reconsider the premise of decades of STEM education policies and initiatives that are based on the “shortage” argument. Hira proposes that this argument has not only been unsubstantiated, but is based on often flawed, incomplete, and misinterpreted data.

As an African American female chemist and social scientist who comes from a low-income and first-generation background, I know the critical importance of broadening participation in STEM. In order for the United States to remain competitive in a global science and technology driven economy, we must engage all of our human capital—particularly those like myself who have been historically disenfranchised and discouraged from scientific pursuits. However, as a STEM policy adviser, I am also keenly aware that STEM policy is shaped not only by data, but by public sentiment, perception, and the loudest stakeholder voices. As Hira posits, voices such as those from students who are the target of these policies, and workers who are the end product of these efforts, are often excluded.

For the United States to remain competitive in a global science and technology driven economy, we must engage all of our human capital—particularly those like myself who have been historically disenfranchised and discouraged from scientific pursuits.

Hira lays out several factors that have influenced this notion of a STEM workforce shortage and how these factors have been based on limited data and the exclusion of dynamic processes and situational caveats. He correctly asserts that employment projections, which are based on current trends that are extrapolated out, may be applicable to occupations with stable trends, such as the legal field, but this method is inadequate for occupations that do not have stable trends, such as the computer sciences. Moreover, as Hira notes, the unemployment rate in STEM fields only seems low because it is compared with the national unemployment rate—a composite across all labor markets—and the evidence shows it is actually high; this miscalculation has created inaccurate estimations. Errors of this type can lead to regressive impacts on progressive efforts related to inclusive outreach, recruiting, and hiring policies as organizations rely on these rates, projections, and forecasts to formulate staffing budgets.

Overall, Hira presents a sound, documented argument that the decades-long perception of a STEM workforce shortage in the United States is based on unsubstantiated evidence and flawed data and is often driven by stakeholders who do not necessarily advocate for current or future US STEM workers. Hira lays the foundation for a real and transformative conversation not only about the validity of a STEM workforce shortage, but more importantly about the implications for policy and for current and prospective US STEM professionals. This needed conversation would be greatly enhanced by examining the layered, multifaceted, and cross-sectional factors at play, informed by disaggregated data on the persistent unemployment, underemployment, wage disparities, and barriers that affect minoritized groups, such as BIPOC individuals and persons with disabilities, who are an untapped source of US STEM talent.

Founder and Executive Director

Wagstaff STEM Solutions

The C Word: Artists, Scientists, and Patients Respond to Cancer

Max Dean, The Gross Clinic, 2016. Image courtesy of Max Dean.

After being diagnosed with prostate cancer on his sixty-second birthday, Canadian multidisciplinary artist Max Dean began to explore his prognosis through his art practice. Striving to visualize the physical and psychological manifestations of his disease, Dean employed the help of animatronic figures from the Wilderness Adventure Ride at Ontario Place, an abandoned theme park in Toronto. Deeply inspired since college by Thomas Eakins’s 1875 painting Portrait of Dr. Samuel D. Gross (The Gross Clinic), which depicts Gross performing surgery on a patient’s thigh, Dean staged an operation on the ride’s moose—exploring the interrelated themes of time, aging, and illness. His process was documented by filmmaker Katherine Knight in Still Max, which premiered at the Hot Docs film festival in 2021.

A clip from the documentary is included in the exhibition The C Word: Artists, Scientists, and Patients Respond to Cancer, which provides a platform for discussing the role of art in negotiating and reimagining humanity’s complicated relationship with cancer and the process of healing. The exhibit, which opened at 850 Phoenix Biomedical Campus on April 21, 2022, is curated by Pamela Winfrey. It represents the first five years of the Arizona Cancer Evolution Center’s Art Program, a residency program that embeds artists in research labs within Arizona State University’s Biodesign Institute.

Cleaning Up Our Mess in Space

In “A Montreal Protocol for Space Junk?” (Issues, Spring 2022), Stephen J. Garber and Lisa Ruth Rand correctly recognize the challenges of pursuing remediation, the removal of space debris, despite the technology’s obvious usefulness. Remediation alone is difficult to incentivize. Despite lowered costs to access space, the incentive to remove debris remains outweighed by the cost of a dedicated remediation mission.

An alternative approach to remediation is to focus on the creative combination of multiple objectives in a single mission. The companies Northrop Grumman and Intelsat recently accomplished two satellite-servicing missions to extend the operational life of, and reposition, Intelsat satellites. Similarly, NASA is developing a spacecraft called OSAM-1 (short for On-orbit Servicing, Assembly, and Manufacturing 1) that is designed to test on-orbit refueling of satellites. OSAM-1 (formerly called Restore-L) is intended to refuel the Earth-observing satellite Landsat-7, both to extend its mission and to demonstrate a repair capability.

Remediation alone is difficult to incentivize. Despite lowered costs to access space, the incentive to remove debris remains outweighed by the cost of a dedicated remediation mission.

Mission extension is a significant driver toward servicing a satellite. There is a common tension between using limited fuel for mission extension and removing the satellite from orbit within 25 years of mission completion (known as the “25-year rule”). Private satellite operators are held accountable by regulators for meeting the 25-year rule. However, operators of public goods such as NASA’s satellites are pressured to maximize the utility of their highly valued and well-utilized science missions, particularly when a replacement satellite is delayed.

The current culture focused on near-term science does not necessarily align with the concept of timely disposal. A combined mission extension and disposal mission may offer a solution to this tension. In the case of Landsat, retaining operational continuity is key to achieving science objectives. Thus, a comfortable overlap between the operational Landsat and the developing replacement Landsat mission is often desired.

Technologies for remediation, or satellite servicing, have potential applications in the public and private sectors as well as the civilian and defense sectors. However, no one entity wants to get stuck with the bill for developing a service and sustaining that service. Creative public-private partnerships that meet the needs of nongovernment entities, rather than bespoke solutions, may serve well in this situation. In this manner, the government can encourage the development of an industrial base and be one of many customers.

It is important to remember that remediation is part of a multipronged approach. Debris mitigation continues to serve space sustainability well, but has limitations. Unplanned incidents on orbit will inevitably occur. Having an alternative solution available to support those unexpected accidents is a valuable addition to the suite of technologies that will support space sustainability.

Aerospace Engineer

NASA

Stephen J. Garber and Lisa Ruth Rand make the argument that orbital debris is a form of pollution and that it is constructive to examine past efforts to address global pollution. The authors logically turn to a successful international treaty, the Montreal Protocol on Substances That Deplete the Ozone Layer, adopted in 1987.

How successful is the Montreal Protocol? The United Nations Environment Program recently reported that signatory countries, or “Parties,” have phased out 98% of ozone-depleting substances globally compared with 1990 levels. Without the Protocol, ozone depletion would have increased tenfold by 2050. On a human scale, this would have resulted in millions of additional cases of melanoma, other cancers, and eye cataracts.

The authors highlight lessons learned from the Montreal Protocol that could apply to the planet’s burgeoning space debris problem, including:

  • Developing consensus on the existence of the problem;
  • Emphasizing government-led international collaboration to find solutions;
  • Devising incentives or financial assistance (“carrots”) for developing countries and punitive measures (“sticks”) for developed countries;
  • Evolving regulatory flexibility to align with new discoveries; and
  • Emphasizing the risks posed by inaction.

Emphasizing the risks of inaction is key. What would happen if Earth’s orbital regime reaches a point of no return? While there appears to be a consensus that orbital debris proliferation is a problem, we need to do more to broaden awareness. The value of space extends to all countries. Even those countries without operational satellites will benefit from space services such as increased connectivity, geolocation capabilities, and access to satellite imagery.

What would happen if Earth’s orbital regime reaches a point of no return? While there appears to be a consensus that orbital debris proliferation is a problem, we need to do more to broaden awareness.

Over 50 countries now own and operate space assets. However, equity in space assets is not distributed evenly. Citigroup recently estimated that the space economy would generate over $1 trillion in annual sales by 2040, up from around $370 billion in 2020, but wealthier spacefaring countries are the most invested and stand to benefit the most from the expanding space economy.

Thirty-five years ago, the architects of the Montreal Protocol navigated a dire situation—a thinning ozone layer—and planned a course of action that addressed a diverse range of stakeholders with varying degrees of resources. Now the planet is facing a dangerously congested orbital environment. But the financial consequences will not be felt equally across the planet. The higher-income world has more “skin in the game,” or equity in space-based assets, and therefore more to lose if a worsening space debris cascade threatens the long-term viability of satellites. Following the spirit of the Montreal Protocol, more affluent spacefaring countries should lead while the smaller, less-invested countries should be given incentives to follow as debris mitigation policies continue to take shape.

As the ecologist Garrett Hardin noted over 50 years ago, “Freedom in a commons brings ruin to all.” If we are on the brink of a tragedy of the commons in space, now is the time for the space sector to learn from successful international cooperation efforts—and the Montreal Protocol provides a shining example.

Space Policy Economist and Technology Strategist

The Aerospace Corporation

Stephen J. Garber and Lisa Ruth Rand provide a well-supported article on the extent of space debris and the potential application of the Montreal Protocol, an international agreement for controlling terrestrial pollution, to controlling the risk of collisions with debris in space. They eloquently identify the three key aspects of controlling the debris population: debris mitigation, space traffic management, and debris remediation. I would like to focus on a specific attribute of the Montreal Protocol the authors discuss, and on the most critical, and difficult, means of managing debris growth: debris remediation.

An investment by the government entities that are responsible for the decades of debris deposition on orbit is needed.

Debris remediation is primarily the act of removing massive derelict objects (e.g., abandoned rocket bodies and nonoperational payloads) from orbit to eliminate the possibility of future massive debris-generating collision events. A paper completed in 2019 by 19 scientists from around the world identified the top 50 statistically most concerning objects in low Earth orbit (LEO). Leading the list were 18 SL-16 rocket bodies launched by Russia primarily in the 1990s and left in a tight 40-kilometer-wide band centered around 840 km altitude, where they routinely cross orbital paths with each other, debris from the 2007 Chinese antisatellite test, and defunct US payloads and debris related to their demise. This combination has created a uniquely bad neighborhood in LEO. These objects were deposited primarily by the three major space powers—the United States, China, and the Russian Federation—before the turn of the century. For perspective, if two SL-16s were to collide, this event could singlehandedly double the debris population in LEO (i.e., add up to 15,000 large fragments).

Garber and Rand call for leadership. Indeed, leadership is critical to control the debris population in LEO and catalyze the debris remediation industry. Just as government investment catalyzed the now largely commercial fields of space-based Earth imagery, global space-based communications, and satellite launch, the development and deployment of debris remediation solutions cannot be borne solely by emerging commercial ventures. Rather, an investment by the government entities that are responsible for the decades of debris deposition on orbit is needed.

Senior Technical Fellow

LeoLabs

Understanding Noise in Human Judgments

It was a pleasure to read the interview with Daniel Kahneman, “Try to Design an Approach to Making a Judgment” (Issues, Spring 2022). Kahneman is a world leader in research on attention and decisionmaking, among his many other contributions to psychology and economics. His interview expresses that expertise and is very helpful in understanding the extent and danger of variability in human judgment.

However, there seems to me to be too strong an implication in the interview that we would be better off if everyone came to roughly the same decisions. In the cited case of insurance actuaries, a 10% variance seems tolerable, not the 50% actually found. The assumption that variability is bad is clarified somewhat in the interview as Kahneman discusses how it may aid creative problem solving by allowing diverse opinions.

To view variability as inherently bad seems to me a judgment error of the type Kahneman has discovered and identified in other situations. Even in the justice system, the effort to impose common minimum sentences for crimes might have reduced variability and increased equality, but it also has had some very bad consequences in filling the prison system.

To view variability as inherently bad seems to me a judgment error of the type Kahneman has discovered and identified in other situations.

If we all thought more or less alike, it seems it would be a more just world—but as a species would we be better off? It seems at first that in some issues where the “correct decision” has expert consensus, such as in vaccination for COVID-19 or climate change, we would be. However, in evolutionary biology variation is celebrated as insurance against some common flaw annihilating the whole species. I have been planting conifers along the river that flows through my forest. They are mixed conifers, as I was warned that planting all fir trees might be bad because a single predator or disease might cause them all to die.

Nearly all scientists say society needs to quit using fossil fuels because of the atmospheric warming it causes. However, can they really be sure that no unknown planetary adaptation might reduce the warming effect? Even though vaccination for COVID-19 has been successful, it is still possible that long-term harm from vaccination will come to some people. In many cases, expert opinion can vary greatly partly because the decision (as in the case of insurance actuaries) depends upon many somewhat separate facets. Kahneman suggests that it might be better to have multiple experts independently rate the importance of the many factors involved so that we could be more certain that all factors will be taken into consideration.

Kahneman has identified an important aspect of human decisionmaking and points also to our lack of awareness of human variability. However, Kahneman and biology tell us that variability can be a good thing for the species at least in some cases, even when we think it unfair.

Professor of Psychology, Emeritus

University of Oregon

I have a reinforcing story to tell relating to the Issues interview with Daniel Kahneman. Among many interesting observations, Kahneman points out how unreliable job interview judgments are, largely due to cognitive shortcuts and biases—what he calls “noise”—that shape, and sometimes misshape, human decisions.

While a member of a large scientific R&D institution (IBM Research) for 25 years, I had the opportunity (and job requirement) of interviewing many dozens of candidates for PhD-level research positions. At one point in my career, I had to move offices and clean out my file cabinets.

Kahneman points out how unreliable job interview judgments are, largely due to cognitive shortcuts and biases—what he calls “noise”—that shape, and sometimes misshape, human decisions.

Coming across 15 years’ worth of my records of recommendations based on these job interviews, I was able to make a subjective evaluation of my own opinions, since a good many of the candidates had started their careers at my own institution or at other places where I was able to follow their progress. Aside from the outliers, the very best and the worst, I was humbled by the randomness of my decisions: there was little correlation between my interview judgments and the candidates’ subsequent success. I even misjudged a future Nobel laureate.

As a result, I did change my interview style to a more structured format, and I claim some subjective improvement, though mostly not with PhD-level candidates.

Making Space for Technology Governance

In “Imagining Governance of Emerging Technologies” (Issues, Spring 2022), Debra Mathews, Rachel Fabi, and Anaeze Offodile II outline the approach of the National Academy of Medicine’s Committee on Emerging Science, Technology, and Innovation (CESTI) for anticipating broad societal impacts from emerging technologies prior to their large-scale deployment and use. In full disclosure, I was involved in developing the figure, “Principles for the Governance of Emerging Technologies,” used in the article. I helped draft its first iteration for CESTI review and further development. I believe it provides a useful guide for the more holistic assessment of emerging technologies, their potential societal impacts, and procedural and substantive policy dimensions for technological governance. I am also impressed by CESTI’s use of narratives and scenarios to explore impacts and normative dimensions upstream of technology design, deployment, and use. I commend the committee for its efforts.

Unfortunately, as a society, we are far behind in the use of such diagrams and approaches for responsible research on and innovation of emerging technologies, many of which are accompanied by significant uncertainties about their impacts and important normative questions. The case the authors describe in their article, transcranial direct current stimulation, which is used in medicine and for enhancement without US regulatory approval or governance of its social and ethical issues, illustrates current inadequacies in technology governance. The elephant in the room is, how can we remedy this deficit? To that question, Mathews and coauthors offer limited discussion. Models such as those proposed by CESTI are prevalent in the social science, science and technology studies, and policy literatures, but they will not have impact until policy spaces to implement them exist.

Models such as those proposed by CESTI are prevalent in the social science, science and technology studies, and policy literatures, but they will not have impact until policy spaces to implement them exist.

In my observations, the root cause for our inattention to the ethical and societal dimensions of emerging technologies, as well as the lack of policy spaces in which to consider them, is the lack of political will. We live in a society that is technologically optimistic, dominated by a capitalistic paradigm for technology governance. The predominant narrative is that social good is equivalent to technology development and adoption. With technology development comes capital, jobs, and economic growth. Regulations for safety, let alone considering the social and ethical dimensions or engaging diverse publics, are seen as getting in the way of these primary goals. Power for decisionmaking is concentrated in the industries developing the technology and regulators whose hands are tied by limited legal authorities and pressure from the US Congress to not overregulate (which in turn comes from industry lobbying). Technology governance takes place at the stage of regulation, and largely (almost exclusively) between the product developer and these constrained regulators.

Currently, spaces for the broader analysis and governance proposed by CESTI, and reported by Mathews and coauthors, are woefully lacking. It is imperative that these policy spaces be independently run; include voices of diverse and affected publics; and come with teeth—the ability to constrain or enable technologies—in order to execute the vision for more robust, responsible, and equitable futures with technology. We should turn our attention toward the creation of those spaces, including ways to overcome the political and economic forces, power structures, and strong techno-optimistic narratives that prevent their existence. Yet this is no easy task.

Goodnight-NCGSK Foundation Distinguished Professor in the Social Sciences

Codirector, Genetic Engineering and Society Center

North Carolina State University

Debra Mathews, Rachel Fabi, and Anaeze Offodile II outline a systematic methodology developed by the National Academy of Medicine to inform a novel governance framework for disruptive technologies. The authors note that “fostering the socially beneficial development of such technologies while also mitigating risks will require a governance ecosystem that cuts across sectors and disciplinary silos and solicits and addresses the concerns of many stakeholders.”

The governance framework, however, does not adequately address the role of risk mitigation in the governance process. I propose that the Academy’s Committee on Emerging Science, Technology, and Innovation include risk mitigation in its next round of study of policy tools for governing emerging technologies, so that innovation risks are identified and managed to ensure high-quality and safe patient care.

Advances in patient care involve a learning curve with new potential sources of harm and unintended consequences. For example, data used to “train” technology enabled by artificial intelligence may not be sufficiently diverse, resulting in algorithmic bias that adversely affects certain patients. Risk assessment and mitigation thus should begin in the innovation sandbox and continue through each stage of the product lifecycle to identify, analyze, and control risks. A health care organization’s innovation risk appetite (the amount of risk an organization is willing to accept to achieve its objectives) and risk tolerance (the acceptable deviation from the organization’s risk appetite) should be incorporated into its enterprise risk management program. Since the introduction of emerging technologies presents strategic, operational, and patient safety exposures to the health system, innovation risk also should be included in the governing board’s risk agenda consistent with the board’s oversight responsibility.

Advances in patient care involve a learning curve with new potential sources of harm and unintended consequences.

Critical risks relating to emerging technologies are many and varied, including:

  • Lack of integration with the patient’s electronic health record resulting in gaps in clinician documentation that could negatively affect diagnosis and treatment decisions;
  • Injury to patients and medical malpractice liability exposure resulting from an evolving standard of care;
  • Problems with the accuracy, integrity, and/or completeness of data or information utilized in the development of technologies;
  • Vulnerabilities that may result in data privacy breaches and/or security incidents;
  • Failure to appropriately determine data ownership, rights, and permitted uses in codeveloped intellectual property that adversely affects a codeveloper’s right to use, share, or sell the technology or information generated by it;
  • Disproportionate allocation of contractual rights, responsibilities, and liabilities among all stakeholders throughout the entire development and deployment lifecycle;
  • Inadequate insurance coverage for a product’s deficiencies when used by the organization during the development period and when marketed to third parties;
  • Violations of federal and/or state fraud and abuse laws, such as when the technology could influence a provider’s referral decisions;
  • Uncertain and inconsistent legal and regulatory requirements that may result in litigation, imposition of monetary penalties, or administrative agency action;
  • Financial risk due to unclear reimbursement rules and policies that may affect achievement of economic objectives; and 
  • Reputational risk, such as ethical violations.

Since the dynamic nature and accelerated pace of tech-driven innovation carries inherent risks throughout the entire product lifecycle, it is prudent to include risk mitigation as a policy tool for the governance of emerging technologies.

Corporate Counsel

MaineHealth

A Fresh Look at Power Thieves

One of my most disturbing childhood memories of growing up in Peru is of the blackouts (apagones). Provoked by the Maoist group Shining Path as part of its strategy to seize power, the sudden lack of electricity marked (and shaped) the lives of an entire generation of Peruvians, who had to learn how to manage and repair everyday objects while protecting them from damage once power returned. This close interaction with electricity (as well as its absence) in the 1980s was on my mind while reading Diana Montaño’s fascinating account, “Electrifying Agents and the Power Thieves of Mexico City” (Issues, Spring 2022), of electricity thieves and the rise of an infrastructure in Mexico City a century ago.

Montaño has chosen an unusual path to understanding the complex relationship between individuals and systems: studying those who transgressed the law by procuring access to electric light. By focusing on marginal subjects, the author sheds new light on technological systems. They are no longer abstract “black boxes,” but a set of artifacts that users (and potential users) aimed to understand, intervene in, and incorporate into their daily lives. One way to do this was, essentially, to circumvent the formal requirements demanded by the company and gain direct access to this source of energy and prestige.

The sudden lack of electricity marked (and shaped) the lives of an entire generation of Peruvians, who had to learn how to manage and repair everyday objects while protecting them from damage once power returned.

As the essay reveals, the expansion of electricity in Mexico City and elsewhere ignited the emergence of new practices. Stealing electricity was one of them, a grassroots expertise that also involved technicians and police agents. Montaño makes an excellent point in suggesting that thieves should be considered part of the human network created by this innovation. Why not? After all, thieves (and hackers, by extension) belong to a long lineage of underrepresented actors and deserve particular attention beyond the legal/illegal judicial dichotomy.

The role of “users” in the coproduction of technoscientific knowledge constituted a crucial debate a few years ago in the field of science and technology studies, and particularly in the SCOT (Social Construction of Technology) approach. Even though the debate seems to be over, it is crucial to return to these foundational exchanges and discuss them with new approaches and case studies, such as Montaño’s own book, Electrifying Mexico, along with other works on the history of electricity, such as the historian Willie Hiatt’s current project on Peruvian apagones, or anthropologist Antina von Schnitzler’s book, Democracy’s Infrastructure: Techno-Politics and Protest after Apartheid, on local resistance to prepaid meters in South Africa. Hence, “stealing” is just another action from a broader repertoire of “moral economy” invoked by those who reclaimed their own right to gain access to certain technologies or to make them available to a larger group.

Most recently, historians of technology have made an important effort to incorporate narratives from overlooked groups as well as from areas beyond the Global North. In doing this, the field has gained a better and more nuanced perspective on how infrastructure interacted with human agents in the past and the legacies of such interventions. Only by expanding our repertoire of actors, places, and practices will we reach a comprehensive understanding of the multiple meanings of technology and the impact it has, and continues to have, on ordinary individuals.

Assistant Professor

Pontificia Universidad Católica de Chile

Diana Montaño highlights the importance of ordinary people’s experiences, practices, and expectations in the making of what she terms the electricscape. By showing how people not only made sense of new energy technology as it arrived (in a top-down model of technological diffusion), but also worked—and made trouble—with it in ways that were socially and culturally specific to Mexico City, Montaño reminds us to attend to a wider range of sites in which energy systems are shaped, extended, contested, and sabotaged.

Of particular interest is her focus on the ladrones de luz (power thieves). In her engaging narration of the problems raised by electric theft, Montaño points us to transgression as an entry point for understanding how energy systems function, and where they break down. The ladrones de luz show us how the theft of electricity has profoundly shaped the way we use, regulate, police, and sell energy. What Montaño identifies as an electrical script is not a product of consensus or cohesion, but rather can be read for conflicts, disagreements, and disparities of power. The electrical script defined by the power companies was not only defied by the residents of Mexico City; Montaño also alerts us to the fact that capitalinos (authorized and unauthorized users) developed their own electrical script. The process of negotiating that gap shaped everyday practices and produced new legal precedents. The cases remind us how novel the problems presented by new kinds of energy can be.

What impact did the contestation over the electricscape of Mexico City, including the struggle to set the boundary between licit and illicit use of electricity, have beyond Mexico?

Montaño’s account is filled with the rich texture of everyday life of the Mexican capital, but its insights reach far beyond the history of the city, or even the history of Mexico. At the heart of Montaño’s research is a challenge to historians of the countries that exported electric technology, finance capital, and expertise—places such as the United States and Canada—to recognize that technological diffusion was not a one-way process, that capitalinos did not simply accept electricity as it was presented to them, but made it their own. What impact did the contestation over the electricscape of Mexico City, including the struggle to set the boundary between licit and illicit use of electricity, have beyond Mexico? What would it look like to bring this history into studies of the exporting countries? We might look to the processes of investment, scientific and engineering collaboration, and management to follow the impact on the companies. Equally, we might trace how migrants took their electrical scripts with them, from Mexico to elsewhere, or from elsewhere to Mexico City.

Finally, I would like to briefly comment on the contemporary implications of this research. If we recognize that the legal regimes governing the distribution and appropriate use of electricity—including the definition of electric theft—grew from earlier moments of energy transition, we should consider how a similar process might play out with the efforts to decarbonize the world’s energy systems and what kinds of reimagination decarbonization might demand. How, for example, would decentralization and democratization of energy production and distribution reshape our electrical scripts, including around questions of theft? Are concepts of property and theft even the best ones we might use to pursue fairness and justice in the electricscape yet to come?

Assistant Professor

Georgetown University in Qatar

In hurricane-prone areas, everyone checks their electric power supplies before a hurricane’s landfall. Recently, days before Hurricane Ian, Floridians like me scrambled to assess how much electricity our families would need for water pumping and filtering and to run electronics, among other jobs. Then we charged our power banks, batteries, and solar panels, preparing for days without power. People today are acutely aware that access to electricity is an essential part of modern life, and some may die without it. So I sympathize with Tomás Sánchez, owner of the San Antonio mill in Diana Montaño’s history, who said in court records that he needed steady access to electricity to operate his factory or it would perish—although he did not always wish to pay for this access.

Montaño’s engaging story of power thieves is a reminder that the invisible marriage of protons (positive charge) and electrons (negative charge) changed lifestyles around the globe. Beginning in the late nineteenth century, Latin American leaders directly funded or granted permission to private companies to build electric power plants as part of a modernization drive. The aesthetic of a civilized and orderly society was to have electric streetlights and lighted buildings. Electricity was vital in infrastructure, industrial development, and military preparedness. And it implied rising social status, offering life-changing opportunities to individuals who could afford a monthly contract to light their homes during the night and dawn hours. By the early twentieth century, as the scholars Abigail Harrison Moore and Graeme Gooday describe, electricity was common in public areas and designers invented decorative electricity “to produce a new style that would meet both the technological and aesthetic demands of what the Art Journal (December 1901) referred to as ‘the greatest revolution to antiquated customs and appliances … electricity.’”

The aesthetic of a civilized and orderly society was to have electric streetlights and lighted buildings. Electricity was vital in infrastructure, industrial development, and military preparedness.

By the mid-twentieth century, the Mexican power thieves were not alone in stealing electricity from public lines. For instance, in Quarto de Despejo (1960), Carolina Maria de Jesus, a woman living in a São Paulo favela (shantytown) with her three children, mentions having electricity hooked up to her shack. Having light permitted her to write, read, and relax in the evenings. By the 1970s, gatos (illegal wire hookups) became commonplace in the Rocinha favela in Rio de Janeiro, with the donos do morro (owners of the hill) supplying utility services to residents, including through the illegal hookups, as reported in RioOnWatch.

As other discussants have suggested about Montaño’s case study of ladrones de luz, electricity could be widely available worldwide. However, institutions create unfair and inequitable systems of distribution, making it costly to access electricity, even though the knowledge required to generate and transfer it is basic. Montaño shows that historically, consumers have never been passive actors; they ingeniously discover ways to retrieve needed energy for low or no cost. Since the first use of electricity in homes, people have continuously created new uses for it or expanded on its original designs. People today can live “off the grid,” using little more than a relatively inexpensive foldable solar panel system and portable power station. Like the historical characters, these individuals have made electricity their own.

Associate Professor

University of Central Florida

Matchmaking Challenges for Researchers and Policymakers

Interaction between policymakers and researchers is important for informed decisionmaking. But more often than not, as Adam Seth Levine lays out in “Unmet Desire” (Issues, Spring 2022), such linkages are rare. Truly, as Levine points out, policymakers have an unmet desire for science. Establishing these relationships requires deliberate effort and, we would add, time. Two- and four-year terms of office are common across federal, state, and local government elections, thus creating unpredictability in the time and effort required to build strong, professional bonds between researchers and policymakers.

We have long felt that the Vannevar Bush federal-centric view of science policy overlooks real, transformative opportunities at the state level (see “Science for Hyper-Partisan Times,” Issues, Spring 2021). Levine takes this concept a step further and provides examples of both needs and opportunities within local governments. In doing so, he provides a not-too-subtle reminder that small governments oversee almost $2 trillion in annual spending, and their policies can have a large impact far outside the DC Beltway.

Regardless of the policy portfolio, Levine’s overarching thesis revolves around the challenge of matchmaking. Researchers often don’t know the policy needs (or who to contact in the policymaking space to identify those needs), and policymakers likely aren’t aware of available data or expertise (or who to contact at a local college or university to have these needs met). While it is important for researchers to find potential opportunities to be in service to their community, state, and country, the onus should not fall solely on them. Policymakers and their staff should also proactively seek out experts (and their data) who can assist them in their policymaking goals. The matchmaker examples that Levine offers, including his nonprofit Research4Impact, certainly fill a need.

Researchers often don’t know the policy needs (or who to contact in the policymaking space to identify those needs), and policymakers likely aren’t aware of available data or expertise (or who to contact at a local college or university to have these needs met).

Also noteworthy are the survey data Levine presents. Almost half of local policymakers responded with concern that a researcher might be pushing a political agenda. This alarming statistic is worthy of future exploration to better understand a key, if not the key, component of policymaking: trust, or the lack thereof, between policymakers and the academic research community. Levine’s observations have already prompted discussions within our Collaboratory team here at UNC Chapel Hill about standing up a survey instrument to address similar questions at the state level and better understand opportunities and barriers to connecting university expertise and talent to serve the people and policymakers of North Carolina.

We should not be shy about addressing the elephant in the room. Nor the donkey. Without acknowledging the challenges of the country’s current (and hopefully transitory) hyper-partisan division, we will be unable to work through them and find solutions to pressing issues we all face. Levine had it right. Matchmaking is an incredibly important role that we should all be willing to undertake to meet the unmet desires of policymakers wishing to engage with research experts at all levels of government, while keeping in mind the old adage, “All politics is local.”

Executive Director

Research Director

North Carolina Collaboratory

The University of North Carolina at Chapel Hill

COVID-19 caught policymakers and the scientific community flat-footed. This inability to iterate at scale between the need for urgent policy decisions and fluid bodies of scientific research was to some degree unsurprising, as Adam Seth Levine illustrates, given how little experience both sides had in collaboratively working toward solutions to rapidly emerging dilemmas such as the pandemic.

Some of the frictions at the science/policy interface, however, were more predictable and therefore frustrating to observe as scientists worked frantically toward an understanding of the virus, how it affected different populations, and the efficacy of different public health measures to curb its spread.

First, COVID presented the scientific community with an unprecedented demand for real-time policy advice. This also created an unenviable catch-22 situation in which scientists were asked to inform policy (or correct misinformation) with a highly fluid scientific evidence base that they knew they would have to revise over time. In a number of cases, scientists let themselves be dragged into short-term battles over specific policy questions or pieces of misinformation, using scientific evidence that turned out to be unreliable or even wrong later on.

What lessons (if any) has the scientific community learned from the pandemic about how to better navigate policy interfaces in the future?

Second, the pandemic forced science to operate under a level of public and journalistic scrutiny that it was utterly unprepared for. Normally, many of science’s routine processes of sorting through competing evidence, eliminating dead-end strands of research, and agreeing on reliable models of explaining the world play out at disciplinary meetings, in academic journals, and in other corners of the ivory tower. During COVID, society turned its spotlight on science, with scientists, pundits, and policymakers litigating evidence and disagreements on social media, the pages of the New York Times, and cable news shows.

This was complicated by a third reality that science was slow to recognize during the pandemic. In its attempt to keep up with the virus and its variants, research moved at breakneck speed. In the process, accelerated peer review, rushed preprints, and an unwillingness to wait for replications obliterated many of the guardrails and speed bumps that typically guide us toward a reliable evidence base. Missteps, as a result, were predictably inevitable.

So what lessons (if any) has the scientific community learned from the pandemic about how to better navigate policy interfaces in the future? Unfortunately, we seem to show a very limited appetite for engaging in critical self-reflection. Instead, many in our community have reverted to evidence-agnostic deficit thinking that blames science-policy disconnects during the pandemic on public audiences for not trusting science enough, being misinformed, or not understanding the scientific process. Of course, institutional trust and shared, evidence-based understandings of the world are crucial foundations of healthy, enlightened democracies, especially during disruptive crises such as COVID. But so is a scientific enterprise that constantly learns from its own missteps, both in terms of how it does science and how it responds to the rapidly changing demands that policy and public audiences will make of it in our post-COVID and pre-next-pandemic world.

Taylor-Bascom Chair in Science Communication

Vilas Distinguished Achievement Professor

University of Wisconsin-Madison

Adam Seth Levine describes his recent national survey that examines whether local policymakers have an “unmet desire” to collaborate with researchers in their areas. The survey found that many county and city officials would indeed like to have more informal “collaborative exchanges with local researchers to discuss scientific evidence relevant to policy challenges they are facing.” If these engagements are going to scale—which Levine’s research suggests could happen—public and private research funders will have an important role to play.

As part of The Pew Charitable Trusts’ evidence project, I work with many of these funders while helping facilitate the Transforming Evidence Funders Network (TEFN). In my experience, grantmakers’ efforts tend to shift the informal exchanges Levine describes to formal collaborations. That said, the tone, incentives, and expectations for both formal and informal engagement among researchers and groups outside academia are set, in part, by grantmaking practices.

If these engagements are going to scale—which Levine’s research suggests could happen—public and private research funders will have an important role to play.

TEFN participants have identified a range of field-tested, promising practices that funders can use to support “engaged research”—research that results from scientists collaborating with policy, practice, or community groups to create and use evidence. Grantmakers who support these efforts often focus on deepening the relationships among collaborators and making engagement more routine. These practices are consistent with Levine’s call to develop incentives and matchmaking structures that enable collaboration. For example, funders can:

  • Provide financing and other types of support for nonresearchers to ensure that they have the time, skills, and resources needed to meaningfully engage in projects.
  • Embed matchmakers—a term Levine used to mean boundary-spanners or expert intermediaries—to facilitate connections, blend different types of expertise, and help manage power dynamics between researchers and their nonresearch partners.
  • Allow time for engagement while research questions are being refined, so that nonresearchers can provide on-the-ground perspectives that inform project priorities.
  • Encourage or even require routine engagement between partners over the course of a project.
  • Include nonresearchers in review panels to ensure that diverse perspectives are valued in the early stages of the grantmaking process.
  • Train researchers and nonresearch partners to support the variety of impacts their collaborations can generate.

As we continue to invest in these approaches, we should be mindful of how inequalities persist in academia; still, I’m hopeful that supporting these joint efforts can help mitigate existing inequities. Many engaged research efforts weave together multiple types of expertise by diversifying who has access to and can participate in the research process. This collaborative approach can give people who have been excluded from, or harmed by, the academic system the opportunity to shape research around their interests.

Investment in engaged research alone, however, will not ensure equity. We need to recognize, learn from, and provide resources to researchers who are committed to engaging with people and organizations outside of academia. It’s also important to remember that many women and scholars of color have prioritized these collaborative exchanges without the support of their academic institutions—even in the face of outright disapproval from their colleagues. Funders and other leaders in academia should seek feedback from such scholars who have built trusting relationships outside of universities and research centers. As we create support systems that address local policymakers’—and others’—unmet desire for engagement with researchers, we should ask ourselves: Who is being rewarded for these partnerships? And how can we build equity into and through these collaborations?

Principal Associate

The Pew Charitable Trusts’ evidence project

Stop Being Alchemists!

In “Opening Up to Open Science” (Issues, Spring 2022), Chelle Gentemann, Christopher Erdmann, and Caitlin Kroeger make a convincing argument for more open science. I commend their comprehensive overview of the benefits of open science while addressing common fears such as research being “scooped.” Here, I will expand on several aspects of open science that might be useful for a more holistic understanding.

The authors insightfully quote the Hippocratic Oath, in which physicians swear to “gladly share such knowledge as is mine with those who are to follow.” Observe that this statement is not preconditioned on any practical benefits of sharing knowledge. Indeed, we should recognize that the pursuit of knowledge is fundamentally iterative, where we always build on what came before. Even without practical benefits (though there are many), we have a responsibility to share our discoveries for those who follow. Open science embodies this responsibility and is a necessary condition for doing better science.

It is also instructive to examine the decades-long open-source movement, which advocates for the fundamental freedoms to use, study, build upon, and share. These freedoms align with the inherent motivation for open science. To paraphrase author and journalist Cory Doctorow, who has written extensively on this topic: the difference between alchemists and scientists is that alchemists kept what they knew a secret. They didn’t advance the art, and each one learned in the hardest possible way that drinking mercury is a bad idea.

We have a responsibility to share our discoveries for those who follow. Open science embodies this responsibility and is a necessary condition for doing better science.

Open science influences both outputs and processes, and is not limited to sharing papers, methods, data, or code. We should think imaginatively about the full breadth of outputs from the scientific enterprise. For example, in “Bringing Open Source to the Global Lab Bench” (Issues, Winter 2022), Julieta Arancio and Shannon Dosemagen illustrated the pivotal role of open-source hardware in science. Other valuable outputs may include educational or outreach materials, lab notebooks, meeting minutes, or even social media posts. Regarding processes, the UNESCO Recommendation on Open Science emphasizes the need for “dialogue with other knowledge systems [and] knowledge holders beyond the traditional scientific community.” This can be seen in the growing field of citizen science, where institutional researchers collaborate with diverse stakeholders to use science for advancing goals of mutual interest instead of merely “using” free labor to crowdsource data collection.

Gentemann and coauthors state that we “must change the game—the structure, the policies, and the criteria for success.” I wholeheartedly agree, but caution that academics are already overworked and underpaid. To bring them on board, proposed open science practices must fit into a comprehensive reimagining of research policies and institutions so that they will not be yet another box that a scientist must tick for professional advancement. In addition, a long-term goal should be engagement with legislators to reform copyright and patent laws, which, by default, criminalize sharing and stifle innovation.

We live in challenging times, with growing mistrust in science and monopolization of the entire research lifecycle by a handful of publishers with a closed-by-default approach to science. This peril is amplified by a desperate need for rapid innovation to overcome global crises such as climate change or pandemics. We do not have time, nor can we afford, to be alchemists. Let us work together to expand the circle of liberty for research and innovation.

Community Councilor, Gathering for Open Science Hardware

Cofounder, MammalWeb citizen science project

Research Associate in the Open!Next project, University of Bath, United Kingdom

The Pomological Watercolor Collection

In the late nineteenth and early twentieth centuries, healthy orchards and groves were seen as crucial to national prosperity. As people across the country created new cultivars, or varieties, through hybridization, the US Department of Agriculture (USDA) set out to document them with a national register of fruits. Between 1886 and 1942, USDA’s Division of Pomology commissioned artists to create illustrations of thousands of fruits and nuts. A historic botanical resource, the Pomological Watercolor Collection, which is housed at the National Agricultural Library in Beltsville, Maryland, contains 7,497 watercolor paintings, 87 line drawings, and 79 wax models created by approximately 21 artists. 

As people across the country created new cultivars, or varieties, through hybridization, the US Department of Agriculture set out to document them with a national register of fruits.

To ensure color accuracy, watercolor—rather than photography—was the preferred medium. These technically precise paintings were used to create lithographs illustrating USDA bulletins, yearbooks, and other series distributed to growers and gardeners across America.

The illustrations realistically portrayed fruit in all conditions: the immaculate, the bruised, and the decaying. These watercolors, most of which were painted by women, tell the story of agriculture at the turn of the twentieth century and provide a visual time capsule of many fruit varieties that no longer exist.

Mary Daisy Arnold, Elberta, 1936.

James Marion Shull, Crawford, 1909.

A Critical Opportunity for Philanthropy

In “A Global Movement for Engaged Research” (Issues, Spring 2022), Angela Bednarek and Vivian Tseng capture well philanthropy’s need to prioritize building the evidence base, infrastructure, and incentives for engaged research—to, in their words, “spur a new vision of science … in direct collaboration with the rest of society.” The challenges we face span areas of expertise in science and society and are laden with complexities, uncertainties, values, and high stakes for all. Science is a crucial part of solution building, but not a sole solution.

One key area that the authors emphasize is the need to support boundary spanners holistically, reinventing systems and structures to appropriately value and encourage their diverse expertise and efforts, their impact in practice, and the new career paths they are forging. We have ample evidence from science communication, participatory research, and other bodies of engaged research that people who can traverse diverse communities and fields of practice are essential in communication and relationship-building.

Yet as the authors outline, there are many reasons why society lacks the boundary spanners needed. These barriers often reinforce each other and are rooted in the fact that boundary spanners challenge the status quo. Cultural and material incentives discourage interdisciplinary and engaged research in the sciences. To create networks of support and opportunities, there often is a need to develop new language and change institutional perspectives—often while battling the effects of systemic racism and other structural inequities, as Bednarek and Tseng highlight.

We are losing vital people, capability, and energy.

We have ample evidence from science communication, participatory research, and other bodies of engaged research that people who can traverse diverse communities and fields of practice are essential in communication and relationship-building.

Researchers focused on science, technology, communication, and their relationship to democracy and inequality have long demonstrated that without boundary spanners, scientific consensus doesn’t translate into actionable or complete answers to the civic question, in the political philosopher Peter Levine’s phrasing, “What should we do?” We are now in a powerful moment ripe for experimentation around the myriad ways to answer such questions and create an active culture of civic science—engaging boundary-spanning research to help solve the most pressing and persistent societal problems.

As Bednarek and Tseng demonstrate, using as examples the Transforming Evidence Funders Network and the Transforming Evidence Network, philanthropy has a critical opportunity to catalyze and cocreate this cultural shift in partnership with communities, not only through providing monetary resources but by serving as connectors and civic investors seeking returns in public goods. We celebrate early adopters and advocates of the emerging movement for engaged research and hope more readers are inspired and called to action.

Across many connected efforts, we acknowledge that it will take long-term commitments to see a fundamental shift toward engaged research take root—one that can generate compounding solutions to address shared challenges. Supporting boundary spanners, the next generation of leaders working to bridge civic and science spaces, is at the core of the Civic Science Fellows Program, a still-growing collaboration that was launched in 2020 and now engages partners across philanthropy, academia, media, policy, and engaged research (including the Rita Allen Foundation and some Issues contributors). These early-career Civic Science Fellows and partners have an essential role in helping achieve these long-term changes.

The clear case for engaged research and a culture of civic science has been made. This is the moment to take up the call to action together.

President and Chief Executive Officer

Rita Allen Foundation

Protecting and Empowering Workers

Over recent decades, rapid technological development and automation have been fueling widespread concerns about job security. While many people fear that artificial intelligence may cause mass unemployment, recent studies show that this does not need to be the case, and that the increased demand that new technologies trigger may translate into greater employment—although likely not of the same types as the jobs affected by technological change. In “Stories to Work By” (Issues, Spring 2022), William E. Spriggs highlights how the concerns triggered by narratives of technological inevitability often limit the tools and efforts available to promote equality and opportunity. Spriggs describes recent changes in technology as a missed opportunity for empowerment that has instead resulted in further inequity and the creation of norms that continue to disadvantage workers. Too often, society has ignored this issue by blaming it on the technology itself. It is not the technology, but the regulatory frameworks that we choose to follow that determine these outcomes.

The stories that Spriggs recounts of the railroad workers and telephone operators reveal the underlying dynamics that shaped the impacts of these technologies on workers. Spriggs highlights how it was the institutions, rather than skill or talent, that determined the future of the workers. Factors such as race and gender frequently played a large part too. Throughout history, dangerous narratives have fundamentally overlooked the role of institutions and governments in perpetuating labor inequality and have been used to prevent steps from being taken to overcome key concerns. To promote equality and empowerment in the face of technological developments, we must critically examine the regulatory frameworks we apply. As AI continues to develop more rapidly than ever, we are at a key moment to reframe core narratives and issues—and that is what UNESCO is doing through our Recommendation on the Ethics of AI.

Narratives have fundamentally overlooked the role of institutions and governments in perpetuating labor inequality and have been used to prevent steps from being taken to overcome key concerns. 

The ample literature that exists about technological paradigms highlights that the current technological wave is not unique in the way it impacts labor markets. When it comes to the importance of regulation, lessons must be taken from stories throughout history, such as those told by Spriggs. The best times, even in terms of economic growth records, were times when social protection systems were established and labor rights were strengthened. We need to put technology back at the service of people and societies, not the other way around. Today we are being confronted with yet another instance in which labor rights are being eroded. This is not only because of technologies, but because of a race to the bottom in which salaries, worker protections, and even environmental concerns are often sacrificed in the name of flexibility, economic development, and other supposedly “superior” goals.

The UNESCO AI Recommendation centers on the defense of human rights and fair outcomes with regard to AI. It looks at the changing world of work and calls for economic and fiscal policies that will help AI applications and business models contribute to economic performance, social empowerment, and inclusion. In the context of huge global inequalities, exacerbated by the COVID pandemic, these policies are a must, or else technological developments will widen and perpetuate existing disparities. Through implementing regulations that protect and empower workers, we can foster positive innovation and growth and begin to change many of the toxic narratives that cast technologies and people as being at odds.

Assistant Director-General for Social and Human Sciences

UNESCO

Deliberating Autonomous Weapons

For those of us who have spent many years in diplomacy at the United Nations, Stuart Russell’s account of his involvement in the autonomous weapons systems (AWS) policy discussion, “Banning Lethal Autonomous Weapons: An Education” (Issues, Spring 2022), was a poignant reminder of the difficulties of achieving results in multilateral deliberations.

As UN High Representative for Disarmament, I met with Christof Heyns shortly after the 2013 publication of his report on lethal autonomous robotics and the protection of life, and subsequently brought it to the attention of representatives of member states. Heyns was a Human Rights Special Rapporteur, and reports such as his are not staple reading material for arms control officials. My interventions stimulated interest. But we needed a “home” for the discussions and found it in the Geneva-based Convention on Certain Conventional Weapons—generally known as CCW, for simplicity’s sake. CCW is the umbrella convention for five protocols, one of which deals with blinding laser weapons, which were outlawed before being fully developed. It was a perfect fit, everyone thought.

Open-ended working groups (in which any member state can participate) convened between 2014 and 2016. Few government delegates were familiar with the issue of AWS. Sharing information and substantive briefings were important to acquaint diplomats with the issues, and Russell participated, explained autonomy, and patiently answered many questions. Another briefer, from the British artificial intelligence company DeepMind, also took part in the proceedings, but became disillusioned when he was given only a 30-minute time slot: not worth the travel to Geneva, was his feedback.

Where are we now, nine years after the discussions started? Following the activities of the Open-Ended Working Group, its name (and format) was changed in 2016 to a “Group of Governmental Experts,” a diplomatic construct that allows adoption of reports and documents only by consensus, essentially blocking the will of the majority by giving a veto power to every participating state.

The limited time in AWS meetings was spent debating key definitions, compliance with international humanitarian law, the relevance of ethical principles, the need for human command, and whether nonbinding principles and practices would suffice or whether legally binding rules were needed. These differences are nowhere near resolution, and states have coalesced around political positions. There are those trying to table a proposal that could find consensus, such as a normative and operational framework. Others propose a political declaration. Over 30 countries have called for a total global ban on AWS.

When a meeting in December 2021 could not agree on a way forward, a group of 23 states delivered a statement highlighting the urgency of an outcome, stating that “in order for the CCW to remain a viable forum to address the challenges posed by LAWS [lethal autonomous weapons systems] its deliberations must result in a substantive outcome commensurate with the urgency of the issue.” With a total of 10 days of meetings scheduled for 2022, it is doubtful that the plea for urgency will be heeded.

Russell sets out the complexities of AWS and shares his education in diplomacy. It should be mandatory reading for AI scientists, diplomats, advocacy groups, and the general public. More years spent debating this issue may not yield the desired result; public pressure and advocacy, however, may.

Vice President, International Institute for Peace, Vienna

Former Under-Secretary-General of the United Nations

Stuart Russell’s writings on the problems of aligning artificial intelligence with human values and regulating autonomous weapons systems have had a seminal influence on me and many others. I was thus glad to read about his “education” in the difficulties of effective arms control in the area.

I wish him every success in these endeavors. I also want to suggest an alternative approach to governing autonomous weapons. As Russell notes, no agreement has emerged from almost a decade of meetings under the United Nations Convention on Certain Conventional Weapons. None will in the foreseeable future. Yet these meetings and the civil society groups that draw attention to the topic may nevertheless have had some positive effect. They help create a moral sanction against using such weapons.

If effective versions of these technologies one day spread widely, however, rivals will employ them, and norms against their use will likely lose their power. So it makes sense to consider ways to prevent these technologies—particularly advanced, effective versions of them—from spreading. Just as in the realm of nuclear weapons, and increasingly across a range of technologies, it makes sense to consider the strategy of nonproliferation first, then norms governing use second.

Nonproliferation can be more successful than an outright ban because the major powers do not, as a rule, agree to give up development of militarily important technologies for which they have no substitutes. In the case of the Biological Weapons Convention, for instance, major powers were willing to sign it because they had another technology that was viewed explicitly as more effective in the same tactical-strategic niche: namely, nuclear weapons. Thus, when the Soviet Union violated the treaty with a major biological weapons program, the security of other signatories was not too significantly impacted. Treaty verification mechanisms might seem to be the solution, but major military powers have been willing to allow only less invasive mechanisms and to trust in verification only when the consequences of failure to detect treaty violations are relatively insignificant.

In the case of autonomous weapons, the military effectiveness of future generations of the technology is unknown, and it appears likely that it will eventually perform functions that no other current technology can. Thus, major powers will not give them up. Even minor powers will not give them up if they are worried that their rivals will not.

The result is that a mutually supporting nonproliferation regime and norm of use is the sort of arms control that might be made to work, just as in the nuclear case. There, norms around nuclear weapons culminated in the Treaty on the Prohibition of Nuclear Weapons. Yet this treaty and those norms might not exist without the nonproliferation regime. They are mutually supporting.

Nonproliferation is not simple, and it is not the ideal. It would require significant focus on the part of major powers, including security guarantees for countries that give up a means of defense. It might not work, for a variety of reasons. But for a variety of other reasons, it might be worth trying.

Associate Professor of Political Science

University of California, Los Angeles

Strategic Modeling Team Lead

Centre for the Governance of AI

Bonus Episode! A Historic Opportunity for US Innovation

This summer, Congress is trying to reconcile the differences between two massive bills focused on strengthening US competitiveness and spurring innovation: the House-passed America COMPETES Act and the Senate-passed United States Innovation and Competition Act (USICA). In this episode, we speak with Mitch Ambrose from FYI, the American Institute of Physics’ science policy news service, about the historic conference aimed at reconciling the House and Senate bills. What are the competing visions for US competitiveness in the bills? How do the details get worked out, and what happens if Congress fails to reach an agreement?

Resources

Transcript

Josh Trapani: Welcome to The Ongoing Transformation, a podcast from Issues in Science and Technology. Issues is a quarterly journal published by the National Academies of Sciences, Engineering, and Medicine and Arizona State University. I’m Josh Trapani, senior editor at Issues. Today, I’m joined by Mitch Ambrose from FYI, the American Institute of Physics’ science policy news service, whose newsletters and tools for tracking science policy budgets and legislation are key assets to the science policy community.

On this episode, I’ll talk with Mitch about the ongoing conference of members of Congress to reconcile differences between the Senate-passed USICA and the House-passed COMPETES Act, which he and FYI have been tracking closely. These acts, both passed with bipartisan support, are aimed at increasing American technological and industrial competitiveness, but have key differences that have led to a long reconciliation process. Welcome, Mitch; we’re happy to have you back with us.

Mitch Ambrose: Great to be here.

Trapani: So each of these pieces of legislation, the Senate-passed United States Innovation and Competition Act, or USICA, and the House-passed America Creating Opportunities, Preeminence in Technology and Economic Strength Act, or the COMPETES Act, is thousands of pages long. And there are quite a number of differences between them. Senator Maria Cantwell, the chair of the Senate Commerce Committee, called the conference historic in her opening statement because it’s one of the largest conference committees in the last 10 years for a bill that is not annual must-pass legislation. So clearly what’s going on is a big deal. But before we start getting into more detail, I just wanted to start with a bigger question: you’ve been paying close attention to the conference and everything that’s been leading up to it. Has there been anything about this that’s really surprised you so far?

Ambrose: Yeah. It’s been quite a fascinating story to follow these last few years, and it really starts way back in November of 2019, when the then minority leader in the Senate, Chuck Schumer, floated this idea of really ramping up investment in science and technology, specifically through adding a new arm to the National Science Foundation. That started with this thing called the Endless Frontier Act. And it has been a very long and winding road since then. These bills have grown, as you said, thousands of pages long, and they’ve come to encompass most policy areas, I would say. Not quite all, but they’ve touched so many committees that the original Endless Frontier Act was renamed and very much expanded into this thing called USICA.

And it touches foreign relations policy, trade policy, supply chain issues, and many additional areas of research beyond those originally contemplated in the bill. It’s taken a couple of years to get here, and now it’s the final stretch, in a sense. Whether they’ll be able to come to a compromise remains to be seen. There’s all sorts of interesting fissures that don’t necessarily play out along party lines, and House-Senate dynamics at play as well. There are just so many storylines we can tell about the debates that are playing out, and we can get into some of the biggest ones in this conversation.

Trapani: I wanted to make sure that before we go too far, we take a step back and know that some listeners will be following this very closely. Others, just more casually. Others, not so much. And in discussions about this, there are a lot of buzzwords and concepts that get bandied about. People talk about competition, innovation, there’s a lot of talk about China. And my question for you is, so what does this really mean when we get down to brass tacks? What are some of the main objectives of these bills? What are they trying to accomplish?

Ambrose: Yeah. So I’d say there’s a distinct flavor to each; the Senate has a different approach than the House. The Senate bill very much comes back to this vision from Senate majority leader Chuck Schumer, who teamed up with a Republican senator, Todd Young. They were essentially concerned about the US falling behind China in certain strategic technology areas: things like AI, robotics, advanced manufacturing, and 5G telecommunications. And they got together and said, “Okay, let’s go big on an idea for really focusing federal research funding towards those areas.”

And it framed the whole thing very explicitly in terms of this competition with China over what people like to call “industries of the future.” Whereas the House, especially the House Democrats, have been very reluctant to have the sole purpose of this bill be geopolitical competition, essentially. The House Science Committee in particular, on a bipartisan basis, came up with a different framing: in addition to wanting to be competitive with China in all these technology areas, they’re careful to say, “Well, it’s not just China that we’re competing with. There’s all sorts of other countries.”

But beyond the competitive dynamic, they’re also very much stressing that there are other reasons the government wants to do science, and baked into their proposal for ramping up the budget of the National Science Foundation, they also talk about grand societal challenges: things like tackling climate change and social inequities. And I think part of that is the view that there’s all sorts of people who are not necessarily going to be drawn to science by this geopolitical competition; they’re going to want to solve various other problems in their backyard, for instance. And so they’ve adopted, very intentionally, a broader framing.

Now, I think one of the big debates playing out is: does that dilute the effort, in a sense? The Senate’s proposal picks out 10 key technology areas and really tries to organize the US research system, to a degree, around those areas, whereas the House is much less prescriptive. And this gets back to: what are we trying to achieve? Is this all just about economic competitiveness, or is it also about things like tackling these bigger societal challenges? I think that’s very much unresolved at the moment.

Trapani: Yeah, no, that’s really interesting. And you can see how that would add a lot of complexity to the process, especially given that this is a rare occurrence in the legislative sphere. So from a process perspective, then, when we talk about two bills being conferenced, what is it that we are talking about?

Ambrose: This is a procedure that Congress does fairly rarely, at least in recent history, where one chamber will pass its version of the bill, the other chamber passes its version, and then they formally agree to appoint some number of people from each side who will formally represent the House and the Senate in negotiations. In this case, it’s over a hundred people, a significant fraction of the total membership of Congress. But it’s not like everyone’s around some big table at once speaking. A lot of it is delegated to the staff level, and in particular, members have been appointed because they’re on certain committees. So the Science Committee’s going to be the lead on the science provisions, obviously. And there’s a certain number of members and their staffs that are given just a piece of the bill to hash out among themselves.

And what we’ve heard is that the staff have been instructed to work out as much as they can, and then things that can’t be resolved get kicked up to the level of the members of Congress themselves. And we can talk about some of the issues likely being hashed out at that higher level, because they’re really thorny. At this stage, it’s all pretty much happening behind closed doors. They did have one big kickoff meeting, where a lot of the people who had been appointed to this committee showed up to talk about their priorities. And I was there that day, and it was one of the most fascinating congressional meetings I’ve ever been in—probably the most fascinating—where they did have a big round table, but people were coming in and out. It was almost a day-long meeting.

It started with the science folks, and they each made a little statement. A lot of it was just rehashing positions they had already stated previously, but just laying down a marker. And then through the day, different crowds of policy people from different areas would circle in and out of the room. The foreign policy folks showed up, then the tax policy people showed up. It was this huge game of musical chairs; there was just this great turnover of people in the room. But that’s the only big public meeting that, as far as I’m aware, they’re going to have, and probably the rest of it will be just behind-closed-doors negotiations.

Trapani: I would like to get your perspective on what you see as some of the main tensions that are going on under the surface as they do this work.

Ambrose: Yeah. So one main one is this philosophical difference as to what should be the end goals of US science policy: should we really just be zeroing in on this competitive dynamic, or should we take a broader view? Another big tension point at play goes by the umbrella term research security. There’s a lot of lawmakers in Congress—on a bipartisan basis, but it’s particularly strong in the Republican caucus—with a concern that, OK, the government’s spending all this money on science, and we want to make sure the benefits of that accrue to the US, at least primarily. And there’s a lot of concern that the Chinese government might take advantage of the US research system. The US research system is very open by design, and there’s tons of benefits to that. But over the past few years, momentum has been building around the concern that there’s a lack of reciprocity between the US system and the Chinese system, which is much more closed, and around what should be done to prevent the US system from being taken advantage of.

And there’s a whole host of provisions in both bills that would create restrictions around people who participate in what are known as talent recruitment programs, where you work in the US but are also getting paid by a university in China to spend some of your time over there. Congress has come to take a very negative view of those things. Participating in one is not illegal in itself, but it’s being viewed as a channel for unwanted technology transfers. And so both bills have provisions saying, essentially, that if you’re getting money from the US government, you would be prohibited from participating in those types of programs, at least ones run by certain designated countries. And there’s a big debate about how broadly or narrowly to define that.

Because it’s pretty common in science: all sorts of countries create these funding schemes to encourage scientists to do work in their country, either recruiting them away completely or just partially. And so there’s been a push to define what constitutes problematic behavior in international science collaborations. One of the big tension points I’ll tee up now is that there’s a lot of interest in Congress, from both sides, in distributing research money across the US much more broadly than it has been in the past. Currently, it’s 10 or so states that just completely dominate the winning of grant awards from the National Science Foundation and other agencies. And there’s certain people, especially in the Senate, that would like to see that money more broadly distributed. I do think that’s one of the main sticking points in the legislation right now.

Trapani: That’s interesting. Can you say anything about where the administration is on this? Are there things they would like to see? Are they just waiting to see anything? Where are they?

Ambrose: Yeah, the administration is very much pushing for this legislation. President Biden has talked about it a fair amount and really is pushing on Congress to send them something. And the administration’s biggest priority, probably, in this whole thing is funding for the semiconductor sector. Of course, in the pandemic, everyone’s become very acutely aware of supply chain issues. It’s affecting so many different types of products, but particularly for chip manufacturing, there is a special concern. The birth of the semiconductor industry was here, and the US used to have a much bigger role in manufacturing them. But over time, that’s slipped away to other countries, primarily in Asia, especially Taiwan. And this again, of course, links back to the dynamic with China, where there’s concern that if China were to ever take back Taiwan, what would that mean for the chip sector?

And for many reasons beyond that, there’s a desire to have a domestic manufacturing base, and that funding is in both bills, the House and Senate versions. So this isn’t so much a tension point; people are very much on board with really scaling up support for domestic semiconductor manufacturing. It’s just that finding the right vehicle to get that passed is the challenge right now. And so both bills have about $50 billion worth of money; it’s a combination of incentives for semiconductor companies to build manufacturing plants in the US, which is a very expensive proposition. And of the total, about $10 billion is for research. That’s directed toward the next wave of chips, making sure the US would be positioned to play the leading role in developing those. But I think there’s a fear that if they were to just pass that by itself, then there wouldn’t be enough support for these other things. So there’s an incentive to not just do that as a standalone thing, such that they can get a bigger package done.

Trapani: Yeah. It’s a strategic move to put some impetus behind some of the other pieces that otherwise might not move. I have just one more question about the dynamics in the field. When a lot of people think about Congress these days, partisanship is front of mind. You haven’t mentioned a lot about partisanship, but I was wondering if you could talk about what, if any, partisan dynamic might be driving the conference negotiations.

Ambrose: Yeah, there’s starting to be more talk of that, especially as we’ve got an election coming up in just a few months here in the US. And the conventional wisdom is that in the lead-up to elections, that can really get in the way of deals being done. Because, for instance, if passing this thing is viewed as a big win for the Biden administration, the Republicans might not be inclined to help Biden get a big win before the election.

But at the end of the day, to pass this type of legislation, you need 60 votes in the Senate, and then there’s still going to be Biden in the White House for at least a few more years. And so you’re going to need Democratic support. Even if the Republicans were to take back both sides of Congress, they’re going to need to negotiate with Democrats, especially in the Senate, and with the administration. So would they really have all that much more leverage at the end of the day? It’s hard to say.

Trapani: What’s interesting is even as you were talking about the partisan dynamic, you were talking about timing and political considerations, but you didn’t really emphasize a very different vision between the two parties of where US science should go, or a lot of substantive policy differences. It’s meaningful that maybe they’re not all that far apart on a lot of these things.

Ambrose: Yeah. I think that there are some partisan tensions; for instance, on research security, the Republicans are generally interested in doing much more on that front than the Democrats are. But then you have other parts of policy where it doesn’t break down on party lines, it breaks down on House-Senate lines. For instance, this debate about expanding the geography of federal funding, where the Senate is very interested in that in part because (sorry, this is going to get wonky) there’s this thing known as the EPSCoR program. For a collection of states that don’t get above a certain amount of federal research funding each year, there’s a special pot of money that is only available to those states to compete for. And it’s a way of building their capacity to do research. But critics of the program argue that 1) it goes against the principle of merit-based competition, and 2) there’s all sorts of institutions that would benefit from that type of research capacity support that happen to be in states that don’t qualify.

So you have a state like California, which gets an enormous amount of research money and does not qualify for this special program, but there are underserved institutions in California that would benefit from a leg up in research. And the House, in its version of the legislation, has various proposals to help provide support to those types of institutions, emerging research institutions, regardless of where they’re located. Whereas the Senate has this framework about really scaling up EPSCoR states. It’s about 30 or so states, and that is pretty contentious. And there’s lots of senators from both parties that are supportive of that concept, because eligibility is tied to state-level criteria. So they have an incentive. Basically, all the people who have signed on in support of that provision are from EPSCoR states, and all the people who are against it are not from EPSCoR states. So you could say it’s a very parochial debate that’s playing out, in that sense.

Trapani: Thank you for bringing that up. EPSCoR is wonky, but I was thinking, unlike Bruno, we need to talk about EPSCoR if we’re going to talk about this bill. And this proposal that you mentioned was for 20% of NSF’s budget to go to EPSCoR, whereas right now it’s a very small percentage, maybe 2 or 3%. And there’s been significant pushback on this from exactly the kinds of people that you were talking about, who may not be in EPSCoR states, but who are saying there are a lot of different types of institutions that need to have a fair shake here.

It’s interesting, though, that in both cases there is a desire to increase the geographic or institutional diversity of where the federal research money goes. And it is interesting that the objective is shared, even if, as you said, parochial interests are determining in what manner they support it happening. Another one that I think you started to talk about right at the beginning is the new National Science Foundation directorate and the competing visions for that. Where are things with that now? Have they evolved significantly?

Ambrose: It’s hard to say where they are in terms of the current negotiations. Going back to that original Schumer proposal, he proposed renaming NSF as the National Science and Technology Foundation and adding an arm to it that would be funded to the tune of $100 billion over five years. By comparison, the current annual budget of NSF is only about $8 billion a year. So he was talking about adding an arm to NSF that is many, many times bigger than the rest of NSF. So then it’s like, is it really still the National Science Foundation? And there’s a whole camp of people who are worried about this diluting the mission of NSF, which has traditionally been to support very fundamental research, not necessarily connected to big strategic mission areas. People are worried about this proposal taking it much more into a technology development mission, which maybe isn’t the best fit for NSF’s culture.

Now, that $100 billion over five years, that aspiration has been really scaled down. As the bill went through the Senate, they brought down the budget target for that directorate. But there is still a big difference from the House version, which also proposes creating a new directorate—with less of a technology mission, but still sizable. It’s still quite a bit smaller than the one proposed in the Senate, though. And that gets to this philosophical difference: what fraction of the agency do you want this new activity to encompass? But one thing I want to really stress is that at the end of the day, if these bills pass—with the notable exception of the semiconductor money, and I’ll come back to that in a second—a final version passing doesn’t itself provide any new money.

And this is something that very frequently gets misreported in the press, or at least articles will imply it. They’ll say a bill has $100 billion for X, Y, and Z. But most of that money is really what’s known as an authorization, which is essentially Congress recommending to itself how much money it should commit to spend at a future date. Effectively, it’s a way of Congress mapping out a trajectory, saying, “Here’s a five-year target for how much we think the NSF budget should grow.”

And so both budgets, both bills, they propose pretty big increases to the NSF budget. But the way it’s written, Congress is going to have to, year over year through the appropriations process, commit to spending that amount of money. And there are people that are skeptical that that money is going to even appear.

In contrast, the semiconductor funding, the way it’s written, it’s what’s known as a mandatory appropriation. And that means that if the bill passes, that $50 billion is going to be distributed—it’s guaranteed. And that is being viewed as quite a significant step for Congress to do something like that. Because it typically doesn’t fund science and technology programs through mandatory appropriations.

So on the one hand, there’s these big philosophical debates playing out about how big the NSF budget should be, and they have competing targets for how big this new directorate should be. But at the end of the day, Congress is going to have to cough up the money in future years. And with the ton of money spent by the government in the pandemic, the federal deficit and debt are a looming issue. Is there going to be an appetite for spending all this extra science money? I think that’s a huge open question.

Trapani: Yeah. And even if they were to, they would have to sustain it in future years beyond that, as we saw with the doubling of the NIH (National Institutes of Health) budget, which was actually accomplished and then consistently eroded by inflation when budgets in the years after the five-year doubling didn’t keep up.

So we’ve talked about a couple of big issues here. We’ve talked about the new technology directorate. We’ve talked about the geographic dispersal of funds. We’ve talked about semiconductors. We’ve talked about research security. Are there any other aspects of these bills that you’d like to highlight?

Ambrose: Yeah. Another one I’ll highlight as a pointed debate in the conference is high-skilled immigration. The House version of the bill has proposals that would create essentially fast-tracked visas for people who graduate with advanced degrees in STEM fields: essentially all STEM doctorates, plus certain categories of master’s degrees. The Senate bill doesn’t have any provisions like that. And going into the conference, if you want your provision to make it into the final bill, it helps to have a somewhat similar provision in both bills from the start; the fact that the Senate doesn’t address it at all makes it a bit more of an uphill battle. This proposal has colloquially been known as “stapling a green card to diplomas.” Something like it has been talked about for well over a decade, but immigration policy in the US has been deadlocked for well over a decade too.

Going back to what I said earlier about this conference committee meeting, most of the statements were things that were already fairly well known, but one of the more newsy statements came from the top Republican on the Judiciary Committee, which has jurisdiction over immigration policy, Senator Chuck Grassley. He showed up to the meeting, and his statement was all about how he doesn’t think the immigration provisions should be in the bill. He, on principle, thinks they should be handled through standalone immigration legislation. I was able to catch him as he was leaving the room, and I asked him for clarification on his position. And he told me that he’s actually open to the idea of having a fast track for green cards for STEM graduates, but he doesn’t like the thought of establishing a precedent of attaching that sort of immigration reform to a much, much bigger bill.

At the end of the day, they could do it without his vote. But when you have the lead Republican on immigration policy opposing the idea of doing it in this legislation, that makes it quite an uphill battle.

Trapani: Yeah. That’s a really great example of all the different factors that are in play and how one member can have such a big impact on one particular issue. So now they’re in the conference doing their work—what happens now? What can you say about the timeframe? And then, what are the risks of this not going well or not going at all?

Ambrose: Yeah. So there was a notional push to get the bill wrapped up by the end of this month. I should stress that there’s tons in this bill that I haven’t talked about that has nothing to do with science or technology. It’s esoteric trade policy that I haven’t even taken the time to understand, because it’s a whole other world. But a lot of the trade provisions that were attached to the legislation are apparently one of the big sticking points.

So there’s a push to get it done before what’s known as the August recess in DC, when Congress is not in session. So that’s a natural deadline, and there’s a worry that if they can’t finish it before the August recess, then you’re just getting that much closer to the election. In terms of the risk of them not finishing it this year, Schumer first made this proposal several years ago, and it’s only now getting close to the finish line. So some people have already argued that’s been too much of a delay. But at the end of the day, it’s touching so many fundamental issues that I think there’s a camp of people that very rightfully doesn’t want to rush this.

I think it’ll come down to how firm people stand on certain issues, like the structure of the National Science Foundation, some of these research security provisions, and EPSCoR. The EPSCoR change alone would be massive if they do what the Senate wants to do. If you’re in a state that gets a lot of federal money already and wouldn’t be eligible for this set-aside that’s being proposed, consider what happens: if Congress were to pass a law saying that 20% of the NSF budget has to go to these other states, and Congress doesn’t increase the NSF budget, then NSF is going to have to take money that is going to current projects and divert it to other states. And that could really upset things; I think it will be one of the biggest sticking points.

Trapani: So my last question is just about how FYI and you and your colleagues are tracking the conference. And if you wanted to say a few words more generally about FYI and what it does for people who may have missed the first time that you were on the podcast.

Ambrose: Sure. So just a quick note on FYI: it’s a science policy news service, and we like to track legislation in the US at a very fine grain of detail. So we’ve been following all sorts of bills that have been merged into this increasingly big package that’s now thousands of pages. And the way we’ve been tracking it is that there are events that give you a little window into where things stand. There was a hearing just this week, for instance, on immigration policy, where senators were talking about the case for making a special pathway for STEM immigrants. And that’s a way of getting a feel for how the conversation’s going. So there are some open-door events and also reporting from other outlets. At FYI, we very much rely on other outlets, and we like to link to them in our newsletter. We have a big weekly newsletter we encourage everyone to sign up for that has a roundup of reporting from around the web each week.

Trapani: Well, I would say if you work in science policy, you need to subscribe to FYI’s weekly newsletter and you need to check out the resources that they have on their website. It’s really indispensable for all of us who work in this space. It’s a great resource and we really appreciate it. Mitch, thanks so much for coming back on the podcast. It was great to have you. This was a really interesting, informative conversation.

Ambrose: Great to be on.

Trapani: Thank you for joining us for this episode of The Ongoing Transformation. If you have any comments, please email us at [email protected] and visit us at issues.org for more conversations and articles, visit aip.org/fyi for more science policy news. I’m Josh Trapani, senior editor of Issues in Science and Technology. See you next time.

Episode 15: Biotech Goes to Summer Camp

Who gets to be a scientist? At BioJam, a free Northern California summer camp, the answer is everyone. This week we talk with Callie Chappell, Rolando Perez, and Corinne Okada Takara about how BioJam engages high school students and their communities to create art through bioengineering. Started as an intergenerational collective in 2019, BioJam was designed to change the model of science communication and education into a multi-way collaboration between the communities of Salinas, East San Jose, and Oakland, and artists and scientists at Stanford. At BioJam, youth are becoming leaders in the emerging fields of biodesign and biomaking—and in the process, redefining what it means to be a scientist. 

Transcript

Lisa Margonelli: Welcome to The Ongoing Transformation, a podcast from Issues in Science and Technology. Issues is a quarterly journal published by the National Academies of Sciences, Engineering, and Medicine and Arizona State University. I’m Lisa Margonelli, editor in chief of Issues. And on this episode, we are going to summer camp. BioJam is a camp in the California communities of Salinas, Oakland, and East San Jose. The camp brings together teenagers, scientists, and artists to learn the principles and practices of biodesign to create art, and then to share what they’ve learned with their communities. I’m joined today by three members of the BioJam team, Rolando Cruz Perez, Corinne Okada Takara, and Callie Chappell. Welcome, Rolando, Corinne, and Callie. I’m going to start with a big question. BioJam is a little bit hard to describe—it’s a camp, but it’s also a way of engaging the community and embodying the future. Could you tell me what BioJam means to you in one word?

Rolando Cruz Perez: Happy to share one. Early on in the development of BioJam, I interacted with a professor, Bryan Brown, from Stanford. He said that BioJam can be activism. It brings together the practice of, of course, biology or biodesign, and community engagement. I mean, all these different kinds of engagement, intergenerational engagement. So, you can look at, well, this is what’s happening in my community. These are the kinds of things that are important in my community. These are the dialogues that I engage in. And how can I then take action? It’s not necessarily about solving a problem, like food scarcity or something like that. It might be, well, we want to express ourselves and we want to have new ways of expressing our love and joy in the world. And that can also mean going out and having a support drive for farm workers, where you are making masks and visors for farm workers who are down the chain of priority when PPE is being distributed.

Corinne Okada Takara: So, you are asking for a word to describe BioJam. I would say it’s communion. I think so much of what we do, and I’d agree with what Rolando is saying, is expanding who’s in that conversation space and who is elevated. So, he was mentioning agricultural workers. When Rolando and I started BioJam, we both, without realizing it, were coming from a space of ag community history, Rolando from Salinas and me, my father’s side is from Maui and a plantation community. And we really came from this space of, how do we imagine a new space where we can have these conversations around biomaterial design and synthetic biology and create a space where everybody’s shaping the conversation and questions? And then have teens take what we are learning together back into their community.

So, for me, it’s about communion and storytelling and expanding everyone’s notion of what is science, what is science practice, and where does it exist already, in communities that may not see themselves as practicing science, when really their grandparents are. Their ancestral knowledge has much wisdom. And whenever we are talking about sustainability design, climate change, we do have to look at our more global ancestry and our global community for the best practices that maybe have been kind of shunted aside.

Callie Chappell: I think both of what Rolando and Corinne shared is so beautiful. Thinking about activism, thinking about communion, and then reflecting on that, I think the point where those intersect is courage. I think that BioJam is about humility and courage for everyone who’s in communion, who’s pursuing activism together. And my position is coming as an academic to this conversation. We sometimes talk about how to create pathways that diversify science, and oftentimes it’s very reflective of how do we diversify existing structures.

Instead of asking, how can we imagine completely new ways of doing things, of new ways of being, or new ways of thinking about what knowledge—or what science even—can be? And I think something that’s really beautiful about BioJam, is that it is the activism. It is the manifestation of that imaginative future, alternative world of what science can be. But I think from the perspective of an academic, it requires courage and it requires humility to imagine that, and from the perspective of the youth and educators and activists that we also engage with, that are not academic scientists, or don’t necessarily see themselves as academic scientists, it also requires courage to come into the spaces where we can create activism together, and we can have communion in that way.

Lisa Margonelli: Do you want to tell us the story of where BioJam came from, how it started?

Corinne Okada Takara: Sure. It kind of had two starts, I guess I would say. The first was when I met Rolando. We were in an Uber or a Lyft and we just got to talking about the need for a space that, as I said earlier, just didn’t exist: a space for teens and youth in general to engage in conversations about biodesign and sustainability design. So just to back up a little, I’m an artist, a biomaterial artist, and Rolando was a bioengineering PhD candidate at the time. And we both were coming at this idea of more equitable spaces and introduction to important conversations before university. Because we didn’t see people in these spaces that were reflective of Rolando’s community. And in my dad’s day, when he was at university, he had similar experiences.

Rolando Cruz Perez: Just some background. I came to this kind of second or third chapter in my life as an academic and bioengineer. It wasn’t a linear path. And so, starting out, and then all the way through to arriving at Stanford, I really didn’t fit in anywhere.

Lisa Margonelli: So, you and Corinne got in the Uber, and you started to talk?

Rolando Cruz Perez: We started to talk. I had been working with some youths at a local high school. They wanted to do some mushroom experiments, and Corinne had long been working with youth and the community. And we got to talking about that and then we said, “Hey, we should bring some youth together and have some workshops.” And then that turned into a camp, and then we were like, “Let’s bootstrap it.” And I was like, I’m at Stanford, they’ve got lots of resources. And it’d be great for Stanford to provide those resources in a way that isn’t uni-directional and that isn’t an extraction of the culture and knowledge from these communities. Instead, provide support, infrastructure, and knowledge for youth in these communities to do a kind of grassroots building of their own vision of what the future should be.

Corinne Okada Takara: I have to add there, it was not “do a few workshops.” It was immediately, “Let’s do a camp.” So, Rolando is so passionate and so big picture. After the Uber ride, we kept emailing and he’s like, “Yeah, let’s do a camp.” I’m like, “OK, cool.” And I was thinking maybe a year down the road. He’s like, “No, this summer.” So, it was a pretty quick ramp-up. And I think Rolando’s so good at sharing the vision of what BioJam was that we had access to the bioengineering teaching lab. We had a lot of support from within the department in that way. And year one was a pretty quick ramp-up. And then year two, we were really lucky to have Callie join and really add structure, enable the sustainability of this camp as it moves into what it’s going to be in the future. So that was our start—Uber ride. Let’s not do a workshop, let’s do a camp this summer, and it was great.

Lisa Margonelli: So, tell me a little bit about the collaborative learning that happens at the camp.

Corinne Okada Takara: We didn’t want to do it within the standard framework of “We are the educators, and we are delivering information to you and you are absorbing.” Rather, we are entering a space, we are going to talk about these ideas, raise questions. We are going to engage in some biomaterial design activities, some bioengineering activities, but while you are doing it, we want you to think about how you would do this better. So, from the start, we were collaborators, and we wanted them to express how they would redesign the whole program. And at the end of each day, we had an assessment board for them to add thoughts to. So, they entered the program and left the program knowing that whatever we were exploring together, they were going to take and redesign and take back into community. And so, I think that really changed the equation of expertise and knowledge, because youth are experts in their communities, and they are also the trusted voices in their communities.

And so really elevating that they come from that place of expertise. While Rolando and I may have experience in these other areas and more years in it, they can look at it through different lenses. And so, I think that’s one way that we did the frameworks. Every year’s been a pilot year, so let me just say that as well. The first year was in person; it was just a short one-week experience. And then the next year we had to pivot to online and we delivered kits, and same with year three, though we had some in-person. So, we are always reinventing it in a way. But at the core of it, how do we do this mindfully—that we are co-collaborating and it’s generative learning.

Lisa Margonelli: Callie, tell us how you got involved.

Callie Chappell: I was a first-year grad student at Stanford, and I found myself in some spaces where there was this senior grad student who was just saying really awesome stuff. And that person was Rolando. And I met Rolando and he became a mentor for me, completely independent of BioJam. And one day, I got an email from him that was like, “Hey, can you meet up with me and Corinne, this artist, and talk about this program?” And I was like, “Yeah, sure.” So, I meet up with them at Coupa coffee, which is the starting point of many great ideas at Stanford, I think. And they were like, “Hey, do you want to come to Salinas with me today?” And I was like, “Sure, I can cancel all my meetings and go to Salinas with you today.” And I saw what they were doing was very aligned with things that I never even imagined could be possible, and I was like, I have to support this in any way that I can.

And I feel like my role in BioJam has been to carry on the legacy and the vision that Corinne and Rolando inspired in me and share that out as much as possible. And part of that has been in growing the organization. I think what I’ve tried to bring to the conversation is some organizational management, in how we can expand the impact of the work that we have as broadly as possible. So as Corinne alluded to, the first year was in person, and I actually wasn’t involved that year, but then we pivoted to a virtual camp in 2020 because of the pandemic. And after 2020, we had a real conversation that’s like, “What is the future of BioJam going to be?” And we believed so strongly in what we were doing that we wanted to continue to grow it.

And so, with a lot of really generous sharing from Corinne, we’ve created an infrastructure around BioJam to allow us to continue to grow. And that includes, and really is centered in, the visions and needs of the community. And so all of our work is driven by three advisory boards. The first advisory board is the teens. Teens that have participated in camp can actually come back and be teen mentors, helping lead the curriculum and guiding the direction of camp the next year, but they can also join what we call our teen advisory board, which gives feedback about the long-term overall direction of the program. In addition to our teen advisory board, we have a community advisory board with community organizations like community gardens, artists, educators who provide us with the vision for where camp should go and really deeply embed us in the world of the teens in a lot of different ways.

And the last advisory board is the academic advisory board. So, we have an interdisciplinary group of academics that also guide from a framing and paradigm-shaping lens where we should be going. And so those are the kind of vision-setting organizations that lead where we go. And they all interact with each other. We meet on Zoom; we are also working on having several in-person meetings. We also build relationships with everybody, right. So, dropping off cookies at people’s houses, right. Checking in on how people’s kids are doing. Those are all kind of central pieces to how we build trust and move with the speed of trust in our organization.

And then we have a variety of educators who lead the curriculum design, including Corinne and several others now that we’re growing. And the last component is the Stanford component, which is really the follow and support arm of these three groups that I’ve mentioned. And that is a student group of undergraduates, graduate students, staff, faculty, and post-doctoral researchers, that do a lot of the logistics in day-to-day to make sure that camp can run. But really the vision is not in Stanford. The role of Stanford is to really support the vision, direction, dreams, and imagination of the other groups that I’ve mentioned here. And so, we are hoping that with that structure, we can continue to grow BioJam into the future.

Lisa Margonelli: So let me just sum up here. So sometime in 2019, Corinne and Rolando got into an Uber in Pennsylvania and started talking and decided, OK, we are going to change this whole idea of how “science communication”—I’m using air quotes here—is done. We are going to start something that’s based in a community. It’s going to be in the community of Salinas, and we are going to do a camp. And you did a one-week camp first, and then that grew to a year-round program. So, youth come to the camp, and then they leave the camp, and they work year-round on projects in the community.

And then you have sort of a huge network of people. You have a youth board that steers this, you have a community board that provides feedback. You have an academic board that provides feedback. And you also have a group of Stanford students who produce sort of prep work and other things to enable this other experience. So, what started as a little camp turned into a giant web of people, right. This kind of reminds me of a mushroom. I mean, there is a sort of a mycelial sense to this whole thing. You started with a little thing and then it gets woven more and more into communities. Can you tell me, what are you doing with mushrooms and youth?

Rolando Cruz Perez: The first time someone put a mycelium material object in my hand, it just blew my mind. I’m a bioengineer trained in molecular biology, all these other things, in synthetic biology, but this very simple artifact of taking corn cob waste and mixing it with a mushroom and then forming it into an object, putting it in your hand. I was like, wow, I can actually hold this thing, whereas everything else I do is kind of invisible to the eye. And Corinne was already working with mycelium materials, I think, at the time. And so, we wanted to incorporate that into some of the education or programming or trainings that we were doing. And Corinne, of course, tapping into her amazing art practice and knowledge, came up with the idea of a quilt made of mycelium, which is just ingenious.

And along the way, we engaged with the youth, collaborated with the youth, to, of course, build the curriculum, doing simple things like, “What do we want to learn today? Here’s a choice of things that we can learn today. Which ones should we explore?” Terms or discussions of social justice would come up. We would sit at lunchtime and breaks and have these discussions with youth. And we would define vocabulary in their own terms. And we named the program with input from the youth, and the quilt as well was derived from Corinne’s feedback and the youth’s engagement. I’ll leave Corinne to describe the quilts.

Lisa Margonelli: Yes, let’s turn to Corinne. But before that, just getting the youth to come up with the vocabulary is completely the opposite of the way that science is taught. Science is so often taught where you sit down, and you have to memorize stuff. And the barrier to getting into science is, so frequently, just this enormous vocabulary that you have to learn. And the fact that you sat everybody down and came up with your own vocabulary at the start seems like it creates a space for everyone to be in there.

Corinne Okada Takara: I agree with that. And I think personally, I had the privilege of growing up with family that introduced science through making and story sharing on my grandma’s lanai. We would collect materials from Maui, and you didn’t really know a material or a plant unless you knew its name in Hawaiian and Japanese and its uses and the legends that go with it. And so, the knowledge of science really does reside in community and ancestry. Rolando and I had many conversations, also during COVID, about our parents and grandparents and the knowledge they have in field labor. And so, we wanted to create a space where the students came with science knowledge.

So, for example, Rolando was mentioning the quilt project. Well, while I was working with mycelium from one direction, he was working from another, and we invited the students to bring in, on day one, a Mason jar or baggie full of materials from their own community and lives that we could then autoclave and feed to mycelium. And we had them emailing us saying, “Can I bring ramen, because that’s what I eat? Can I bring this or that?” And we are like, “Sure, we don’t know if it will grow or work. We haven’t tried it.” So, I think positioning ourselves as not being experts is really ideal, and also positioning ourselves from a place where we are going to experiment and we may fail. For that particular quilt project, I had not succeeded in that yet.

I had tried to grow mycelium-grown assemblies through different meshes, and they’d all rotted. And I told them that. And we were going to try again, but with a synthetic mesh. And so, inviting them to work on something where we didn’t really have the protocols yet, I think, was kind of fun and exciting. We didn’t know if it was going to work. It did, and it turns out the ramen grew the best. But it was great story share because they brought in food they knew, a waste stream they knew, and shared the story of it. And then, after Rolando autoclaved the materials, they were able to share those substrates and stuff them into their vacuum form molds—which they had also designed to represent themselves or their culture. And so, it was physical sharing of the substrate to inoculate the mushrooms while also story sharing. And I think that just kind of set up for what Rolando would often describe as this mycelial network of people in community and space.

Lisa Margonelli: So, let me just ask you, can you tell me what the quilt looked like? So, everybody has brought food from home, then you autoclaved it. Then you stuffed it into a vacuum mold that everybody made, for form. And then they inoculated it with mycelium and then something grew and there was a mesh—how did the mesh play into this?

Corinne Okada Takara: I pre-cut nylon mesh and they stuffed their mold forms into the vacuum form mold. And then they clamped down sterilized mesh on top of it. So, the mycelium and the substrate grew together and then grew into the mesh. And then the mesh was such that I could bake it at a low temperature without the mesh melting. And then you had a square that you could hold up and it had the mushroom mycelium there, grown into whatever the substrate was.

Oh, and the mycelium shapes, the students had designed those; we had a pre-exercise for them to do. They learned how to design in Tinkercad. They designed their mold form, which, in vacuum forming, is called a buck. So they designed their own bucks that we vacuum formed with a Shop-Vac, a kind of DIY setup. One student designed a concha pastry to represent her Mexican heritage. Another did a kendama Japanese toy. We had another student do a soccer ball, and one did an Indian pastry. So, whatever they wanted to do, they represented themselves. And then those were all bound along the edges individually, because we knew we wanted them to take them home, each one separate, not stitched all together.

Lisa Margonelli: And that way they could share it with their family. Interesting. And so, in addition to autoclaving and inoculating and thinking about mesh and experiments, they also were using CAD to design the mold. And then that was, I would assume, 3D printed. So, this is a super high-tech ecosystem. And you are bringing it to places like Salinas, where it could be really hard to access a 3D printer or CAD or some of these other things. Callie, tell us, how did BioJam camp make that transition to working remotely? Were people able to do the same sorts of projects or did you have to kind of rethink it?

Callie Chappell: Yes. We have “Lab on a Cookie Sheet,” designed by Corinne. So, imagine at home you are living in one or two rooms, you’ve got tons of siblings running around: where can you have a lab? Well, it turns out that all you need is a cookie sheet or a box and the right attitude, and any place can be a lab, whether that’s in your kitchen, in a driveway, or in a garden. And we created these really innovative kits that had lots of different materials: biomaterials, circuitry materials, general making materials, fun tape and markers, scissors. The idea was to really imagine in your own space, scavenging for things that might be waste or might be treasures, depending on how you look at it, how you can create with biology at home. We made sure that all of the things that we shared were commonly available at home.

For example, if you need to sterilize something, you can use isopropyl alcohol, rubbing alcohol, right. And all of the mycelium, for example, that we sent home was food grade. But the most important thing is, I think, that doing science at home helps you realize that you’ve always been doing science at home. If you’ve been making yogurt in the kitchen, or if you’ve been growing things in the garden, even asking questions and trying to figure out how to answer them, that is doing science. And so, I think there was a real beauty to having people create in their own homes, even if we were interacting over Zoom as opposed to being together physically. People could see the science that had already been happening there. And I think our curriculum really tried to emphasize that.

Lisa Margonelli: I think it’s really interesting that part of this education is, you’ve always been a scientist. We have all always been scientists. Corinne, do you want to talk a little bit more about what happened that first year that you all went remote and had this distributed lab in all these different places on cookie sheets?

Corinne Okada Takara: Yes. It was really fun. And I would just like to elevate the BioJam teens from the year prior who helped put these kits together and design them, as well as create videos. So, some of the youth from our first year came back as teen mentors and helped put these kits together. You were asking about what kinds of things you can do, and I think that’s a space where biomaterial design can step in as a really great first engagement spot. So, bio art: growing bacterial cellulose (you can use the kombucha leather that grows on top), growing mycelium into forced geometries using different mold forms, doing millifluidics with coffee filters and tape.

Lisa Margonelli: What’s millifluidics?

Corinne Okada Takara: So instead of microfluidics, it’s on the millifluidics scale and you can laser cut coffee filters into channels that you then sandwich between tape and then cut ports and do mixing experiments, pH mixing experiments.

Lisa Margonelli: So, it’s an analog to what you might do in a lab?

Corinne Okada Takara: Exactly.

Lisa Margonelli: So, I want to sort of step back from this a little bit. Two of you come from synthetic biology, I think, which is sort of two different things. One thing is, it’s a practice of taking genes from one place in nature and putting them in another to enable new possibilities of life. So it might be that you change the way a plant grows, or it might be that you change a microbe’s capability so it can eat old newspapers and turn them into some other useful chemical. So, on the one hand, there is this sort of researcher lab practice. And then the second aspect of it is really this sort of dream of creating a sustainable economy, where we replace petroleum products with other sorts of synthetic biology-derived chemicals.

But the one thing that nobody discusses is who is going to do it, and what is the world that they create going to look like. And that’s really interesting, because there has been kind of a preconception that the people who are going to do synthetic biology are wearing lab coats. They are people who have gotten access to very high-level labs, or maybe they work in a big industrial situation in a big refinery. And it strikes me that what you are doing at BioJam is changing who does this and who has the ideas and who asks the questions and who figures out which problems to solve. Do you want to talk about this?

Rolando Cruz Perez: The definitions you outline for synthetic biology, of course, those are very canonical and accepted conventional definitions. I’ve tried to push my own thinking about it into a more abstract, higher-level layer of the word synthesis, or the idea of synthesis: synthesizing together different knowledge, or synthesizing something as art. So, to me personally, synthetic biology has evolved into this space of synthesizing relationships with living organisms or living matter. Thinking about synthetic biology as this socio-technical, cultural practice, I can then connect it to the development of maize or of potatoes, or of “domesticating” animals and even ourselves and our microbiota, for instance.

And so of course, it’s important, absolutely: the who and the where and the what of synthetic biology. And maybe one specific reason there is biosecurity, so let’s talk about that. If that world of distributed or open-source technology becomes realized, we’ll have questions, particularly in the time of a pandemic, around biosecurity. It’s my belief, and others’, that in that world we are all going to need to love biology. We are all going to need to love synthetic biology, because we need to have openness and transparency and care, transformative justice as opposed to punitive justice, because in a world of punishment there will be blind spots. There will be dark corners. And we don’t want that with biotechnology, because of the intrinsic fact that we are biology and we are the technology—we are made up of it.

Lisa Margonelli: Corinne, do you want to talk a little bit about what it means for everyone to be a biologist?

Corinne Okada Takara: I come from a community arts activist space, and so any engagement that we have with the public I feel is really important, and we are part of the public. And how do we create these new spaces to address these concerns that Rolando has? So, if we can buy plasmids off Etsy, yes, we need to all be scientists, we need to love biology. We need to see ourselves as part of the conversation, not only growing our knowledge of the vocabulary of science but expanding the vocabulary we are using, including more accessible words and accessible tools, and creating those initial engagement spaces much earlier and multi-generationally, in spaces that people can access. And I know a lot of universities and museums do “community outreach.” I’m doing air quotes.

But I really think BioJam is a model for what should happen, and that is science communion in community spaces, whether that’s a parking lot, a community garden, or a farmers’ market. Where are the spaces we can interact with people serendipitously in their places of expertise, their spaces? Then they know they can bring their vocabulary and knowledge to these conversations that right now are engaging only a few. So BioJam, I think, at its core is asking that question: What might these new spaces look like? How can they be multi-generational? How can they span spaces beyond academia?

Callie Chapell: Well, I just want to highlight something that was brought up earlier, which is that BioJam is activism in changing what we consider scientific knowledge to be. And I’m not really a synthetic biologist; I’m trained as a molecular biologist, but I’m actually an ecologist now. What I learned growing up (I’m from the rural, agricultural Midwest, along the shores of Lake Michigan) was that ecology and evolution, ecological communities, and the way evolution functions in the world are synthetic biology, right? Every time we modify the atmosphere by emitting CO2, we are bioengineering every organism on the planet. We are also bioengineering organisms when we make bread at home, right? And in being an ecologist and thinking about the world from that perspective, you see that, for example, mycelium in some cases live in symbiosis with trees.

And when we have these organismal collaborations, as artists working with organisms, as scientists working with other organisms, as biologists, or just existing in the world, we can really expand how we think about what synthetic biology means, what art means, and what science more fundamentally means, in this more expansive way. And so the mission of BioJam is not just to create these experiences for the teens we work with directly, but really to change the conversation and decentralize how we think about knowledge production in biology. And not just knowledge production, but also who drives the conversations and who creates the technology. So we are really working toward a vision where everyone feels empowered to make with biology, not necessarily in academic labs or for companies or corporations, but for play, in their local communities, in their kitchens and agricultural fields, right? And empty lots.

And when people feel like they are biologists, that’s how we get innovation that addresses the most pressing challenges, innovation that everyone has a say in, not just the people who see themselves as synthetic biologists. Just briefly, I want to highlight something mentioned by both Rolando and Corinne about this being an intergenerational challenge. Something we talk about a lot is power and transience, right? What we think of and see as liberatory biotechnology or biodesign is very reactive, like Rolando said, to the current challenges we have. But it should always be changing, right? We should always re-evaluate who we are liberating, what we are liberating ourselves from, and also what our future could be. And I feel I’ve really learned a lot about what liberatory means in this context from youth.

And I think back to a really amazing local webinar where the teens were thought leaders in Salinas, and they described what they envisioned their liberatory bio-future to be. Their ideas were ones I’d never heard from policymakers, right? They are imagining worlds where we address critical issues in climate change, social justice, incarceration, and power dynamics between the Global North and Global South as it relates to decolonization, using biology as a metaphor or as a tool to enable those global changes to how the world can operate. And as they describe what they envision their future to be, I hope that we can empower and actualize that vision.

Lisa Margonelli: That’s fabulous. I want to thank you, Rolando, Corinne, and Callie. This has been a fantastic conversation and, I hope, a spore for future work and collaborations between communities and bioscientists. To see the BioJam artwork and learn more about the camp, visit issues.org to read their essay, “Bioengineering Everywhere for Everyone.” Visit our show notes to find a link to the camp itself.

Thank you to our listeners for joining us for this episode and this season of The Ongoing Transformation. We will return in September with new episodes. If you have comments, interview suggestions, or questions, please email us at [email protected], and you can also visit us at issues.org for more conversations and more articles. I’m Lisa Margonelli, editor in chief of Issues in Science and Technology. Thank you for joining us for this season of our podcast.