Returning Science and Technology Assessment to Congress
The shuttering of the Office of Technology Assessment (OTA) in 1995, along with contemporaneous cuts to Congress's policy capacity, created a deep institutional gap in the formation of science and technology (S&T) policy in the United States. A quarter century later, in the wake of a middling response to the COVID-19 pandemic, eroding military superiority, and weakening tech dominance, policymakers and the lay public may finally be waking up to the consequences of Congress's atrophy and dysfunction on S&T.
Timothy Persons positions the Government Accountability Office’s Science, Technology Assessment, and Analytics (STAA) team, which he codirects, as ready and capable of filling that gap. STAA, formed in early 2019 at the direction of Senate appropriators, has made commendable progress in two years. This includes growing to over 100 expert staff, producing numerous technology assessments and other analyses, testifying in hearings, establishing an advisory council, and producing a technology assessment methodology handbook.
Yet STAA still faces major challenges. These include working within GAO’s risk-averse culture and bureaucracy (particularly in the absence of separate authorities for hiring and acquisitions or a funding line item), establishing a reputation in the broader S&T community, educating Congress about what it does, and building trusted relationships with key congressional offices and committees. For STAA to succeed in the long run, all these challenges must be resolved.
Looking back at the genealogy of STAA, there are some useful lessons.
The strategy of rebuilding OTA within the Government Accountability Office emerged not long after OTA was shuttered (after an attempt to put it in the Congressional Research Service was shot down by the Librarian of Congress). In 2002, a pilot technology assessment (TA) program was established in GAO. The pilot's first technology assessment, on biometrics for border security, was favorably reviewed by an external evaluation that concluded GAO "did a very good job" on the report. Yet evaluators presciently raised concerns about bureaucratic hurdles and the challenge of establishing "two necessarily different cultures."
Seeking to improve on the pilot, US Representatives Rush Holt (D-NJ) and Amory Houghton Jr. (R-NY) introduced legislation in 2004 for a Center for Scientific and Technical Assessment within GAO. The proposed office took the key structures from OTA, including a congressional oversight board, advisory panels, and an advisory council. In addition, it would have explicit authorities and its own line item funding. The proposal went through a review process that incorporated feedback from civil society experts, as well as GAO itself (see github.com/zachgraves/futurecongress).
Although the bill ultimately failed, it offers a fairly complete and vetted model for how STAA might evolve and solve its current structural challenges, while also satisfying critics who want it to look more like OTA. One could even give STAA more autonomy by modeling its relationship to GAO on that between the Congressional Research Service and the Library of Congress.
Importantly, STAA has already passed the biggest hurdle to reviving a technology assessment office in Congress: finding the necessary resources in a highly constrained funding environment. Now, let’s work to keep improving it.
Zach Graves
Head of Policy, Lincoln Network
Visiting Fellow, National Security Institute, George Mason University’s Antonin Scalia Law School
Timothy Persons makes a compelling case that sound congressional decisionmaking depends, in part, on impartial technical analysis of issues that include scientific or technological (S&T) components. He is also correct that the Government Accountability Office is exceptionally well suited to perform this task based on its well-deserved reputation for nonpartisanship and its ability to produce reports that are timely and succinct.
On the other hand, Persons neglects to mention that, beyond technical analysis, Congress is equally in need of assistance in understanding the social, political, and ethical aspects of decisions that include S&T dimensions. Perhaps this omission reflects his recognition that the GAO would not be the right organization to undertake such tasks, because it is impossible to perform them without relying on social and ethical norms to choose, frame, and inform the questions that will be addressed. At present, GAO’s new Science, Technology Assessment, and Analytics (STAA) team, which Persons heads, lacks the right staffing for such an undertaking. More importantly, the team may rightly fear that expanding in this direction would jeopardize its coveted reputation for impartiality.
As a result, besides the STAA team, Congress needs assistance from some other entity that would be able to perform the omitted tasks. The staffing of such an organization must include, among other skill sets, social scientists, historians, philosophers, and ethicists who are equipped to deal with issues that include S&T components. Teams with such expertise have demonstrated a capability to produce reports that exhibit analytic rigor and depth, and—with some practice and pluck—they can learn to reveal openly how their conclusions vary depending upon different normative orientations.
However, even such experts are apt to remain constrained in articulating normative considerations based on their concern to maintain their professional reputations. Another, perhaps even more significant, limitation is that experts' experiential backgrounds distinguish them, individually and collectively, from everyone else in society. Decades of experience by government offices of technology assessment and by outside teams of scholars have shown that well-structured participatory processes—some including representatives of stakeholder organizations and others involving randomly selected laypeople—can fill in where expert analysis falters. Laypeople bring to the table local knowledge, a different range of social and ethical orientations, and a readiness to articulate careful, normatively informed judgments where the experts are apt to say only that "more study is needed."
Organizational options for providing the needed expert and participatory capabilities have been addressed elsewhere, including in Reinventing Technology Assessment: A 21st Century Model, a report from the Woodrow Wilson International Center for Scholars that I wrote.
Richard Sclove
Cofounder of the Expert and Citizen Assessment of Science and Technology (ECAST) network
Author of the forthcoming book Escaping Maya’s Palace: Decoding an Ancient Myth to Heal the Hidden Madness of Modern Civilization.
I applaud the recent attempt to expand Congress’s capacity to better anticipate and assess both the intended and unintended consequences of technological advance, as Timothy Persons describes in “The Return of Science and Technology Assessment to Congress” (Issues, Fall 2020). But the COVID-19 pandemic brings into clear focus the limits of national technology assessment (TA) efforts in addressing complex technological issues that stretch across national borders.
The TA community has certainly acknowledged these limitations in the past by establishing overarching organizations such as the European Parliamentary Technology Assessment group, which brings together its 23 members for annual meetings and supports information sharing. But now is the time to give serious thought to setting up a global TA Commons to focus specifically on TA issues that are inherently supranational—future pandemics, for sure, but also issues such as global bio- and cybersecurity, the regulation of space, undersea mining, weapons proliferation, or human intrusions into global anthropogenic systems (such as the carbon or nitrogen cycles) through geoengineering. Disruptive technologies will be increasingly developed and deployed by transnational corporations beyond the effective control of individual nation-states.
Zoom meetings, occasional conferences, and document sharing are not enough. How could we create a TA commons? One country could donate the space to house the effort, as Austria did when the International Institute for Applied Systems Analysis (IIASA) was created in 1972 to support scientific exchange during the Cold War. National TA offices would detail one staff member to this new entity to create rotating groups of experts working on global TA issues.
IIASA could, in fact, provide a good home for such an effort given its focus on global systems challenges, but a range of options should be explored, including expanding the global TA efforts coordinated by the German Institute for Technology Assessment and Systems Analysis at the Karlsruhe Institute for Technology.
A supranational focus opens up the possibility of developing nested scenarios, where global scenarios provide a wider context for the development and coordination of regional and national technology policies. A global TA Commons could improve participatory technology assessment exercises by establishing a structured and continual means for commons-based peer-production of assessments. It could focus on providing input on technological choices facing global policymaking organizations such as the World Health Organization, the UN Food and Agriculture Organization, the International Atomic Energy Agency, the World Bank, and the International Monetary Fund, as well as nongovernmental organizations working to address planetary challenges.
This vision, however, is not likely to happen if it is dependent solely on episodic funding and occasional meetings. But a few forward-looking philanthropies and governments could create and evaluate a global TA Commons with a $10 million to $15 million multiyear investment. In the future, the world will face an increasing proliferation and acceleration of global risks, many related to our use, or misuse, of technology. The catastrophic lack of systems thinking and policy coordination during the pandemic, even evident around relatively simple technologies such as test kits and protective gear, is a warning.
David Rejeski
Visiting Scholar, Environmental Law Institute
Fellow of the National Academy of Public Administration