Surviving the Techstorm
Review of
A Dangerous Master: How to Keep Technology from Slipping Beyond Our Control
By Wendell Wallach
New York, NY: Basic Books, 2015, 336 pp.
In A Dangerous Master, Wendell Wallach, a scholar at Yale University’s Interdisciplinary Center for Bioethics, tells the story of modern society’s struggle to cope with technology’s complexity and uncertainty. In telling this story, Wallach questions the terms of the social contract under which we as a society predict, weigh, and manage the potential harms and benefits of adopting emerging technologies. In urgent prose, he argues that we need different epistemic, cultural, and political approaches to adequately assess emerging technologies’ risks and their broader social impacts. One approach Wallach promotes is public deliberation, which gives citizens and experts the opportunity to distinguish technological hype from hope and speculation from reality, and in so doing to shape their technological futures.
There is a whiff of science fiction in Wallach’s prose—maybe more fiction than science. He envisions a future where autonomous robots, designer babies, homemade pathogenic organisms, and mindclones confront our hopes, fears, and inner conflicts. An ethicist by training, Wallach uses these future scenarios to explore normative questions born of the pervasive entanglement of technology and society. One of the principal questions at the heart of A Dangerous Master is whether we will find ourselves governed by transformative technologies whose ramifications we did not envision and whose development we did not examine. This question always emerges when we fail to anticipate, or to develop adequate controls for, complex technological systems. To preemptively counter such questions, scientists and engineers typically reach for concepts and metaphors that portray their artifacts—physical and biological systems—as reliable and controllable.
Wallach pulls a powerful yet simple question from the sidelines of the global conversation on science policy: What is our purpose in developing a particular technology, and is that purpose one we ought to pursue? Wallach warns us that without adequate reflection, we risk being swept into an incessant storm of groundbreaking scientific discoveries and technological applications that outpace society’s capacity for control. What Wallach calls a “techstorm” is essentially a period of overzealous technological innovation that can have destructive consequences. Wallach offers a useful historical analog for a techstorm in the tools and technologies that gave rise to the Industrial Revolution. Disruptive technologies, including steam power, the cotton gin, and more efficient iron production, radically transformed society, upending traditional hierarchies, reshaping economies, and even modifying our relationship with the natural world. He reminds us that the benefits reaped from the new manufacturing era were preceded by a period that immiserated much of the workforce and included among its harms child labor and unsafe working conditions.
In the wake of the manufacturing era came the rise of industrial capitalism, based on individual rights, private ownership, and free markets. Capitalism and the techstorm that enabled it brought with them growing disparities in wealth and opportunity, both among and within countries. Against the current backdrop of four decades of wage stagnation and wealth inequality approaching levels not seen since the early twentieth century, Wallach imagines what would happen if technology permanently replaced a great deal of human work, as many digital-age observers have predicted. He suggests that rethinking how capital is distributed will become necessary if society wishes to avoid a crisis.
Wallach identifies another potential fissure in the social contract in the form of a genetically stratified future, where a select few are capable of further distancing themselves from the majority through the use of genetic enhancements. He also analyzes transhumanist philosophy, which idealizes a post-human future in which humans master their own evolution using technological enhancements: “The best of what it means to be a feeling, caring, and self-aware person will be fully expressed in technologically enhanced cyborgs and techno sapiens.” His critique characterizes transhumanism as “buy[ing] into a representation of what it means to be human that is incomplete and mechanizes, biologizes, and pathologizes human nature.”
I think Wallach’s perspective on this debate is unduly Manichean, but I share the conviction that we need to be wary of reductionist discourses. These conversations imperceptibly close, rather than open, the prospect of deciding for ourselves what we want to become and what we want our futures to be. Such discourses also obscure rather than illuminate the deepest sources of social ills, which shape the evolution of our genes, bodies, and identities. Our biological existence is profoundly influenced by where we live and how much money we have to better ourselves through education and access to care. Reductionist discourses that ignore these social ills and contingencies will tend to crystallize our genetic and digital divides and, in turn, limit our opportunities to bridge them.
Techstorms are enabled by rhetoric that hypes immediate benefits while downplaying risks. This rhetoric allows technology creators, such as those in the biomedical industry and the military-industrial complex, to evade scrutiny and quickly entrench themselves in society. Interestingly, Wallach’s emphasis on security and defense research as a driver of the techstorm is in line with what we see happening in cutting-edge biotech, where accelerated development, set against the glacial pace of regulatory agencies and courts, outpaces society’s ability to adequately evaluate the technologies’ effects. This paradigm of fast-paced innovation, driven by economic competition and security imperatives, incorporates little input from the public beyond consumers’ market preferences. Given that these technologies have the potential to disrupt many aspects of society, should we not look to uphold the much-espoused principle of democracy?
Wallach’s purpose in critiquing the techstorm is not to stifle innovation but to slow it to a “humanly manageable pace,” one compatible with informed decision making by individuals, institutions, governments, and humanity as a whole. As Richard Denison of the Environmental Defense Fund stated in his blog, “The real danger is continuation of a government oversight system that is too antiquated, resource-starved, legally shackled, and industry-stymied to provide to the public and the marketplace any assurance of the safety of these new materials as they flood into commerce.” A manageable pace that incorporates broad social values while ensuring human safety begins with identifying inflection points: historical junctures in technological innovation that are followed by either positive or negative consequences. Secondary inflection points can be thought of as rate adjustments in a technology’s research trajectory.
Again, Wallach’s analysis is enlightening, as our global society currently grapples with the benefits and risks of genomic editing and artificial intelligence. When it comes to mastering the human genome, a first inflection point is our willingness to question who should own our genes and their secrets. The Supreme Court’s unanimous 2013 ruling barring the patenting of human genes was a wise and balanced decision that cleared a major barrier to innovation in biotechnology, drug development, and medical research. But as with any inflection point, the court’s decision was only a first step toward finding the right balance between protecting legitimate intellectual property and securing an open future for personalized medicine based on genomics.
By 2015, the scientific community was confronted with one of the most important disruptions in genomics research since the 1975 Asilomar Conference on Recombinant DNA. The advent of CRISPR technology, which allows gene editing at specific loci on the genome, drastically accelerates the potential for engineering human and non-human biology. I welcome Wallach’s critique of scientific entrepreneurship and its tendency toward simplistic boosterism, which saturates genomics research and policy discussions. For example, those who support gene editing often describe it as a pair of molecular scissors that cut out harmful DNA sequences on a chromosome, thus “editing out” disease. Images such as this make the gene-editing process seem easier and cleaner than it really is, and they assume a control over our germline that we do not yet have. They gloss over the potential for off-target edits, which can create unintended mutations in the genome. Another characteristic of the momentum around gene editing is the lack of clarity about the role citizens are invited to play. As Wallach might suggest, experts’ calls for a moratorium on germline gene editing are no substitute for more inclusive public debate on the promises and risks of our biotech futures.
Inflection points such as those apparent with CRISPR are opportunities for society to exert a degree of control over the future we create. Once this window of opportunity passes, it becomes extremely difficult to overcome the ensuing technological lock-in. To avoid this fate, Wallach posits that oversight must combine hard and soft regulation. In other words, effective oversight requires both nimble governance (industry standards, codes of conduct, statements of principle, and so on) and the authority of government to enforce appropriate research practices. To create this kind of oversight, Wallach advocates for the creation of governance coordinating committees (GCCs).
A GCC would act as a good-faith broker, searching for gaps, overlaps, conflicts, inconsistencies, and synergistic opportunities among the various public and private entities already involved in shaping the course of an emerging field. Its primary task would be to forge a robust set of mechanisms that comprehensively address the challenges arising from new innovations, while remaining flexible enough to adapt those mechanisms as an industry develops and changes.
The GCCs would be led by “accomplished elders” who have achieved wide respect (Wallach doesn’t specify their exact qualifications) and would work together with all interested stakeholders to monitor development and propose solutions to perceived problems.
Public engagement is another area Wallach addresses, highlighting the use of citizen panels as a way to involve a representative cross section of citizens to tap into their knowledge and provide input to lawmakers. Such a group would allow citizens to receive information and opinions that are different from those traditionally offered by politicians, experts, and interest groups. It would also provide a better way to generate informed attitudes that can be clearly expressed to decision makers.
Without proper citizen engagement and a means to contain the power of minority interests, technological development will proceed unhindered, for better or, quite possibly, worse. The sheer speed of change will assuredly result not just in people who surrender their lives to gadgets and machines they may not want, but in vast disruptions that society cannot mitigate. Although such a view may mark me as a Luddite, it is worth remembering that the purpose of technology is to benefit our quality of life. Technological progress is not a means of transcending the human condition or a goal in itself; it is a tool for improving the human lot, and like any tool, it can cause serious harm when used improperly. In the end, the most important message Wallach shares—and I appreciate his effort to do so with elegance and perspicacity throughout the book—is that we need to harness the full force of our democracy to shape technological progress according to our values. Otherwise, we will end up controlled by technologies whose ramifications we did not foresee and whose path we neglected to examine.