Rebecca Rutstein and the Ocean Memory Project, "Blue Dreams" (2023), still from the 2-minute, 40-second digital video.

Innovation’s New Catechism

Since the mid-1970s, DARPA, the oft-copied Defense Advanced Research Projects Agency, has used a series of eight questions to steer investment decisions, among them: What are you trying to do? Who cares? If you are successful, what difference will it make? What are the risks? How much will it cost? How long will it take? Called the Heilmeier Catechism, these questions have been adopted by the wider world of investors and entrepreneurs as the creed that enables the high priests of scientific innovation to soar above the crowd.

But George Heilmeier, director of the agency from 1975 to 1977, actually invented the questions for the purpose of reining in DARPA’s entitled culture, which he compared to a “big cashier booth” in a 1991 interview with historian Arthur L. Norberg. He described receiving vague proposals for artificial intelligence that implicitly said, “Give us the money and we’ll do good things.” To push back, Heilmeier read every proposal, scrawled questions on it, and demanded answers before approving funding. Over time, he refined his now-legendary questions. But some DARPA directors complained, he told Norberg. “‘Heilmeier, you don’t know what you are doing. Your job is to get the money to the good people and don’t ask any questions.’ And my reaction was, ‘That’s pure bullshit. That’s irresponsible.’”

The son of a janitor, Heilmeier played a key role in creating liquid-crystal display (LCD) technology at RCA Laboratories and earned 15 patents in his career. His catechism was squarely aimed at scientists, the insiders who did the important work of technological and scientific innovation at the time.

But when the new health-focused innovation agency, ARPA-H, launched recently, it added two new questions to the catechism that take the public into consideration, as Jassi Pannu, Janika Schmitt, and Jacob Swett explain in this issue. The first question reorients the agency from solving the problems of “warfighters” (the intended beneficiaries of DARPA’s R&D investments) toward health: “To ensure equitable access to all people, how will cost, accessibility, and user experience be addressed?” This question is very much in keeping with a growing sense of the importance of reforming the diffuse and complex health system by removing barriers, including high prices and inaccessibility, and bringing equity where great disparities exist.

ARPA-H’s other new question, though, reveals a confounding way that the sphere of science has changed since the 1970s: “How might this program be misperceived or misused (and how can we prevent that from happening)?” This question explicitly grants nonscientists a form of agency in innovation that the previous questions didn’t. But in the document laying out ARPA-H’s Heilmeier questions, the “good people” making “good things” whom Heilmeier encountered in the 1970s have been joined (in a section called “Considerations”) by “bad actors” with “nefarious intentions.” The new question is certainly realistic for innovation today, but it also raises questions about the future of the catechism.

I asked around about the origins of that last question and was pointed to a 2018 report by Richard Danzig, a former secretary of the Navy, called Technology Roulette: Managing Loss of Control as Many Militaries Pursue Technological Superiority. The report argues that in the near future, “the most reasonable expectation is that the introduction of complex, opaque, novel, and interactive technologies will produce accidents, emergent effects, and sabotage. In sum, on a number of occasions and in a number of ways, the American national security establishment will lose control of what it creates.” At the end of the report there is another list of questions, a sort of alternative catechism focused on anticipating how technology might slip the bounds and intent of its makers. Developed by Jason Matheny when he was director of IARPA, the intelligence-focused ARPA, the list includes questions that attempt to understand what could go wrong, such as: “If the technology is leaked, stolen, or copied, would we regret having developed it?”

Although the original Heilmeier questions had a more neutral viewpoint, the latest addition to ARPA-H’s set of questions has an explicit framing around security, danger, losing control, and even regret. This is perfectly reasonable for intelligence and defense innovation. And in the context of the COVID-19 pandemic, during which people died because of bad information about vaccines and the use of inappropriate drugs, a security lens on health seems warranted.

But there are other ways to consider what happens when innovation moves beyond the confines of its creators. It can do good as well as bad. And here, we might also stand question 10 on its head: How could a program be appropriated by groups of people, scaled, and used to accomplish things that surpass the creator’s intentions and wildest hopes? What might be possible when the walls around the church of innovation come down? 

The destabilizing potential of mass innovation is already visible. After Russia invaded, Ukraine initiated a loosely coordinated effort to retrofit off-the-shelf consumer drones for use in combat. Ukrainians then began 3D-printing equipment to give these drones new capabilities while developing a philosophy for using them effectively in battle. The movement depends on social uptake as much as on equipment. In the last year, the country’s Army of Drones, with Star Wars actor Mark Hamill as a spokesperson, has worked with private groups to train 10,000 drone pilots and plans to train another 10,000 in the next six months.

The traditional model of military superiority, the one DARPA was designed for, involves all-powerful technology, soldiers trained in the use of weaponry, and a shared ideology. By contrast, the drones in Ukraine are inherently fragile. Today’s hybrid socio-technological forces gain strength from the proliferation of nonsuperior tech and the buy-in of ragtag enthusiasts. Goodbye to the best of the best, and hello to the motivated hordes. What they’re creating may be just good enough to make a difference in a brutal war.

Other unexpected aspects of innovation outside the church are showing up throughout the culture, for good and ill. The unsafe mixture of engineering, science, and tourism in the Titan submersible that recently killed five people near the Titanic wreckage is an example. The vessel’s owner, Stockton Rush, styled himself as a big thinker bucking a sclerotic innovation system. When journalist David Pogue interviewed him in 2022, Rush bragged that the craft was piloted with an off-the-shelf game controller. “One of our earlier subs, we developed a controller and it was $10,000, and it was big and bulky. But this thing is made for a 16-year-old to throw it around, and we keep a couple of spares. And so the neat thing is it’s Bluetooth; I can hand it to anyone.” But when Pogue questioned the choice of technology, Rush’s reply showed his larger consideration was social attitudes: “I like messing with people’s heads.”

As Issues has covered, the unpredictability of technology applied outside normal channels is starting to scramble some long-established governance structures, such as the concept of dual use and the mixture of formal rules and informal norms for safety, including biosafety. It would have been difficult to anticipate the harm that carbon fiber, titanium, and an off-the-shelf game controller could cause when combined on the Titan. But part of what allowed the vessel to evade safety norms was that it appeared in the realm of high-budget adventure tourism rather than in that of science or industrial practice. In 2018, Technology Roulette argued that it is necessary to anticipate technological co-optation; to the extent that is possible, it will require a better sense of the social aspects: anticipating what will happen when new players with new tools start “messing with people’s heads.”

Insight into the social process of innovation can be found in places where it’s already active. In her piece for this issue, “From Bedside to Bench and Back,” Tania Simoncelli describes the Rare As One project, an ambitious effort by the Chan Zuckerberg Initiative to bring scores of patient advocacy groups into the research sphere to make faster progress on treatments and cures for rare diseases. With seed funding and managerial resources, they are creating patient registries, driving research agendas, and advancing promising treatments. Their involvement is accelerating work that would normally take decades, if it happened at all.

Part of this is, of course, driven by passion. Tracy Dixon-Salazar, the mother of a child diagnosed with a rare form of epilepsy, eventually earned a doctorate studying neuroscience and genetics in an effort to understand her daughter’s disease. When her postdoc advisor suggested that they sequence her daughter’s genes, Dixon-Salazar went to work with fervor, in a process she described at a Story Collider event in Aspen: “Every time a new analysis tool would come out by the scientists, every time I would learn something else about brain development, every time Savannah would have a particularly bad day and I was having difficulty coping, I would reanalyze her data. One day, I analyzed it in a way I now have deemed ‘Crazy-Mother Analysis.’” By identifying all of the mutations unique to her daughter’s genome and grouping them according to signaling pathways, she noticed that one pathway, for calcium, had a large number of mutations. Ultimately, that led to finding a drug that drastically reduced the frequency of her daughter’s seizures.
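
That grouping step is, at heart, a simple count. The sketch below illustrates only the idea, not the actual analysis: it assumes a hypothetical list of variants unique to one genome, each already annotated with a signaling pathway (the variant IDs and pathway labels are invented placeholders).

```python
# Minimal sketch: count a genome's unique variants per annotated
# signaling pathway and surface the most mutation-laden pathway.
from collections import Counter

# (variant_id, annotated_pathway) pairs -- illustrative placeholders,
# not data from the actual study.
unique_variants = [
    ("chr1:g.12345A>G", "calcium signaling"),
    ("chr2:g.67890C>T", "calcium signaling"),
    ("chr5:g.11111G>A", "Wnt signaling"),
    ("chr7:g.22222T>C", "calcium signaling"),
    ("chrX:g.33333C>G", "MAPK signaling"),
]

counts = Counter(pathway for _, pathway in unique_variants)
for pathway, n in counts.most_common():
    print(f"{pathway}: {n} unique variant(s)")
# A pathway carrying far more unique variants than the rest (here,
# calcium signaling) becomes a candidate for follow-up, which is the
# pattern Dixon-Salazar noticed.
```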

Although the research community highly values dispassionate, fact-based research, it has long leaned into the myth of the “mad scientist” driven by scientific passion. This is a popular framing for journalists’ profiles, and it reinforces the concept of science as a nearly religious calling. Dixon-Salazar brings the “crazy mother” geneticist to this pantheon—enabled by science but driven by love. This is a potent addition, and scientists themselves say that working with patient groups has changed how they perceive problems and missions. Getting a deeper understanding of this dynamic of passion, and finding ways to invite more people to use the tools of science and technology, is just as important as trying to prevent misuse.

Patient groups are also inviting lay people into research as more than “subjects.” A program in Puerto Rico, where there is a high incidence of Hermansky-Pudlak syndrome, sends mail to patient families who don’t have access to the internet, organizes transportation to bring patients to conferences, and does in-person outreach after hurricanes to ensure no one is left behind.

And patient groups are creating research tools that are faster, cheaper, and more accessible. To quickly build a natural history survey from patients who were dispersed around the world, the FOXG1 Research Foundation partnered with the company Ciitizen, which is using machine learning to extract data from years of medical records. Through this process, a detailed natural history study of 100 patients was compiled quickly, at a fraction of the usual cost—enabling better diagnosis and making it possible to evaluate treatments. The approach was then scaled to 50 rare neurological conditions.

This is the great promise of innovation outside the walls of the church: infused with passion and adopted by groups of people motivated by common goals, a technology built for another purpose is modified to work at a social scale to accomplish what was previously impossible.

Working in this new sphere will require new catechisms, based on human connection and diffusion rather than walls. And it will also require a shift in the worldview of the scientific enterprise from a fear of losing control to one that is prepared to usher new people into the sanctum, giving them tools to work toward the greater good. As Danzig wrote in Technology Roulette, “the difficulty of taking these important steps should remind us that our greatest challenges are not in constructing our relationships to technologies, it is in constructing our relationships with each other.”

Cite this Article

Margonelli, Lisa. “Innovation’s New Catechism.” Issues in Science and Technology 39, no. 4 (Summer 2023): 22–24.
