Deliberating Autonomous Weapons


Banning Lethal Autonomous Weapons: An Education
For those of us who have spent many years in diplomacy at the United Nations, Stuart Russell’s account of his involvement in the autonomous weapons systems (AWS) policy discussion, “Banning Lethal Autonomous Weapons: An Education” (Issues, Spring 2022), was a poignant reminder of the difficulties of achieving results in multilateral deliberations.

As UN High Representative for Disarmament Affairs, I met with Christof Heyns shortly after the 2013 publication of his report on lethal autonomous robotics and the protection of life, and subsequently brought it to the attention of representatives of member states. Heyns was a UN Human Rights Special Rapporteur, and reports such as his are not staple reading material for arms control officials. My interventions stimulated interest. But we needed a “home” for the discussions and found it in the Geneva-based Convention on Certain Conventional Weapons, generally known as CCW for simplicity’s sake. CCW is the umbrella convention for five protocols, one of which deals with blinding laser weapons, which were outlawed before being fully developed. It was a perfect fit, everyone thought.

Open-ended working groups (in which any member state can participate) convened between 2014 and 2016. Few government delegates were familiar with the issue of AWS. Information sharing and substantive briefings were important for acquainting diplomats with the issues, and Russell participated, explained autonomy, and patiently answered many questions. Another briefer, from the British artificial intelligence company DeepMind, also took part in the proceedings, but became disillusioned when he was given only a 30-minute time slot: not worth the travel to Geneva, was his feedback.

Where are we now, nine years after the discussions started? In 2016 the open-ended working group was reconstituted, in name and format, as a “Group of Governmental Experts,” a diplomatic construct that allows reports and documents to be adopted only by consensus, essentially blocking the will of the majority by giving every participating state a veto.

The limited time in AWS meetings was spent debating key definitions, compliance with international humanitarian law, the relevance of ethical principles, the need for human command, and whether nonbinding principles and practices would suffice or whether legally binding rules were needed. These differences are nowhere near resolution, and states have coalesced around political positions. Some are trying to table a proposal that could find consensus, such as a normative and operational framework. Others propose a political declaration. Over 30 countries have called for a total global ban on AWS.

When a meeting in December 2021 could not agree on a way forward, a group of 23 states delivered a statement highlighting the urgency of an outcome, stating that “in order for the CCW to remain a viable forum to address the challenges posed by LAWS [lethal autonomous weapons systems] its deliberations must result in a substantive outcome commensurate with the urgency of the issue.” With a total of 10 days of meetings scheduled for 2022, it is doubtful that the plea for urgency will be heeded.

Russell sets out the complexities of AWS and shares his education in diplomacy. It should be mandatory reading for AI scientists, diplomats, advocacy groups, and the general public. More years spent debating this issue may not yield the desired result; public pressure and advocacy, however, may.

Vice President, International Institute for Peace, Vienna

Former Under-Secretary-General of the United Nations

Stuart Russell’s writings on the problems of aligning artificial intelligence with human values and regulating autonomous weapons systems have had a seminal influence on me and many others. I was thus glad to read about his “education” in the difficulties of effective arms control in the area.

I wish him every success in these endeavors. I also want to suggest an alternative approach to governing autonomous weapons. As Russell notes, no agreement has emerged from almost a decade of meetings under the United Nations Convention on Certain Conventional Weapons. None will in the foreseeable future. Yet these meetings and the civil society groups that draw attention to the topic may nevertheless have had some positive effect. They help create a moral sanction against using such weapons.

If effective versions of these technologies one day spread widely, however, rivals will employ them, and norms against their use will likely lose their power. So it makes sense to consider ways to prevent these technologies—particularly advanced, effective versions of them—from spreading. Just as in the realm of nuclear weapons, and increasingly across a range of technologies, it makes sense to consider the strategy of nonproliferation first, then norms governing use second.

Nonproliferation can be more successful than an outright ban because the major powers do not, as a rule, agree to give up development of militarily important technologies for which they have no substitutes. In the case of the Biological Weapons Convention, for instance, major powers were willing to sign because they had another technology viewed as more effective in the same tactical-strategic niche: namely, nuclear weapons. Thus, when the Soviet Union violated the treaty with a major biological weapons program, the security of other signatories was not significantly affected. Treaty verification mechanisms might seem to be the solution, but major military powers have been willing to accept only less invasive mechanisms, and to trust in verification only when the consequences of failing to detect treaty violations are relatively insignificant.

In the case of autonomous weapons, the military effectiveness of future generations of the technology is unknown, but it appears likely that they will eventually perform functions that no current technology can. Thus, major powers will not give them up. Even minor powers will not give them up if they worry that their rivals will not.

The result is that a mutually supporting nonproliferation regime and norm of use is the sort of arms control that might be made to work, just as in the nuclear case. There, norms against nuclear weapons culminated in the Treaty on the Prohibition of Nuclear Weapons. Yet that treaty and those norms might not exist without the nonproliferation regime; each supports the other.

Nonproliferation is not simple, and it is not the ideal. It would require significant focus on the part of major powers, including security guarantees for countries that give up a means of defense. It might not work, for a variety of reasons. But for a variety of other reasons, it might be worth trying.

Associate Professor of Political Science

University of California, Los Angeles

Strategic Modeling Team Lead

Centre for the Governance of AI

Cite this Article

“Deliberating Autonomous Weapons.” Issues in Science and Technology 38, no. 4 (Summer 2022).