Governments should move the discussions of a treaty on autonomous weapons systems, also known as “killer robots,” away from meetings of an existing weapons treaty, where they are stalled.
Such weapons systems operate without meaningful human control, delegating life-and-death decisions to machines. Several countries are investing in the technology to develop them.
Governments should initiate negotiations of a new treaty, either outside the UN, like the landmine or cluster munitions treaties, or in the UN General Assembly, like the nuclear weapons ban treaty.
(Geneva, November 10, 2022) – Governments should move the stalled discussions of a treaty on autonomous weapons systems, known as “killer robots,” to a new international forum, Human Rights Watch said in a report released today. Such weapons systems operate without meaningful human control, delegating life-and-death decisions to machines.
The 40-page report, “An Agenda for Action: Alternative Processes for Negotiating a Killer Robots Treaty,” is copublished by Human Rights Watch and the Harvard Law School International Human Rights Clinic. It proposes that countries initiate a treaty-making process based on past humanitarian disarmament models, such as for the treaty banning cluster munitions.
“A new international treaty that addresses autonomous weapons systems needs a more appropriate forum for negotiations,” said Bonnie Docherty, senior arms researcher at Human Rights Watch, associate director of armed conflict and civilian protection at the Harvard Human Rights Clinic, and lead author of the report. “There’s ample precedent to show that an alternative process to create legal rules on killer robots is viable and desirable, and countries need to act now to keep pace with technological developments.”
More than 70 countries as well as nongovernmental organizations and the International Committee of the Red Cross regard a new treaty with prohibitions and restrictions as necessary, urgent, and achievable. United Nations Secretary-General António Guterres called for “internationally agreed limits” on weapons systems that could, by themselves, target and attack human beings, describing such weapons as “morally repugnant and politically unacceptable.”
Talks on concerns about lethal autonomous weapons systems have been underway under the auspices of the Convention on Conventional Weapons (CCW) since 2014. Countries will reconvene at the UN in Geneva on November 16-18, 2022, for the treaty’s annual meeting, but there is no indication they will agree to negotiate a new legally binding instrument via the CCW in 2023 or in the near future.
The main reason for the lack of progress under the CCW is that its member countries rely on a consensus approach to decision-making, which means a single country can reject a proposal even if every other country agrees to it. A handful of major military powers, most notably India and Russia over the past year, have repeatedly blocked proposals to move to negotiations. Both countries also attempted to block nongovernmental organizations from participating in discussions in 2022.
India and Russia, as well as Australia, China, Iran, Israel, South Korea, Turkey, the United Kingdom, and the United States, are investing heavily in military applications of artificial intelligence and related technologies to develop air-, land-, and sea-based autonomous weapons systems.
Given the shortcomings of the CCW forum, alternative processes for negotiating a new treaty should be explored, Human Rights Watch and the Harvard Human Rights Clinic said. One option is an independent process outside of the UN, as was used for the treaties banning antipersonnel landmines and cluster munitions. Another is via the UN General Assembly, which initiated negotiations of the nuclear weapons ban treaty.
Four characteristics of these alternative processes are particularly conducive to achieving strong treaties in a timely fashion, Human Rights Watch and the Harvard Human Rights Clinic said: a common purpose, voting-based decision-making, clear and ambitious deadlines, and a commitment to inclusivity.
Countries have already expressed broad support for essential elements needed to address concerns over removing human control from the use of force. A new international treaty should prohibit autonomous weapons systems that inherently lack meaningful human control as well as systems that target people. It should contain positive obligations to ensure meaningful human control in other weapons systems with autonomy. “Meaningful human control” is widely understood to require that technology is understandable, predictable, and constrained in space and time.
In October, 70 countries expressed their support for “internationally agreed rules and limits” on autonomous weapons systems in a joint statement to the UN General Assembly’s First Committee on Disarmament and International Security.
There have also been more expressions of support for regulation from industry. In October, Boston Dynamics and five other robotics companies pledged not to weaponize their advanced mobile robots and called on others to “make similar pledges not to build, authorize, support, or enable the attachment of weaponry to such robots.”
Human Rights Watch is a cofounder of Stop Killer Robots, the coalition of more than 190 nongovernmental organizations in 67 countries that advocates for new international law on autonomy in weapons systems.
“The longer the killer robots issue stays stuck in the current forum, the more time developers of autonomous weapons systems have to hone new technologies and achieve commercial viability,” Docherty said. “A new treaty would help stem arms races and avoid proliferation by stigmatizing the removal of human control.”