Autonomous Weapons: Treaties

Question for Ministry of Defence

UIN HL2033, tabled on 21 July 2022

To ask Her Majesty's Government, further to their policy paper Ambitious, Safe, Responsible: Our approach to the delivery of AI enabled capability in Defence, published on 15 June, which states that weapons that identify, select and attack targets without context-appropriate human involvement "are not acceptable", whether they will be supporting the negotiation of a legally binding international instrument that both (1) prohibits autonomous weapons that identify, select and attack targets without context-appropriate human involvement, and (2) regulates other autonomous weapons systems to ensure meaningful human control over the use of force.

Answered on

4 August 2022

The UK does not support calls for further legally binding rules that prohibit autonomous weapons which identify, select and attack targets without context-appropriate human involvement, or that regulate other autonomous systems. International Humanitarian Law already provides a robust, principle-based framework for the regulation of the development and use of all weapons systems, including weapons that contain autonomous functions.

Without international consensus on the definitions or characteristics of weapons with levels of autonomy, a legal instrument would have to ban undefined systems. This would present difficulties in applying any such ban and could severely impact legitimate research and development of AI and autonomous technologies.