Revisiting Arms Control Paradigms in the Age of LAWS: Towards a Framework for Human Control

Executive Summary

Lethal Autonomous Weapons Systems (LAWS) have been the subject of much debate and controversy because they transfer critical decisions from humans to machines, raising concerns about ethics, transparency, and accountability. While autonomous functions already exist in various forms across militaries worldwide, the idea of automating part of the decision cycle for lethal action has sparked debates about the potential for mis-targeting. Nevertheless, states with advanced artificial intelligence (AI) capabilities continue to develop the capacity to design and deploy such systems. Arms control discussions on this subject have been challenging, both because states are unable to agree on a concrete definition of what constitutes LAWS and because they hold differing perspectives on the nature of the arms control mechanisms that need to be established. This paper presents the different stances of governments on the governance of LAWS. It also explores how previous paradigms of arms control have operated and the specific challenges that LAWS present. Finally, it examines India's considerations and how the country can move forward in these discussions.

The key takeaways are as follows:

● There is a significant debate regarding the necessity of a legally binding instrument to govern LAWS. Some governments advocate for such an instrument, while others prefer non-binding measures. Additionally, although there is a consensus on the importance of "meaningful human control" over LAWS, the definition and implementation of this concept remain contested.

● Historical arms control efforts have taken various forms, including non-proliferation treaties, agreements regulating weapon use, outright bans (often driven by humanitarian concerns), and arms-limitation treaties. These paradigms offer potential frameworks but also highlight the unique challenges LAWS pose to the international system.

● Governing LAWS presents several unique challenges. AI, the underlying technology, is dual-use, making monitoring difficult. The lack of a universal definition for LAWS hinders the creation of specific legal instruments or widely accepted norms. Furthermore, the perceived military advantage LAWS offer makes governments hesitant to agree to strict limitations.

● The ability of some autonomous and semi-autonomous systems to loiter for extended periods heightens the risks associated with error and target misidentification, and the autonomous capabilities of such systems need to be regulated accordingly.
