

Poster

Position: AI-Powered Autonomous Weapons Risk Geopolitical Instability and Threaten AI Research

Riley Simmons-Edler · Ryan Badman · Shayne Longpre · Kanaka Rajan

Hall C 4-9 #2502
[ Paper PDF ] [ Slides ] [ Poster ]
Tue 23 Jul 4:30 a.m. PDT — 6 a.m. PDT
 
Oral presentation: Oral 2B Positions on AI Opportunities and Risks for Society
Tue 23 Jul 7:30 a.m. PDT — 8:30 a.m. PDT

Abstract:

The recent embrace of machine learning (ML) in the development of autonomous weapons systems (AWS) creates serious risks to geopolitical stability and the free exchange of ideas in AI research. This topic has of late received far less attention than risks stemming from superintelligent artificial general intelligence (AGI), but it requires fewer assumptions about the course of technological development and is thus a nearer-future issue. ML is already enabling the substitution of AWS for human soldiers in many battlefield roles, reducing the upfront human cost, and thus political cost, of waging offensive war. In the case of peer adversaries, this increases the likelihood of "low intensity" conflicts which risk escalation to broader warfare. In the case of non-peer adversaries, it reduces the domestic blowback to wars of aggression. This effect can occur regardless of other ethical issues around the use of military AI, such as the risk of civilian casualties, and does not require any superhuman AI capabilities. Further, the military value of AWS raises the specter of an AI-powered arms race and the misguided imposition of national security restrictions on AI research. Our goal in this paper is to raise awareness among the public and ML researchers of the near-future risks posed by full or near-full autonomy in military technology, and we provide regulatory suggestions to mitigate these risks. We call upon AI policy experts and the defense AI community in particular to embrace transparency and caution in their development and deployment of AWS, to avoid the negative effects on global stability and AI research that we highlight here.
