By James Vincent on July 27, 2015
Leading artificial intelligence researchers have warned that an "AI arms race" could be disastrous for humanity, and are urging the UN to consider a ban on "offensive autonomous weapons." An open letter published by the Future of Life Institute (FLI) and signed by high-profile figures including Stephen Hawking, Elon Musk, and Noam Chomsky, warns that weapons that automatically "select and engage targets without human intervention" could become the "Kalashnikovs of tomorrow," fueling war, terrorism, and global instability.
"Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades," states the letter, citing armed quadcopters (a technology that has already been deployed in a very crude fashion) as an example.
The letter notes that although it's possible that the use of autonomous weapons could reduce human casualties on the battlefield, this itself could be a mistake as it would "[lower] the threshold" for going to war.
"Unlike nuclear weapons, [autonomous weapons] require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce," states the letter. "It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group."
A backlash against AI weapons could hamper research
The letter ends by warning that the development of autonomous weapons could tarnish the field of artificial intelligence and create a "major public backlash" that would impede potentially beneficial AI research. The authors conclude that this "should be prevented by a ban on offensive autonomous weapons beyond meaningful human control" and are urging the UN to take action.
The FLI is not the only organization campaigning on this issue. However, semi-autonomous weapon systems are already proliferating, with the US Air Force predicting that "by 2030 machine capabilities will have increased to the point that humans will have become the weakest component in a wide array of systems." Critics and proponents alike have also noted that it can be difficult to draw the line between what is and what isn't an autonomous weapons system.
Other signatories to FLI's open letter include Apple co-founder Steve Wozniak, Skype co-founder Jaan Tallinn, and Demis Hassabis, the CEO of DeepMind, a British artificial intelligence company acquired by Google last year. The FLI has previously published open letters on similar topics, including one in January this year calling on researchers to focus on the "societal benefits" of AI.