Elon Musk leads 116 experts calling for a ban on autonomous weapons

The campaign to stop 'killer robots' is calling on the United Nations for strict oversight of autonomous weapons.

One hundred and sixteen roboticists and AI researchers, including SpaceX founder Elon Musk and Google DeepMind co-founder Mustafa Suleyman, have signed a letter to the United Nations calling for strict oversight of autonomous weapons, a.k.a. "killer robots." Though the letter itself is more circumspect, an accompanying press release says the group wants "a ban on their use internationally."

Other signatories of the letter include executives and founders from Denmark’s Universal Robots, Canada’s Element AI, and France’s Aldebaran Robotics.

The letter describes the risks of robotic weaponry in dire terms, and says that the need for strong action is urgent. It is aimed at a group of UN officials considering adding robotic weapons to the UN’s Convention on Certain Conventional Weapons. Dating back to 1981, the Convention and parallel treaties currently restrict chemical weapons, blinding laser weapons, mines, and other weapons deemed to cause “unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.”

Robotic warriors could arguably reduce casualties among human soldiers – at least those of the wealthiest and most advanced nations. But the risk to civilians is the headline concern of Musk and Suleyman’s group, who write that “these can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”

The letter also warns that failure to act swiftly will lead to an “arms race” towards killer robots – but that’s arguably already underway. Autonomous weapons systems or precursor technologies are available or under development from firms including Raytheon, Dassault, MiG, and BAE Systems.

Element AI founder Yoshua Bengio had another intriguing warning – that weaponizing AI could actually “hurt the further development of AI’s good applications.” That’s precisely the scenario foreseen in Frank Herbert’s sci-fi novel Dune, set in a universe where all thinking machines are banned because of their role in past wars.

The UN weapons group was due to meet on Monday, August 21, but that meeting has reportedly been delayed until November.
