
Elon Musk leads 116 experts calling for a ban on autonomous weapons

The humanoid robot AILA (artificial intelligence lightweight android) operates a switchboard during a demonstration by the German Research Center for Artificial Intelligence at the CeBIT computer fair in Hanover, Germany, on March 5, 2013.

The campaign to stop 'killer robots' is calling on the United Nations for strict oversight of autonomous weapons. Image: REUTERS/Fabrizio Bensch

David Z. Morris
Technology Writer, Fortune

One hundred and sixteen roboticists and AI researchers, including SpaceX founder Elon Musk and Google DeepMind co-founder Mustafa Suleyman, have signed a letter to the United Nations calling for strict oversight of autonomous weapons, a.k.a. "killer robots." Though the letter itself is more circumspect, an accompanying press release says the group wants "a ban on their use internationally."

Other signatories of the letter include executives and founders from Denmark’s Universal Robots, Canada’s Element AI, and France’s Aldebaran Robotics.

Image: Campaign to Stop Killer Robots

The letter describes the risks of robotic weaponry in dire terms, and says that the need for strong action is urgent. It is aimed at a group of UN officials considering adding robotic weapons to the UN’s Convention on Certain Conventional Weapons. Dating back to 1981, the Convention and parallel treaties currently restrict chemical weapons, blinding laser weapons, mines, and other weapons deemed to cause “unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.”

Robotic warriors could arguably reduce casualties among human soldiers – at least, those of the wealthiest and most advanced nations. But the risk to civilians is the headline concern of Musk and Suleyman’s group, which warns that “these can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”

The letter also warns that failure to act swiftly will lead to an “arms race” towards killer robots – but that’s arguably already underway. Autonomous weapons systems or precursor technologies are available or under development from firms including Raytheon, Dassault, MiG, and BAE Systems.

Element AI founder Yoshua Bengio had another intriguing warning – that weaponizing AI could actually “hurt the further development of AI’s good applications.” That’s precisely the scenario foreseen in Frank Herbert’s sci-fi novel Dune, set in a universe where all thinking machines are banned because of their role in past wars.

The UN weapons group was due to meet on Monday, August 21, but that meeting has reportedly been delayed until November.
