Elon Musk and 115 other experts ask the UN to ban killer robots in open letter

Elon Musk, Google DeepMind co-founder Mustafa Suleyman, and 114 other leading AI and robotics experts have joined together to ask the UN to ban the use of so-called killer robots in an open letter published today.

The group, concerned about the potential use of lethal autonomous weapons and how they might be deployed in the future, penned a short note released by the Future of Life Institute. The text was made public to mark the opening of the International Joint Conference on Artificial Intelligence (IJCAI 2017) in Melbourne, Australia, according to a press release.

"Lethal autonomous weapons" refers to the drones, autonomous machine guns, tanks, and other forms of weaponry controlled by AI on next-generation battlefields. 

Musk, for one, is famously wary of AI's potential to go bad, recently calling it "the greatest threat we face as a civilization," above even nuclear weapons — but the open letter is the first time a group of AI and robotics companies has joined forces to petition the UN specifically about autonomous weapons, according to the release.


The UN’s Review Conference of the Convention on Conventional Weapons has unanimously agreed to start formal discussions on the prohibition of autonomous weapons, and 19 of its member countries have already voiced support for banning killer robots outright. The group was slated to meet on Aug. 21, but the meeting has been delayed until November, according to Fortune.

The open letter, signed by representatives of companies collectively worth billions of dollars across 26 countries, could put even more pressure on the UN to make a prohibition happen.

One of the autonomous lethal weapons already out in the world.

Image: Future of Life Institute

The actual text of the letter is short and stark, and it's worth reading in full on the Future of Life Institute's site.

Co-signer Yoshua Bengio, a deep learning expert who co-founded Element AI, is concerned about more than just the immediate damage lethal autonomous weapons might cause. He cited the potential for a focus on warfare, and the inevitable backlash against the technology, to "hurt the further development of AI’s good applications" as a major reason for his participation in the effort.

The Future of Life Institute published a similar letter in 2015, signed by Musk, Stephen Hawking, and others, warning against the broader dangers of AI, not just systems built for warfare.

The danger posed by non-military AI is much less pressing, which makes some of Musk's statements feel overblown and ridiculous, and his self-important spat with Mark Zuckerberg more of a media spectacle than a debate with real stakes. But the potential for autonomous weapons to do damage, as the open letter states, is here now. Hopefully, the UN listens to the experts.

