U.N. Weighing Laws for ‘Killer Robots’ of the Future

Armies of Terminator-like warriors fan out across the battlefield, destroying everything in their path, as swarms of fellow robots rain fire from the skies.

That dark vision could all too easily shift from science fiction to fact, some warn, with disastrous consequences for the human race, unless such weapons are banned before they leap from the drawing board to the arsenal.

On Tuesday, governments began the first-ever talks exclusively on so-called “lethal autonomous weapons systems” — though opponents prefer the label “killer robots.”

“I urge delegates to take bold action,” said Michael Møller, head of the U.N.’s Conference on Disarmament.

“All too often international law only responds to atrocities and suffering once it has happened. You have the opportunity to take preemptive action and ensure that the ultimate decision to end life remains firmly under human control,” he said.

That was echoed by the International Committee of the Red Cross, guardian of the Geneva Conventions on warfare.

“The central issue is the potential absence of human control over the critical functions of identifying and attacking targets, including human targets,” said Kathleen Lawand, head of the ICRC’s arms unit.

“There is a sense of deep discomfort with the idea of allowing machines to make life-and-death decisions on the battlefield with little or no human involvement,” she added.

The four-day meeting in Geneva aims to chart the path toward more in-depth talks in November.

“Killer robots would threaten the most fundamental of rights and principles in international law,” Steve Goose, arms division director at Human Rights Watch, told reporters.

“The only answer is a preemptive ban,” he added.

U.N.-brokered talks have yielded such bans before: Blinding laser weapons were forbidden by international law in 1998, before they were ever deployed on the battlefield.

Automated weapons are already deployed around the globe.

The best known are drones, unmanned aircraft whose human controllers pull the trigger from a far-distant base. Controversy rages, especially over the civilian collateral damage caused when the United States strikes alleged Islamist militants.

Perhaps closest to the Terminator-type killing machine portrayed in Arnold Schwarzenegger’s action films is a Samsung sentry robot used in South Korea, with the ability to spot unusual activity, talk to intruders, and, when authorized by a human controller, shoot them.

Other countries at the cutting edge include Britain, Israel, China, Russia, and Taiwan.

But it’s the next step, the power to kill without a human handler, that rattles opponents the most.

Experts predict that military research could produce such killers within 20 years.

“Lethal autonomous weapons systems are rightly described as the next revolution in military technology, on par with the introduction of gunpowder and nuclear weapons,” Pakistan’s U.N. ambassador Zamir Akram told the meeting.

“In the absence of any human intervention, such weapons in fact fundamentally change the nature of war,” he said, warning that they could undermine global peace and security.

The goal, diplomats said, is not to ban the technology outright.

“We need to keep in mind that these are dual technologies and could have numerous civilian, peaceful, and legitimate uses. This must not be about restricting research in this field,” said French ambassador Jean-Hugues Simon-Michel, chairman of the talks.

Robotics research is also being deployed for firefighting and bomb disposal, for example.

Campaigner Noel Sharkey, emeritus professor of robotics and artificial intelligence at Britain’s University of Sheffield, underlined that autonomy is not the problem in itself.

“I have a robot vacuum cleaner at home; it’s fully autonomous and I do not want it stopped. There is just one thing that we don’t want, and that’s what we call the kill function,” he said.

Supporters of robot weapons say they offer life-saving potential in warfare, being able to get closer than troops to assess a threat properly, without letting emotion cloud their decision making.

But that is precisely what worries their critics.

“If we don’t inject a moral and ethical discussion into this, we won’t control warfare,” said Jody Williams, who received the 1997 Nobel Peace Prize for her campaign for a landmine ban treaty.