APR 13, 2015 09:15 AM PDT

UN urged to ban 'killer robots' before they can be developed

A protest takes place outside the London offices of the defence contractor General Atomics against drones and killer robots.

Fully autonomous weapons should be banned by international treaty, says a report by Human Rights Watch and Harvard Law School.

Fully autonomous weapons, already denounced as "killer robots", should be banned by international treaty before they can be developed, a new report urges the United Nations.

Under existing laws, computer programmers, manufacturers and military commanders would all escape liability for deaths caused by such machines, according to the study published on Thursday by Human Rights Watch and Harvard Law School.

Nor is there likely to be any clear legal framework in future that would establish the responsibility of those involved in producing or operating advanced weapons systems, say the authors of Mind the Gap: The Lack of Accountability for Killer Robots.

The report is released ahead of an international meeting on lethal autonomous weapons systems at the UN in Geneva starting on 13 April. The session will discuss additions to the convention on certain conventional weapons.

Also known as the inhumane weapons convention, the treaty has been regularly reinforced by new protocols on emerging military technology. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2006 combatant nations have been required to remove unexploded cluster bombs.

Military deployment of the current generation of drones is defended by the UK's Ministry of Defence and other governments on the grounds that there is always a man or woman "in the loop", ultimately deciding whether or not to trigger a missile.

Rapid technical progress towards the next stage of automation, in which weapons may select their own targets, has alarmed scientists and human rights campaigners.

"Fully autonomous weapons do not yet exist," the report acknowledges. "But technology is moving in their direction, and precursors are already in use or development. For example, many countries use weapons defence systems - such as the Israeli Iron Dome and the US Phalanx and C-RAM - that are programmed to respond automatically to threats from incoming munitions.

"Prototypes exist for planes that could autonomously fly on intercontinental missions [the UK's Taranis] or take off and land on an aircraft carrier [the US's X-47B].

"The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force.

"They would thus challenge longstanding notions of the role of arms in armed conflict, and for some legal analyses, they would be more akin to a human soldier than to an inanimate weapon. On the other hand, fully autonomous weapons would fall far short of being human."

The report calls for a prohibition "on the development, production and use of fully autonomous weapons through an international legally binding" agreement, and urges states to adopt similar domestic laws.

The hurdles to accountability for the production and use of fully autonomous weapons under current law are monumental, the report states. "Weapons could not be held accountable for their conduct because they could not act with criminal intent, would fall outside the jurisdiction of international tribunals and could not be punished.

"Criminal liability would likely apply only in situations where humans specifically intended to use the robots to violate the law. In the United States at least, civil liability would be virtually impossible due to the immunity granted by law to the military and its contractors and the evidentiary obstacles to products liability suits."

Bonnie Docherty, HRW's senior arms division researcher and the report's lead author, said: "No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party. The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons."

• Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, which is backed by more than 50 NGOs and calls for a preemptive ban on the development, production and use of fully autonomous weapons.

(Source: theguardian.com)