APR 13, 2015 9:15 AM PDT

UN urged to ban 'killer robots' before they can be developed

WRITTEN BY: Robert Woodard
A protest against drones and killer robots takes place outside the London offices of the defence contractor General Atomics.

Fully autonomous weapons should be banned by international treaty, says a report by Human Rights Watch and Harvard Law School.

Fully autonomous weapons, already denounced as "killer robots", should be banned by international treaty before they can be developed, a new report urges the United Nations.

Under existing laws, computer programmers, manufacturers and military commanders would all escape liability for deaths caused by such machines, according to the study published on Thursday by Human Rights Watch and Harvard Law School.

Nor is there likely to be any clear legal framework in future that would establish the responsibility of those involved in producing or operating advanced weapons systems, say the authors of Mind the Gap: The Lack of Accountability for Killer Robots.

The report is released ahead of an international meeting on lethal autonomous weapons systems at the UN in Geneva starting on 13 April. The session will discuss additions to the convention on certain conventional weapons.

Also known as the inhumane weapons convention, the treaty has been regularly reinforced by new protocols on emerging military technology. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2006 combatant nations have been required to remove unexploded cluster bombs.

Military deployment of the current generation of drones is defended by the UK's Ministry of Defence and other governments on the grounds that there is always a man or woman "in the loop", ultimately deciding whether or not to trigger a missile.

Rapid technical progress towards the next stage of automation, in which weapons may select their own targets, has alarmed scientists and human rights campaigners.

"Fully autonomous weapons do not yet exist," the report acknowledges. "But technology is moving in their direction, and precursors are already in use or development. For example, many countries use weapons defence systems - such as the Israeli Iron Dome and the US Phalanx and C-RAM - that are programmed to respond automatically to threats from incoming munitions.

"Prototypes exist for planes that could autonomously fly on intercontinental missions [the UK's Taranis] or take off and land on an aircraft carrier [the US's X-47B].

"The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force.

"They would thus challenge longstanding notions of the role of arms in armed conflict, and for some legal analyses, they would be more akin to a human soldier than to an inanimate weapon. On the other hand, fully autonomous weapons would fall far short of being human."

The report calls for a prohibition on "the development, production and use of fully autonomous weapons" through an international, legally binding agreement, and urges states to adopt similar domestic laws.

The hurdles to accountability for the production and use of fully autonomous weapons under current law are monumental, the report states. "Weapons could not be held accountable for their conduct because they could not act with criminal intent, would fall outside the jurisdiction of international tribunals and could not be punished.

"Criminal liability would likely apply only in situations where humans specifically intended to use the robots to violate the law. In the United States at least, civil liability would be virtually impossible due to the immunity granted by law to the military and its contractors and the evidentiary obstacles to products liability suits."

Bonnie Docherty, HRW's senior arms division researcher and the report's lead author, said: "No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party. The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons."

• Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, which is backed by more than 50 NGOs and calls for a pre-emptive ban on the development, production and use of fully autonomous weapons.

(Source: theguardian.com)