APR 13, 2015 09:15 AM PDT

UN urged to ban 'killer robots' before they can be developed

A protest takes place outside the London offices of the defence contractor General Atomics against drones and killer robots.

Fully autonomous weapons should be banned by international treaty, says a report by Human Rights Watch and Harvard Law School

Fully autonomous weapons, already denounced as "killer robots", should be banned by international treaty before they can be developed, a new report urges the United Nations.

Under existing laws, computer programmers, manufacturers and military commanders would all escape liability for deaths caused by such machines, according to the study published on Thursday by Human Rights Watch and Harvard Law School.

Nor is there likely to be any clear legal framework in future that would establish the responsibility of those involved in producing or operating advanced weapons systems, say the authors of Mind the Gap: The Lack of Accountability for Killer Robots.

The report is released ahead of an international meeting on lethal autonomous weapons systems at the UN in Geneva starting on 13 April. The session will discuss additions to the Convention on Certain Conventional Weapons.

Also known as the inhumane weapons convention, the treaty has been regularly reinforced by new protocols on emerging military technology. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2006 combatant nations have been required to remove unexploded cluster bombs.

Military deployment of the current generation of drones is defended by the UK's Ministry of Defence and other governments on the grounds that there is always a man or woman "in the loop", ultimately deciding whether or not to trigger a missile.

Rapid technical progress towards the next stage of automation, in which weapons may select their own targets, has alarmed scientists and human rights campaigners.

"Fully autonomous weapons do not yet exist," the report acknowledges. "But technology is moving in their direction, and precursors are already in use or development. For example, many countries use weapons defence systems - such as the Israeli Iron Dome and the US Phalanx and C-RAM - that are programmed to respond automatically to threats from incoming munitions.

"Prototypes exist for planes that could autonomously fly on intercontinental missions [the UK's Taranis] or take off and land on an aircraft carrier [the US's X-47B].

"The lack of meaningful human control places fully autonomous weapons in an ambiguous and troubling position. On the one hand, while traditional weapons are tools in the hands of human beings, fully autonomous weapons, once deployed, would make their own determinations about the use of lethal force.

"They would thus challenge longstanding notions of the role of arms in armed conflict, and for some legal analyses, they would be more akin to a human soldier than to an inanimate weapon. On the other hand, fully autonomous weapons would fall far short of being human."

The report calls for a prohibition on "the development, production and use of fully autonomous weapons" through an international, legally binding agreement, and urges states to adopt similar domestic laws.

The hurdles to accountability for the production and use of fully autonomous weapons under current law are monumental, the report states. "Weapons could not be held accountable for their conduct because they could not act with criminal intent, would fall outside the jurisdiction of international tribunals and could not be punished.

"Criminal liability would likely apply only in situations where humans specifically intended to use the robots to violate the law. In the United States at least, civil liability would be virtually impossible due to the immunity granted by law to the military and its contractors and the evidentiary obstacles to products liability suits."

Bonnie Docherty, HRW's senior arms division researcher and the report's lead author, said: "No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party. The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons."

• Human Rights Watch is a co-founder of the Campaign to Stop Killer Robots, which is backed by more than 50 NGOs and calls for a preemptive ban on the development, production and use of fully autonomous weapons.

(Source: theguardian.com)