'Killer Robots': Dangerous Machines or a Smart Move?

Human Rights Watch opposes the use of autonomous weapons

Nov. 22, 2012 -- A new report by Human Rights Watch warns against the possible use of "killer robots," autonomous weapons that would work without human involvement. Such weapons are not currently in use, but the report suggests that they could be developed within the next 20 to 30 years.

According to the report, entitled "Losing Humanity: The Case Against Killer Robots," military officials say that humans will retain some control over decisions to use such weapons lethally, but "their statements often leave open the possibility that robots could one day have the ability to make such choices on their own."

Human Rights Watch is most concerned with the protection of civilians during wars. The organization worries that the technology used to create autonomous weapons might not comply with international humanitarian law and that it would "also undermine essential non-legal safeguards for civilians."

Among the larger issues cited are a robot's inability to feel emotion and problems with accountability should something go wrong. While such a future may not seem imminent, machines like drones suggest it's not out of the question. Below, we take a look at what makes a robot a "killer robot," how such weapons would work, and the arguments for and against them.

What Is a So-Called 'Killer Robot'?

The report divides robotic weapons into three categories (a conceptual sketch follows the list):

1. Human-in-the-loop weapons: These weapons require human command to select targets and deliver force.

2. Human-on-the-loop weapons: These weapons select targets and deliver force with oversight from a human who can override their actions.

3. Human-out-of-the-loop weapons: These weapons select targets and deliver force without any human help or interaction.
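
To make the distinction concrete, here is a minimal sketch of where the human sits in each of the three modes. All names and logic are our own illustrative assumptions, not code from the report or from any real weapon system.

```python
# Conceptual sketch only: all names and logic here are illustrative assumptions,
# not drawn from the report or from any actual weapon system.
from enum import Enum

class ControlMode(Enum):
    HUMAN_IN_THE_LOOP = 1      # a human must command each use of force
    HUMAN_ON_THE_LOOP = 2      # the system acts, but a human can override
    HUMAN_OUT_OF_THE_LOOP = 3  # the system acts with no human involvement

def authorize_engagement(mode, human_command=False, human_override=False):
    """Show who authorizes force in each mode of the report's taxonomy."""
    if mode is ControlMode.HUMAN_IN_THE_LOOP:
        # Nothing happens until a human operator gives the command.
        return human_command
    if mode is ControlMode.HUMAN_ON_THE_LOOP:
        # The machine proceeds by default; a supervising human can veto.
        return not human_override
    # HUMAN_OUT_OF_THE_LOOP: the machine decides entirely on its own.
    return True
```

The point of the taxonomy is visible in the last branch: only the third category removes the human check entirely.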

How Would They Work?

An autonomous weapon would, according to the report, "sense an incoming munition, such as a missile or rocket, and…respond automatically to neutralize the threat. Human involvement, when it exists at all, is limited to accepting or overriding the computer's plan of action in a matter of seconds."
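
The "matter of seconds" the report describes can be pictured as a veto window: the system commits to its own plan unless a human intervenes before a timer runs out. The sketch below is an illustration under assumed names and timings, not a description of any fielded system.

```python
# Illustrative sketch of a human-on-the-loop veto window; the timing and all
# names are assumptions for this example, not taken from any real system.
import queue

OVERRIDE_WINDOW_SECONDS = 3.0  # assumed; the report says only "a matter of seconds"

def respond_to_threat(threat, veto_queue):
    """Plan an interception, then fire unless a human vetoes within the window."""
    print(f"Incoming munition detected: {threat}; interception planned.")
    try:
        # Block briefly, waiting for a human operator to post a veto.
        veto_queue.get(timeout=OVERRIDE_WINDOW_SECONDS)
        return "engagement aborted by operator"
    except queue.Empty:
        # No veto arrived in time; the computer's plan proceeds.
        return "interception fired automatically"

# With an empty queue, the window simply expires and the system fires.
print(respond_to_threat("rocket", queue.Queue()))
```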

All lethal weapons currently in use, including drones, still involve some human interaction or decision-making. "Often piloted from halfway around the globe, these robotic aerial vehicles provide surveillance and identify targets before a human decides to pull the trigger, commanding the drone to deliver lethal force," the report says.

Another example is the Counter Rocket, Artillery, and Mortar System (C-RAM), which the U.S. deployed in Iraq in 2005. According to the report, the system helped intercept rockets and other munitions and provided warnings to troops. The C-RAM detects a threat, and then "a human operator certif[ies] the target," the report reads, so a human is still part of the decision (the sequence is sketched below).
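
The step order below follows the report's description of C-RAM (detect, warn, human certifies, intercept); the function names and data shapes are hypothetical, invented for illustration only.

```python
# Hypothetical illustration of the sequence the report attributes to C-RAM;
# only the step order comes from the report, everything else is assumed.
def cram_style_sequence(contact, operator_certifies):
    """Detect a threat, warn troops, and fire only if a human certifies it."""
    if not contact.get("is_incoming_munition"):
        return "no action"
    # The system warns nearby troops automatically.
    print(f"WARNING to troops: incoming {contact.get('type', 'munition')}")
    # A human operator must certify the target before interception.
    if operator_certifies(contact):
        return "intercept fired"
    return "engagement withheld by operator"

# Example: an operator callback that certifies this particular contact.
print(cram_style_sequence({"is_incoming_munition": True, "type": "rocket"},
                          lambda contact: True))
```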

Other countries have defense systems with similar capabilities. Israel has deployed its Iron Dome system, which uses radar to identify threats and is armed with interceptor missiles that can be launched with operator approval. South Korea has also developed and fielded sentry robots with autonomous surveillance capabilities, but they rely on a human command to fire.

According to a spokesman for the Department of Defense, weapons are not currently capable of making "suggestions" about targets.

"Weapons used in current unmanned systems, like the Predator, are entirely controlled by a human operator, albeit remotely," wrote U.S. Army Lt. Col. James Gregory in an email. "It would be incorrect to say that those types of weapon systems make 'suggestions' about targets. They do not have this capability."

Why Human Rights Watch Opposes Fully Autonomous Weapons

"Human Rights Watch and Harvard Law School's International Human Rights Clinic believe that such revolutionary weapons would not be consistent with international humanitarian law and would increase the risk of death or injury to civilians during armed conflict. A preemptive prohibition of their development and use is needed."

The report notes that roboticists have suggested developing robots that could use algorithms to "analyze combat situations," as well as artificial intelligence that would attempt to mimic human thought in applying the laws of war, but it points out that "these rules can be complex and entail subjective decision making, and their observance requires human judgment."

Human Rights Watch argues that robots would not have the restraint provided by human emotion or the capacity for compassion, and goes so far as to suggest that they could "serve as tools of repressive dictators seeking to crack down on their own people without fear their troops would turn on them."

Why the U.S. Military Thinks There are Benefits

"Militaries value these weapons because they require less manpower, reduce the risks to their own soldiers, and can expedite response time," according to the report.

The lack of emotion that Human Rights Watch named as a concern could also work in the military's favor. Proponents argue that automated weapons could not kill out of fear or rage and would therefore be less likely to kill irrationally.

The report notes that the U.S. Department of Defense wrote in "Unmanned Systems Integrated Roadmap FY 2011-2036" that it "envisions unmanned systems seamlessly operating with manned systems while gradually reducing the degree of human control and decision making required for the unmanned portion of the force structure."

U.S. Army Lt. Col. James Gregory said during an interview with ABC/Univision News that the Department of Defense "is not currently reviewing any autonomous weapon systems."

He added in an email that "operators controlling [weapons such as the Predator drone] undergo extensive protocols whenever any lethal force is employed. Regardless of their physical location, weapon system operators must comply with the same standards for the use of force -- the law of war and applicable rules of engagement."

As the report acknowledges, both the U.S. Department of Defense and the U.K. Ministry of Defence have said that they don't plan, for the foreseeable future, to remove human control from the use of unmanned weapons.