Autonomous weapons are defined by the group as weapons that can "search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions."
"Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is -- practically if not legally -- feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms," the letter, posted on the Future of Life Institute's website says.
If one country pushes ahead with the creation of robotic killers, the group fears, it will spur a global arms race that could spell disaster for humanity.
"Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group," the letter says. "We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people."
While the group warns of the potential carnage killer robots could inflict, it also stresses that it is not opposed to advances in artificial intelligence generally.
"We believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so," the letter says. "Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control."