"Killer robots" should be banned by international law before the machines see the battlefield, a new report warns the United Nations. Allowing fully autonomous weapons to attack targets without human input could lead to an "accountability gap" where nobody is legally responsible for the deaths the robots might cause, the report said. It was released on Thursday by the Human Rights Program at Harvard Law School and the non-governmental organization Human Rights Watch.The report isn't talking about the drones used in battle today. Those have someone pulling the trigger. Instead, it's talking about machines that would "select and engage targets without meaningful human control."
The report warns of a potential future "arms race" in which nation-states and militant groups deploy robots that "could be programmed to indiscriminately kill" enemy populations. Because a machine cannot be prosecuted, it would be unclear who bears responsibility for wrongful deaths. "No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party," Bonnie Docherty, senior arms division researcher at Human Rights Watch, said in a statement. The 38-page report was released ahead of a U.N. meeting on lethal autonomous weapons systems next week in Geneva.