
Killer Robots Condemned in New UN Report

Source: TechNewsDaily

Killer robots might sound like the stuff of science fiction, but they're alarmingly close to becoming a reality. A new report presented to the United Nations Human Rights Council suggests that lethal autonomous robots need to be regulated before they become the military weapons of the future.

The report — which was debated at the Human Rights Council in Geneva on May 29 — states that the United States, Israel, the United Kingdom, South Korea and Japan all possess lethal robots that are either fully or semi-autonomous.

Some of these machines — or "lethal autonomous robotics" (LARS), as they are called in the report — can allegedly choose and execute their own targets without human input.

The author of the report, South African human rights professor Christof Heyns, calls for a worldwide moratorium on the "testing, production, assembly, transfer, acquisition, deployment and use" of these killer robots until further regulations are put in place to govern their use.

According to the Associated Press, the report cites at least four examples of fully or semi-autonomous weapons that have already been developed around the world. Among them is the U.S. Phalanx system for Aegis-class cruisers, which automatically detects, tracks and engages anti-ship missiles and aircraft.

Other examples of existing LARS include Israel's Harpy, an autonomous weapon that detects, attacks and destroys radar emitters; the U.K.'s Taranis, a jet-propelled drone that can autonomously locate targets; and South Korea's Samsung Techwin surveillance system, which autonomously detects targets in the demilitarized zone between North and South Korea.

While the U.N. report focuses mainly on LARS, it also decries the recent upsurge in the use of unmanned aerial vehicles — or drones — by the U.S. military and other nations.

"[Drones] enable those who control lethal force not to be physically present when it is deployed, but rather activate it while sitting behind computers in faraway places, and stay out of the line of fire," Heyns wrote.

"Lethal autonomous robotics, if added to the arsenals of States, would add a new dimension to this distancing, in that targeting decisions could be taken by the robots themselves. In addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill — and their execution."

The use of unmanned aircraft to carry out bombing missions in the Middle East is already a hot-button issue in the U.S. And recently, killer robots have also been drawing attention from several groups that want to halt their development.

In November 2012, Human Rights Watch called for an international ban on fully autonomous robots. And just last month, the Campaign to Stop Killer Robots was launched in London by a coalition of human rights groups demanding a ban on the future development of autonomous weapons.

The argument against autonomous weapons is summed up by Heyns in the U.N.'s new report.

"Decisions over life and death in armed conflict may require compassion and intuition," Heyns wrote. "Humans — while they are fallible — at least might possess these qualities, whereas robots definitely do not."

There are, however, those who argue for the use of autonomous weapons precisely because of their lack of human emotions, a point of view that Heyns acknowledges in the report.

"[LARS] will not be susceptible to some of the human shortcomings that may undermine the protection of life," Heyns wrote. "Typically they would not act out of revenge, panic, anger, spite, prejudice or fear.

"Moreover, unless specifically programmed to do so, robots would not cause intentional suffering on civilian populations; for example, through torture. Robots also do not rape." [See also: Military Struggles to Find Limits of Robot Autonomy]
