
Attack of the Killer Robots! Can — and Should — They Be Stopped?

Source: TechNewsDaily

Autonomous weapons — machines that can self-direct to kill human targets — do not exist yet, and if one British group has its way, they never will. The Campaign to Stop Killer Robots, a coalition of human rights groups dedicated to keeping warfare between human or human-directed combatants, launched in London today (April 24).

The Campaign to Stop Killer Robots urges the global community to draft a treaty that would ban all future development or deployment of autonomous weapons. "Allowing life-or-death decisions on the battlefield to be made by machines crosses a fundamental moral line and represents unacceptable application of technology," Jody Williams, political activist and Nobel Peace Laureate, said in a statement for the group. Williams believes that weapons should remain firmly in human hands, and the rest of the organization agrees with her.

The campaign draws membership from a variety of sources, including the International Committee for Robot Arms Control, the Nobel Women's Initiative and Mines Action Canada. (The group has considerable support from anti-landmine activists, whose campaigns target landmines, themselves a crude form of "autonomous" killing device.)

Although the Campaign to Stop Killer Robots objects to autonomous weapons primarily on moral grounds, it also lays out a number of legal and technical arguments against the machines. Robots, the group claims, would have difficulty distinguishing between military and civilian targets, would erase human accountability for war crimes, could spark an arms race and would make war more attractive by reducing the number of human lives lost in the process.

Despite the media's recent interest in military drones, the group does not appear to object, in principle, to unmanned aerial vehicles, which are semi-autonomous but still require a human controller. Nor does it object, in theory, to sophisticated future robots with the capacity for thought or feeling that could hypothetically decide to participate in warfare. However, the Campaign to Stop Killer Robots asserts that current technology is nowhere near nuanced enough for the group to support the idea of robot combatants. [See also: 5 Reasons to Fear Robots]

"Killer robots are not self-willed 'Terminator'-style robots," Noel Sharkey, chairman of the International Committee for Robot Arms Control, said in a statement. "Computer-controlled devices can be hacked, jammed, spoofed or can be simply fooled and misdirected by humans."

The group brings up a number of interesting quandaries that have long been staples of science fiction. In 1967, "Star Trek" posed the question of whether robots can responsibly conduct a war involving human combatants in the episode "A Taste of Armageddon." James Cameron's "Terminator" films ask whether a robot designed for the purpose of killing can learn to understand human emotions and the need for compassion.

The morals of the issue are hardly clear-cut, either. Developing autonomous weapons may be immoral, but is it more or less so than sending human combatants to die? Can the group, in good faith, ask robotics researchers to stop pursuing a line of scientific inquiry?

The Campaign to Stop Killer Robots will likely grow as the organization seeks out global support, but don't expect any easy answers on this issue.
