Science fiction may become reality with ‘killer robots’

May 16, 2014

For more than a decade, serious concern has been raised about civilian victims of drone strikes, yet there is still little transparency or accountability, and the attacks continue. A strike in December on a wedding procession in Yemen killed 12 men and wounded at least 15 other people, including the bride.

Drones are unmanned aircraft with human operators remotely controlling their targeting and firing. But concerns are now mounting over developments that may produce fully autonomous drones and other weapons systems that will be able to identify and fire at targets without any human control.

Over the past year and a half, the Campaign to Stop Killer Robots, a UN special rapporteur, and an increasing number of governments have expressed concern at the prospect of fully autonomous weapons – those that would make targeting and kill decisions on their own.

These “killer robots” do not yet exist, but nations are working on precursors that reflect the move toward ever greater autonomy.

During a debate at the United Nations in November, the Holy See predicted that questions over the military applications of robotic technologies will “grow in relevance and urgency.”

The Vatican representative expressed concern at the present use of armed drones, but described as “most critical” the inability of “pre-programmed, automated technical systems to make moral judgments over life and death, to respect human rights, and to comply with the principle of humanity.”

At least six countries are reported to be pursuing autonomous weapons: the United States, China, Israel, Russia, South Korea, and the United Kingdom.

Autonomous aircraft developed by the US and UK have had test flights in recent months. South Korea has fielded an armed ground-based “sentry robot” along the demilitarized zone with North Korea.

As nations develop the technological capability, many may choose to go down the path toward full autonomy, because of the benefits these weapons could provide: rapid response time, reduced risk to their own soldiers, fewer costs, and insulation from the effects of human emotions involved in decisions to use force.

Yet the consequences of removing human control far outweigh these advantages. Perhaps most fundamental is the assertion that humans bring judgment and compassion to decisions about the use of lethal force, qualities that cannot be replicated by a machine.

Many doubt that fully autonomous weapons could comply with the complex and subjective rules of international humanitarian law, which require human understanding and judgment.

There’s no certainty these weapons would be able to distinguish between combatants and civilians. It would be difficult – if not impossible – to program them to carry out the complex proportionality test to assess whether the anticipated military advantage from any given attack outweighed the likely harm to civilians.

If such weapons do target and kill unlawfully, there would be an accountability gap, with legal and practical obstacles to holding anyone responsible for such an attack – commander, programmer or manufacturer alike.

Similar questions would arise if these weapons were used not on the battlefield but in law enforcement operations. And finally, it is questionable whether allowing machines to decide whom to kill could ever comport with human dignity or the dictates of public conscience.

“Shaking the Foundations,” a new report released on May 12 by Human Rights Watch and Harvard Law School’s International Human Rights Clinic, finds that the weapons threaten to violate the most fundamental rights to life and to a remedy, as well as to undermine the underlying principle of human dignity.

To provide a coordinated response to mounting concerns over the weapons and increase public awareness and support for the principle of human control over targeting and attack decisions, non-governmental organizations started the Campaign to Stop Killer Robots a year ago. The coalition is seeking a pre-emptive ban on the development, production and use of fully autonomous weapons.

The global response has been unprecedented in its swiftness. Nations first considered the matter at the UN Human Rights Council in May last year when the UN special rapporteur Christof Heyns delivered a report that called on all nations to enact an immediate moratorium.

Less than six months later, nations agreed to begin a diplomatic process in 2014 to start considering technical, legal, ethical, operational and other questions relating to this emerging technology.

The decision by the Convention on Conventional Weapons (CCW) was taken by consensus, a rare feat in today’s disarmament diplomacy, as it requires that no nation object to the proposal.

Many of the 117 states that are part of the CCW are expected to attend the four-day experts meeting, which opens at the United Nations in Geneva on May 13.

The meeting is open to all nations regardless of whether they have joined the convention.

States parties from Asia include China and South Korea, both known to be actively developing autonomous weapons, as well as the robotics leader Japan.

Other Asian nations participating include India, Pakistan, Bangladesh, Cambodia, Laos and the Philippines, plus Australia and New Zealand.

Representatives of international and UN agencies, the International Committee of the Red Cross, regional bodies, and registered nongovernmental groups are also participating.

The Campaign to Stop Killer Robots delegation of 40 experts includes the Nobel Peace Prize laureate Jody Williams, former UN disarmament chief Ambassador Jayantha Dhanapala of Sri Lanka, and Yukie Osa, the head of Japan’s largest humanitarian and disaster relief organization, the Association for Aid and Relief Japan.

Japan is one of 44 nations to have spoken on the subject, noting the challenges posed by fully autonomous weapons in such areas as human rights, law, technology and arms control. Japanese robotics experts are scheduled to provide technical presentations at the meeting.

Under an international ban, it would still be possible for Japan, China and other nations to continue developing their expertise in robotics and autonomy. The campaign seeks to ban the weaponization of that technology through an international treaty, drawing the line at systems with no meaningful human control over targeting and attack decisions. Once the technology exists, it is already too late.--ucanews.com