Urgent Call To Ban Killer Robots

Nov 14, 2019 | News, Vincentian Family at the U.N.

With input from Caitlin Neier, Intern for the Daughters of Charity at the UN

On Monday, in the United States, we honored veterans.  Commonwealth nations observed Remembrance Day, held in memory of those who have served in armed forces and died in the line of duty.   It is more than fitting that we remember those who have provided military service or made the ultimate sacrifice so that the rest of us can enjoy freedom.

But we can’t reflect on the impact of war and conflict without considering what future warfare might be like.

At the United Nations, civil society recently focused on one specific means of warfare: lethal autonomous weapons systems (LAWS), also known as fully autonomous weapons or, more commonly, “killer robots.” Such weapons could potentially kill people without a direct instruction from a human being.


There are many questions about whether autonomous weapons can comply with International Humanitarian Law. Under the principle of distinction, one of the four basic principles of the Law of Armed Conflict, parties to a conflict must at all times distinguish between the civilian population and combatants, and between civilian objects and military objectives, and direct their operations only against military objectives. Under the principle of proportionality, the means and methods of warfare used must not be disproportionate to the military advantage sought. It is unlikely that lethal autonomous weapons could comply with these rules. Questions also arise about who would take responsibility for errors.

On October 21 at the United Nations, a group of advocates called for increased pressure on governments to begin negotiations on a new international treaty to prohibit killer robots.

Photo credit: Caitlin Neier

Mary Wareham, of the Arms Division of Human Rights Watch, coordinates the Campaign to Stop Killer Robots, a coalition of 130 nongovernmental organizations in 60 countries. The group is encouraging governments to launch negotiations on a new international treaty to prohibit weapons systems that would select and engage targets without meaningful human control.

“We believe they cross the threshold of acceptability and therefore must be prohibited,” Wareham said.

Liz O’Sullivan, of the International Committee for Robot Arms Control, noted a number of concerns about killer robots. First, she said, the machines cannot infer intent; they can only respond to their programming. For example, if someone had their hands up in the air, a human would interpret it as an act of surrender. But a lethal autonomous weapon could interpret the action as a false surrender and either strike the person or pass them by.

Potential hacking, and the harm it could cause, is another concern. Further, unprincipled military regimes could use such systems against their own people. The weapons also rely heavily on algorithms, which can be prone to bias or failure, O’Sullivan said.

It was comforting to hear O’Sullivan say that in tech communities, some workers are trying to prevent their efforts from being put towards nefarious ends.


Also present at the meeting was Nobel Peace Laureate and co-founder of the Campaign to Stop Killer Robots, Jody Williams.

“Allowing a machine in theory through algorithms to decide what they will target and what they will attack is one of the huge reasons why we consider it crossing the Rubicon and grossly unethical and immoral,” she said.   A machine should be in the service of human beings rather than the other way around, Williams indicated.

“The more people know about this weaponry, the more they do not want to see it unleashed upon this planet,” said Williams.

Photo credit: Caitlin Neier

An October episode of Madam Secretary focused on the many concerns about killer robots. In the episode, the (fictional) US President Elizabeth McCord chooses a Navy SEAL operation rather than deploying lethal autonomous weapons.

In March, UN Secretary-General António Guterres commented that “machines with the power and discretion to take lives without human involvement are politically unacceptable, morally repugnant and should be prohibited by international law.”

In September, during the UN General Assembly, a high-level group at the UN (the Alliance for Multilateralism) recognized the potential dangers posed by lethal autonomous weapons. In an effort led by France and Germany, multiple foreign ministers endorsed a declaration expressing the goal of promoting a rules-based international order and committing to address killer robots. But not all countries are on board. During an August meeting of the Convention on Certain Conventional Weapons (CCW), some major military powers opposed negotiating a new treaty on killer robots, arguing that it was premature.

From November 13-15, the CCW meeting of high contracting parties in Geneva will decide on future work to address killer robots in 2020. Hopefully, it will result in plans to move forward on a treaty to protect the globe from killer robots.

 
