Killer Robots: The Moral and Legal Concerns

Representatives of over 70 nations met at the UN to debate how to deal with autonomous weapons systems. (Image: Screenshot / Youtube)

In August 2018, representatives of over 70 nations met at the UN to debate how to deal with autonomous weapons systems. The reason: these countries felt that killer robots presented a very complex set of moral and legal challenges to the human race, challenges that needed to be addressed as quickly as possible.

Ethical concerns

The biggest ethical issue with killer robots is that they do not possess human feelings. As such, they will never feel the sense of accountability or the remorse that a human might feel when choosing to kill someone, even an enemy. And the fact that these machines can keep killing hundreds or thousands of people without ever feeling any remorse is indeed scary.

After all, if the killer feels no remorse, the killing would potentially never stop. We would essentially be unleashing a remorseless killing machine that can keep taking people’s lives all day, every day, year after year, without end. Is that something we should invent?

Another moral problem with autonomous killing machines has to do with how they will decide whom to kill and whom to spare. Since such a machine will presumably be configured to attain an objective, its AI will use whatever measures are available to fulfill that purpose.

If an autonomous killing machine is sent into a war zone, how will it decide whom to kill and whom not to? (Image: Screenshot / Youtube)

Imagine sending a killer robot into a war zone to mitigate the threat of terrorists hiding in a village. The machine might decide that it would be ideal to bomb the entire village to kill the terrorists even if it results in the death of children and pregnant women.

Supporters of killer robots might argue that the machine could be instructed to avoid killing women and children. But if that were done, terrorists could simply surround themselves with women and children to neutralize the robot, effectively rendering it useless. Under such circumstances, it is not hard to imagine the team running the operation deciding to let the killer robot do whatever it wants, as long as it produces results.

Legal issues

In addition to the moral issues, we also have to deal with the legal problems of using killer robots. And a major legal issue to tackle will be the Martens Clause.

The Martens Clause is a long-standing provision of international humanitarian law, first set out in the 1899 Hague Convention, that requires governments and militaries not to deviate “from the principles of humanity and from the dictates of public conscience” in their choice and use of weapons. Automated killing machines would clearly violate this standard.

The Martens Clause asks governments and militaries not to deviate ‘from the principles of humanity and from the dictates of public conscience’ while using weapons. Automated killing machines definitely violate this standard. (Image: Screenshot / Youtube)

“The idea of delegating life and death decisions to cold compassionless machines without empathy or understanding cannot comply with the Martens clause and it makes my blood run cold,” roboticist Noel Sharkey told The Guardian.

There is also the issue of accountability. If a killer robot were instructed to kill a single person in an area but, because of an error, ended up killing three people, who would be held accountable for the deaths of the two innocent victims?

The operator can hardly be held responsible for those deaths, since the command was to eliminate only a single target. The machine itself obviously cannot be held accountable. The sad result is that no one would be held accountable for the two extra deaths.

In fact, the incident would simply be recorded as an accident or as collateral damage. No one would be punished. And when killing innocent people carries no punishment, we will have gone deep into a moral black hole from which we might never return.
