Killer Robots Fail Key Moral, Legal Test – Principles and Public Conscience Call for a Preemptive Ban
The growing opposition from many quarters to fully autonomous weapons shows how the public conscience supports a prohibition on weapons that lack meaningful human control. © 2018 Contact Christian / Human Rights Watch
GENEVA – The principles of humanity and the dictates of public conscience support a ban on fully autonomous weapons, Human Rights Watch said in a report released today. States attending the upcoming international meeting on "killer robots" should agree to negotiate a prohibition on the development, production, and use of these weapons systems.
The 46-page report, "Heed the Call: A Moral and Legal Imperative to Ban Killer Robots," argues that fully autonomous weapons would violate what is known as the "Martens Clause" of international humanitarian law, which requires that emerging technologies be judged by the "principles of humanity" and the "dictates of public conscience" when they are not already covered by other treaty provisions.
"Permitting the development and use of killer robots would undermine established moral and legal standards; states should work together to ban these weapons systems before they proliferate around the world," said Bonnie Docherty, senior arms researcher at Human Rights Watch, which coordinates the Campaign to Stop Killer Robots.
The 1995 preemptive ban on blinding laser weapons, driven in large part by the Martens Clause, provides a precedent for prohibiting fully autonomous weapons as they edge closer to reality.
The report was co-published with Harvard Law School's International Human Rights Clinic, where Docherty is associate director of armed conflict and civilian protection.
More than 70 governments will meet at the United Nations in Geneva from August 27 to 31, 2018, for the sixth time since 2014 to address the challenges posed by fully autonomous weapons, also known as "lethal autonomous weapons systems." The talks were formalized in 2017 under the Convention on Conventional Weapons (CCW), a major disarmament treaty, but they have yet to be directed toward a specific goal.
Human Rights Watch and the Campaign to Stop Killer Robots urge states parties to the convention to agree to begin negotiations in 2019 on a new treaty requiring meaningful human control over weapons systems and the use of force. Fully autonomous weapons would select targets and attack them without meaningful human control.
To date, 26 countries have explicitly supported a ban on fully autonomous weapons. Thousands of scientists and artificial intelligence experts, more than 20 Nobel Peace Prize laureates, and more than 160 religious leaders and organizations of various denominations have also called for a ban. In June, Google released a set of ethical principles that includes a pledge not to develop artificial intelligence for use in weapons.
At the CCW meetings, almost all countries have called for retaining some form of human control over the use of force. This emerging consensus that meaningful human control must be maintained, which amounts in effect to a ban on weapons that lack such control, reflects the broad opposition to fully autonomous weapons.
Human Rights Watch and the Harvard clinic assessed fully autonomous weapons against the core elements of the Martens Clause. The clause, which appears in the Geneva Conventions and is referenced in several disarmament treaties, applies in the absence of specific international treaty provisions on a topic. It sets a moral baseline for judging emerging weapons.
The groups found that fully autonomous weapons would undermine the principles of humanity because they would be unable to apply either legal or ethical judgment to decisions to use lethal force. Without these human qualities, the weapons would face significant obstacles to ensuring the humane treatment of others and showing respect for human life and dignity.
Fully autonomous weapons would also run counter to the dictates of public conscience. Governments, experts, and the broader public have widely condemned the loss of human control over the use of force.
Partial measures, such as political guidelines or declarations that lack a legally binding prohibition, would fail to eliminate the many dangers posed by fully autonomous weapons. In addition to violating the Martens Clause, these weapons raise other legal concerns as well as concerns about accountability, security, and technology.
In previous publications, Human Rights Watch and the Harvard clinic have detailed the challenges that fully autonomous weapons would pose to compliance with international humanitarian law and international human rights law, analyzed the accountability gap for unlawful harm caused by these weapons, and responded to critics of a preemptive ban.
Countries that have called for a ban include: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China, Colombia, Cuba, Ecuador, Guatemala, Mexico, Nicaragua, Pakistan, Panama, Peru, Palestine, Uganda, Venezuela, and Zimbabwe.
The Campaign to Stop Killer Robots, launched in 2013, is a coalition of 75 nongovernmental organizations in 32 countries working for a preemptive ban on the development, production, and use of fully autonomous weapons. Docherty will present the report at a Campaign to Stop Killer Robots briefing for delegates to the Convention on Conventional Weapons (CCW), scheduled for August 28 at the United Nations in Geneva.
"The groundswell of opposition among scientists, religious leaders, technology companies, nongovernmental groups, and ordinary citizens shows that the public understands that killer robots cross a moral threshold," Docherty said.