Lethal Autonomous Weapons Systems: A Legal Perspective

 By Harshita Porwal

The 21st century has seen significant advances in technology. Security and warfare in particular have been transformed by developments such as artificial intelligence (“AI”), nanotechnology, machine learning, and robotics, which have changed the dynamics of conflict. These technologies promise a decisive and exponential advantage in modern warfare.[1]

All these technical advances play a central role in the emergence of Lethal Autonomous Weapons Systems (“LAWS”). The development of LAWS is widely regarded as the third revolution in warfare, after the inventions of gunpowder and nuclear arms.

The general idea of LAWS is that once such a system is developed and activated, it would, with the help of AI, machine learning, sensors and complex algorithms, search for, identify and attack targets without any human intervention. As of now, no country has succeeded in developing a fully autonomous weapon system; however, rapid advancement has made this a distinct possibility in the near future. It is important to note that several countries already use near-autonomous defensive systems. These are deployed in a protective role to intercept incoming attacks: they do not actively search for and attack targets, but instead respond to predetermined threats. Israel's Iron Dome is one such example.[2]

Application of such technology could benefit the security of any country. Unlike remotely controlled or automated systems, LAWS would be able to operate at an increased range for longer periods without hindrance; they would not depend on communication links; the military would need less manpower for operations; the pace of combat would increase; and the decision-making process would be objective, which could lead to better overall outcomes.[3] At the same time, LAWS generate substantial threats: the control they exercise over the use of force, and its consequences, might differ from how humans exercise such control. Humans might be unable to predict or justify attacks carried out by LAWS, and the need to identify targets accurately in a highly unpredictable environment might produce unpredictable and inaccurate outcomes. These threats raise strategic, military, ethical and legal concerns.

The central question, therefore, is whether LAWS can comply with existing International Humanitarian Law (“IHL”) without breaching fundamental ethical principles.

Legal concerns around LAWS:

Discussions on LAWS have been taking place internationally under the framework of the 1980 United Nations Convention on Certain Conventional Weapons (“CCW”) since 2014,[4] initially in three annual informal experts meetings. In 2016, the CCW decided to formalise this process and establish a Group of Governmental Experts (“GGE”) to examine and discuss questions related to emerging technologies in the area of LAWS. Notably, the GGE affirmed a set of possible guiding principles in August 2018, stating that IHL applies fully to all weapons systems, including the potential development and use of LAWS.[5]

A definition is the first prerequisite for mapping out a legal structure. As of now, the GGE has not produced a definition that the international community can accept. For the purpose of analysis and clarity, an autonomous weapons system may be defined as: ‘any weapon system with autonomy in its critical functions—that is, a weapon system that can select (search for, detect, identify, track or select) and attack (use force against, neutralize, damage or destroy) targets without human intervention’.[6] This ICRC definition provides a useful basis for the legal analysis of autonomous systems.

The international community holds diverse views on the regulation of LAWS. Many countries and entities such as Human Rights Watch[7] have called for a preemptive ban on the development, production and use of fully autonomous weapons. Several other countries, including major powers such as the USA,[8] support the development and use of LAWS. Some advocates favour a ban on use but not on development.

LAWS are such a debated topic worldwide because they have yet to address serious ethical and moral concerns. It is widely argued that LAWS would violate IHL because of their inability to differentiate between civilians and combatants. Civilian status is a complex and context-dependent concept: recognising and applying it on the battlefield requires value-based judgments, a degree of situational awareness and an understanding of social context.[9]

Given this emphasis on ethics and human dignity, the degree of autonomy becomes highly relevant. A majority of CCW State parties and other non-governmental parties believe that LAWS must remain under some human control, since it is doubtful that a machine can ever achieve human-like judgment. The relevant question is what degree of human control such systems require. An ideal approach would be to let the system operate autonomously while giving humans a veto over its functioning as and when needed. A principle of human control should be internationally recognised within the framework of the CCW, and possibly other instruments of international law, as the basis from which concrete requirements can be developed as part of a norm-shaping process.[10]

Accountability for LAWS remains unclear. Because LAWS implicate many levels of the military chain of responsibility, it might be difficult to hold any one person responsible if the system fails. IHL creates obligations for humans only; transferring those obligations to machines or weapon systems would create a legal accountability gap in case of violations of IHL. Keeping these changing dynamics in mind, responsibility and accountability should be distributed at both the state and the individual level, governed by IHL, International Human Rights Law, International Criminal Law and the law of product liability.

The development and deployment of LAWS might attract many legal arguments relating to human rights, product liability and other fields; however, this paper focuses specifically on IHL.

IHL rests on three basic principles: distinction, proportionality and precaution. These principles guide the distinction between military and civilian objects. It is hard to be confident that a machine would ensure distinction, judge proportionality or take precautions should circumstances change.[11] Ideally, these obligations cannot and should not be transferred to a machine, which cannot be held accountable; hence the relevance of the limited human control highlighted above.

Further, to bridge the gap between ethical considerations and IHL, it is relevant to highlight the Martens Clause,[12] which requires systems and their use to meet the ‘principles of humanity and the dictates of public conscience’. It applies to cases not covered by existing treaties and underlines the importance of upholding the basic ethical principle of human dignity. Since this is a new arena with no precedent whatsoever, the predictability, accuracy and reliability of such systems are uncertain; we therefore need these pillars of humanity to place reasonable restrictions on their development and use.

A few ways to regulate LAWS are set out below. Each option carries certain legal, ethical and operational implications depending on the perspective of the State parties.

First, there could be a general, legally binding protocol applicable to conventional weapons that mandates limited human control. Specific rules on human control and the use of force could be left to States under their respective national instruments. This would answer substantial ethical and legal concerns, such as the violation of human dignity and the need for a human to make legal judgments.[13]

A second option could be a comprehensive ban on the development and use of LAWS. This would address security concerns by limiting the arms race and proliferation. A general ban covering the broad meaning of autonomous weapons would, however, be illogical while the full scope of such weapons remains unknown, and might hinder future innovation in both military and civilian spheres. In contrast to a general ban, a ban tied to a specific definition agreed by States could be more selective, with a smaller impact on international arms control.

A third option could be banning only the use of LAWS, rather than both development and use. This would allow States to continue researching and developing such systems without any standard of regulation; it would also allow development and use at the domestic level, as well as the export of technologies to State and non-State actors. Alternatively, instead of completely banning the use of LAWS, a protocol could set limitations on certain uses rather than impose a complete ban.

Another option could be a political declaration by CCW High Contracting Parties. This is a soft-law approach. Such a declaration would constrain many states by creating a political obligation, but it would leave individual states free to regulate the development and use of LAWS as they deem fit. This can be problematic, as political influence would then become the standard for regulating such a contentious subject.

A final option could be no regulation at all. This could become reality if the CCW process fails to reach consensus on any regulatory approach, or if the parties conclude that existing IHL is sufficient and that the national weapons review process required by Article 36 of Additional Protocol I adequately addresses the development and use of LAWS. It is important to note that a legal review process is necessary but not sufficient. Not all states have such a process in place, and a general process will differ from State to State; without a common standard this will set a low threshold and is likely to result in violations of IHL principles.[14] Given the rate at which the technology is advancing, a mere legal review process might not be enough: it remains an important element, but must be combined with additional processes.

A Way Forward:

The debate on LAWS is at a very early stage and is therefore broad and vague. A pragmatic approach is needed. To begin with, the international community must agree on a working definition so that detailed discussion of individual elements can begin. The lack of consensus on a working definition is delaying discussion of a regulatory framework even as many countries, including major powers, have already made significant progress in developing the technology required for LAWS. Beyond the existing concerns, the GGE faces several further challenges: first, delay caused by the lack of consensus on the precise technology at issue; second, the exchange and misuse of technology by non-state actors, which falls outside the existing scope of the discussion; third, the fact that the current framework rests on IHL alone and does not draw on other legal frameworks such as human rights law, criminal law and product liability; and fourth, the focus of the debate on weapons and their use in armed conflict, leaving their use and benefits in other spheres of security and peacekeeping operations undiscussed.

Ideally, a set of regulations would impose comprehensive rules and reasonable restrictions at every stage, from development and testing to deployment and use of LAWS. Such regulation would be effective if it also mandated a national legal review process and set relevant standards of accountability, and if it addressed the proliferation and trade of such technologies. Since these technologies are still at an early stage, it would be premature and unreasonable to ban the broader idea altogether, as they do offer a number of military and civilian advantages.

References:

[1] Brigadier Saurabh Tewari, “Impact of Disruptive Technologies on Warfare”, CLAWS Issue Brief No. 185, June 18, 2019. URL: https://www.claws.in/publication/impact-of-disruptive-technologies-on-warfare/

[2] R. Shashank Reddy, “India and the Challenges of Autonomous Weapons”, Carnegie India, June 2016. URL: https://carnegieendowment.org/files/CEIP_CP275_Reddy_final.pdf

[3] Regina Surber, “Artificial Intelligence: Autonomous Technology (AT), Lethal Autonomous Weapons Systems (LAWS) and Peace Time Threats”, ICT4Peace Foundation and the Zurich Hub for Ethics and Technology, February 21, 2018. URL: https://ict4peace.org/wp-content/uploads/2019/08/ICT4Peace-2018-AI-AT-LAWS-Peace-Time-Threats.pdf

[4] United Nations, UNODA, CCW, Reports on the Meetings of Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons (2014, 2015 and 2016). Available on the Internet.

[5] United Nations, UNODA, CCW GGE on LAWS, Emerging Commonalities, Conclusions and Recommendations (August 2018). Available on the Internet.

[6] United Nations, UNODA Occasional Paper No. 30, “A Legal Perspective: Autonomous Weapon Systems under International Humanitarian Law”, by the ICRC. Available on the Internet.

[7] International Human Rights Clinic, Heed the Call: A Moral and Legal Imperative to Ban Killer Robots (August 2018). Available on the Internet.

[8] “Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems”, Congressional Research Service, December 19, 2019. URL: https://fas.org/sgp/crs/natsec/IF11150.pdf

[9] Elvira Rosert, “Prohibiting Autonomous Weapons: Put Human Dignity First”, Global Policy, Volume 10, Issue 3 (September 2019). URL: https://www.globalpolicyjournal.com/articles/international-law-and-human-rights/prohibiting-autonomous-weapons-put-human-dignity-first

[10] International Panel on the Regulation of Autonomous Weapons, Concluding Report: Recommendations to the GGE (December 2019). Available on the Internet.

[11] United Nations, UNODA Occasional Paper No. 30, “A Legal Perspective: Autonomous Weapon Systems under International Humanitarian Law”, by the ICRC. Available on the Internet.

[12] Additional Protocol I and the Preamble of Additional Protocol II to the Geneva Conventions of August 1949. Available on the Internet.

[13] International Panel on the Regulation of Autonomous Weapons, Concluding Report: Recommendation to the GGE, December 2019. Available on the Internet.

[14] International Panel on the Regulation of Autonomous Weapons, Concluding Report: Recommendation to the GGE, December 2019. Available on the Internet.