
Legality of ‘LAWS’: When and How to Regulate Them?

Lethal Autonomous Weapon Systems, also known as LAWS, are widely viewed as the next revolution in warfare. Several countries, notably the United States, China and Russia, are investing in the development of this emerging weapon technology. With these countries venturing into this unexplored territory, debates on the legality of such weapon systems have become indispensable. Debates as to how and when to regulate LAWS, and whether such weapon systems should be banned outright, are taking place on both formal and informal international platforms.


Often referred to as “killer robots” by critics, LAWS are fully autonomous weapon systems, which means that they can select and attack targets without any human intervention. This includes their capability to react to changing circumstances without any human judgment. Consequently, the actions of such ‘human out of the loop’ weapons are predictable only to the extent of their programming. The introduction of autonomy into these weapon systems has raised questions that are not only technical and legal, but also ethical and socio-political.


CHALLENGES TO THE DEHUMANIZATION OF WEAPON SYSTEMS

One of the most important and controversial debates in this regard is whether these weapons fulfil the fundamental requirements of international humanitarian law (IHL). Although these weapons are not specifically regulated by IHL treaties, it is undisputed that any such weapon system must comply with the principles of IHL.


The main aim of IHL is to permit the pursuit of military objectives while protecting civilians from the effects of combat and combatants from unnecessary suffering. This has given rise to two main principles of IHL: the principle of distinction and the principle of proportionality. The principle of distinction emphasizes the need to differentiate between military objectives and civilian objects, combatants and civilians, as well as active combatants and those hors de combat. The principle of proportionality, on the other hand, requires that the civilian harm expected from an attack not be excessive in relation to the military advantage anticipated. Both these principles are not only embodied in various IHL treaties, such as Additional Protocol I to the 1949 Geneva Conventions, but have also become part of customary international law.


The practical application of these principles is difficult, especially in the case of autonomous weapon systems, since their proper implementation requires a highly context-dependent analysis of the circumstances. For instance, a man carrying a kirpan, a dagger worn for purely religious reasons, who runs towards the weapon system to save a child from danger, may be interpreted as a potential threat by the machine, because such situations demand a qualitative analysis rather than a quantitative one. Humans, by contrast, may find such situations relatively easy to interpret. Similarly, the proportionality assessment is a qualitative exercise, since it is practically impossible to allocate numerical values to military advantage and civilian damage in order to compute how many civilian casualties it would be proportional to accept in pursuit of a particular military objective. Although technology like the collateral damage estimate (CDE) may be helpful in assessing the damage that may result from a particular attack, it cannot determine what constitutes “excessive” collateral damage. The proportionality analysis is particularly challenging because it takes place at various stages and a myriad of factors have to be taken into consideration.
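To see why a purely numerical proportionality test breaks down, consider the following deliberately naive sketch in Python. Every name, signature and value in it is a hypothetical assumption introduced here for illustration; IHL prescribes no such quantities, which is precisely the problem the paragraph above describes.

```python
# A deliberately naive, hypothetical sketch of a "quantitative" proportionality
# check. Every name and number here is invented for illustration; IHL supplies
# no such values, which is exactly the article's point.

def proportionality_check(expected_civilian_harm: float,
                          anticipated_military_advantage: float,
                          excessiveness_threshold: float) -> bool:
    """Return True if the attack would pass a purely numeric test."""
    if anticipated_military_advantage <= 0:
        # With no military advantage, any civilian harm is impermissible.
        return False
    ratio = expected_civilian_harm / anticipated_military_advantage
    return ratio <= excessiveness_threshold

# The machine needs a concrete threshold, but IHL defines "excessive" only
# qualitatively -- whatever value is chosen here is arbitrary, not law.
print(proportionality_check(3.0, 10.0, excessiveness_threshold=0.5))
```

Whatever threshold is chosen, it is an engineering parameter supplied by a programmer rather than a legal standard, so the machine’s “compliance” merely restates its programmer’s guess about what “excessive” means.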


OPINIONS

Many jurists and scholars are of the opinion that the current body of IHL is inadequate to deal with the challenges posed by these autonomous weapon systems.[1] However, it should be noted that IHL consists of general principles and rules which are not restricted to a particular technology but are applicable to a range of weapon systems. Therefore, even though these rules admit multiple interpretations, they are capable of dealing with autonomous weapon systems as well. These weapons are required to comply with these rules while executing their combat operations, which requires converting the rules and principles into digital code for the machine to apply in a given scenario. However, the prospect of computers capable of qualitative assessment in rapidly changing and unpredictable conflict situations remains, for now, a far-fetched dream. Moreover, these weapon systems will have to provide for diverse degrees of combat intensity to suit a given circumstance while complying with the rules of IHL, which is again a challenge in itself.
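To make concrete what converting such rules into digital code might involve, here is a minimal hypothetical sketch of the principle of distinction rendered as machine logic. The enum values and returned actions are assumptions invented for illustration, not any real targeting system; the point is the unavoidable ambiguous branch, where the rule itself gives no answer.

```python
# Hypothetical sketch of the principle of distinction encoded as machine
# logic. All names here are illustrative assumptions, not a real system.

from enum import Enum, auto

class TargetStatus(Enum):
    MILITARY_OBJECTIVE = auto()
    CIVILIAN = auto()
    HORS_DE_COMBAT = auto()   # wounded, surrendered, or otherwise out of combat
    AMBIGUOUS = auto()        # e.g. the kirpan-carrier running towards the system

def engagement_decision(status: TargetStatus) -> str:
    if status is TargetStatus.MILITARY_OBJECTIVE:
        return "engage"
    if status in (TargetStatus.CIVILIAN, TargetStatus.HORS_DE_COMBAT):
        return "do not engage"
    # The encoded rule gives no answer here; resolving the ambiguity requires
    # the qualitative, context-dependent judgment the article argues machines lack.
    return "defer to human judgment"

print(engagement_decision(TargetStatus.AMBIGUOUS))
```

The rule is easy to encode; what cannot be encoded is the qualitative classification step that decides which branch a real-world situation falls into.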


The increased autonomy of weapon systems, which allows them to take life-and-death decisions without any involvement of human judgment, has also raised ethical questions that go beyond mere compliance with the principles of IHL. The ‘Martens Clause’ serves as a link between IHL and these ethical considerations. According to this clause, in cases not covered by the prevailing treaties or customary IHL, the principles of humanity and the dictates of public conscience come into play.[2] Consequently, the idea of a weapon system that operates free from any human intervention has encountered wide criticism, as it does not sit well with the dictates of public conscience.[3]


Another major issue raised by these autonomous weapon systems is the allocation of individual responsibility. Their introduction removes those who plan the military operation from the actual battlefield, which makes it difficult to hold an individual criminally responsible. Given that most legal systems require a showing of intent, or mens rea, to hold an individual responsible, a range of actors must be assessed in order to determine whether command responsibility applies. Even if it is argued that these weapon systems will work under some kind of human supervision, a problem arises when the operator does not get enough time to validate the action that the machine has decided to take. This may result in ‘war crimes’ being committed by a machine. Furthermore, since these weapon systems are man-made machines, the greatest fear remains their malfunctioning, as they are open to tampering and other interference.


THE ROAD AHEAD

There is no doubt that IHL will restrict the lawful level of autonomy in weapon systems. States and other actors need to determine internationally agreed limits by evaluating the degree of human control required in the operation of these weapon systems, both to ensure compliance with the rules and principles of IHL and to satisfy the ethical considerations.


With regard to the issue of assigning individual responsibility, military organizations need to establish clear rules and regulations to govern the engagement of these autonomous weapon systems in various operations. Another approach is to make the decision-making process within these systems traceable, so that each decision taken by the weapon system can be traced back to the individual responsible for it, who can then be held liable. Although this sounds like a feasible solution, its practical application is, at present, not very clear; a sketch of what such traceability might look like follows.
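As a rough illustration of one possible shape for such a traceable decision process, the sketch below logs every machine decision against the humans responsible for authorizing and supervising it. All field names and identifiers are hypothetical assumptions for illustration, not an existing military or legal standard.

```python
# Hypothetical sketch of a traceable decision record: each machine decision
# is logged against the humans responsible for it. Field names are invented
# for illustration, not drawn from any existing standard.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class DecisionRecord:
    decision_id: str
    action: str            # what the system decided to do
    rule_version: str      # which rules-of-engagement code was running
    commander_id: str      # who authorized the deployment
    operator_id: str       # who supervised this engagement
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

audit_log: list[DecisionRecord] = []
audit_log.append(DecisionRecord(
    decision_id="D-0001",
    action="engage",
    rule_version="roe-v1.2",
    commander_id="CMDR-17",
    operator_id="OP-42",
))
for record in audit_log:
    print(record)
```

Even granting such a log, the hard question the article raises remains: a record identifies who was nominally responsible, but not whether that person had a meaningful opportunity to intervene.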


Above all, given the incremental steps being taken towards autonomy in weapon systems, the debate on the desirability of autonomous weapon systems urgently demands a high degree of intellectual integrity and thoroughness if a viable solution is to be found.


ENDNOTES

1. See RONALD C. ARKIN, GOVERNING LETHAL BEHAVIOR IN AUTONOMOUS ROBOTS 192–212 (2009); ARMIN KRISHNAN, KILLER ROBOTS: LEGALITY AND ETHICALITY OF AUTONOMOUS WEAPONS (2009); Hin-Yan Liu, Categorization and Legality of Autonomous and Remote Weapons Systems, 94 INT’L REV. RED CROSS 627, 629 (2012).

2. Additional Protocol I, art. 1(2); Additional Protocol II, Preamble.

3. ICRC, Statement to the Convention on Certain Conventional Weapons (CCW) Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), Geneva, 13–17 April 2015, https://www.icrc.org/en/document/lethal-autonomous-weapons-systems-LAWS.


ABOUT THE AUTHOR

This blog has been authored by Bhumika Khandelwal, a third-year B.A., LL.B. (Hons.) student at ILS Law College, Pune.


[PUBLICATION NO. TLG_BLOG_20_0704]


