On April 11, 2016, the Convention on Certain Conventional Weapons (CCW) will hold a week-long meeting on lethal autonomous weapons systems (LAWS) in Geneva. Previous meetings were held in 2014 and 2015 to discuss the legality of LAWS under the law of armed conflict (LOAC) and international human rights law. Some nations that attended these meetings, as well as all of the nongovernmental organizations (NGOs) in attendance, have advocated the strict regulation or even outright banning of LAWS via a new protocol to the CCW. Only a few nations, including the United States, have thus far resisted calls for an outright ban on LAWS.
The United States should continue to resist such calls at the CCW. The U.S. should also work with allies and like-minded nations to initiate a broader discussion of LAWS and how long-standing LOAC principles may be applied to these cutting-edge weapons. The most effective vehicle to address such issues is a LOAC manual on LAWS. The U.S. should lead an effort to develop such a manual as an alternative to the regulation, or banning, of LAWS through the CCW.
In a 2014 treatise published by the U.S. Naval War College, Kenneth Anderson, Daniel Reisner, and Matthew Waxman wrote that there is “a unique (although probably short-lived) opportunity to…develop the rules and code of conduct for [autonomous weapons] systems before they are fielded on the battlefields of the world in large numbers.” The treatise proposed, inter alia, that an “international instrument” be developed to adapt LOAC to LAWS. The international instrument would serve the purpose of achieving consensus on “some core minimum standards” for the development of LAWS while retaining “flexibility for international standards and requirements to evolve as technology evolves.”
The international instrument could take different forms: a new, stand-alone treaty; a protocol to the CCW; or a LOAC manual. The treatise settled on the LOAC-manual approach as the most appropriate vehicle, stating that the process of convening national governments and independent experts would help “foster broad consensus” regarding the use of LAWS, and would “surface disagreements that require resolution.”
This Special Report examines the feasibility of the Anderson–Reisner–Waxman proposal and concludes that the U.S. should lead an effort to develop a Manual on the International Law Applicable to Lethal Autonomous Weapons Systems (LAWS manual). While such an effort would face significant challenges, it is crucial that the aspects of LOAC relevant to LAWS be identified and clarified with a view toward applying them to the development and use of LAWS as these weapons become more prevalent on the battlefield. The absence of a LAWS manual would leave a vacuum that certain nations and NGOs would gladly fill with a CCW protocol regulating LAWS or banning them outright.
Part I of this report discusses the timeliness and desirability of developing a LAWS manual in the same tradition as prior LOAC manuals.
Part II addresses conceptual challenges to the development of a LAWS manual and deconstructs the debate surrounding the application of fundamental LOAC principles to LAWS, including the principles of distinction, proportionality, individual criminal accountability, and command responsibility.
Part III identifies significant political and philosophical challenges facing the development of a LAWS manual, including the central challenge of overcoming an ongoing campaign by international NGOs to ban LAWS.
Part IV concludes that a LAWS manual should be pursued regardless of the challenges that such an effort would face.
Despite attempts to ban LAWS, experts on international law, robotics, and armed conflict should strive to clarify the application of LOAC to LAWS, since such weapons are rapidly developing and are likely to be inevitable features on future battlefields.
I. A LAWS Manual Is Timely and Desirable
There are compelling reasons why developing a LAWS manual is timely and desirable. Many commentators agree that the advent of LAWS is inevitable, and that weapons meeting the definition of “autonomous” may already be in use by today’s armed forces. As such, it behooves the legal community to get ahead of the curve and confront the “legal lag” that may already exist between the appearance of this new class of weapon and the proper application of LOAC principles.
Getting Ahead of the “Legal Lag” Curve. As many as 40 nations are currently developing military robotics. Some extant weapons may fairly be classified as “autonomous” since they, “once activated, can select and engage targets without further intervention by a human operator.” Current weapons that may meet the definition of autonomous (or could be modified to meet it) include Raytheon’s Phalanx Close-In Weapon System, a “rapid-fire, computer-controlled, radar-guided gun system” designed to destroy incoming anti-ship missiles; Israel Aerospace Industries’ Harpy, described by its manufacturer as a “‘Fire and Forget’ autonomous weapon” designed to destroy enemy radar stations; MBDA’s Dual Mode Brimstone anti-armor missile; and the Samsung Techwin SGR-A1 sentry gun.
Whether or not they meet the definition of autonomous, these weapons have already been deployed on U.S. warships, in the Libyan civil war, and in the Korean Demilitarized Zone. It is arguable that these weapons exist in a state of “normative ambiguity.” The purpose of a LAWS manual would be to remove this ambiguity by “bringing some degree of clarity to the complex legal issues” surrounding LAWS.
It is axiomatic that technological advances often outpace the law, including LOAC, particularly during wartime. For example, during World War I, “all sorts of recent inventions, from airplanes dropping bombs to cannons shooting chemical weapons, were introduced before anyone agreed on the rules for their use—and, as to be expected, the warring sides sometimes took different interpretations on critical questions.” Weapons technology advances so quickly that a significant “legal lag” often occurs between the introduction of a new weapon and an application of LOAC to that weapon. But that does not have to be the case:
The historical fact that the law of armed conflict (LOAC) has always lagged behind current methods of warfare does not mean that it always must.… [T]he underlying assumption that law must be reactive is not an intrinsic reality inherent in effective armed conflict governance. Rather, just as military practitioners work steadily to predict new threats and defend against them, LOAC practitioners need to focus on the future of armed conflict and attempt to be proactive in evolving the law to meet future needs.
In that spirit, the development of a LAWS manual would represent a proactive effort to get ahead of the legal lag inherent in technological advancements surrounding robotic weapons. A LAWS manual would be the latest instance of international experts striving to overcome this lag by applying existing legal norms to a new or rapidly evolving method or means of war.
Past LOAC Manuals Serve as Precedent for a LAWS Manual. Over the past century, several LOAC manuals have been developed to respond to legal lags caused by the evolution of war on land, sea, and in the air. Significant examples include:
- Oxford Manual on the Laws of Naval War Governing the Relations Between Belligerents (1913). The Oxford Manual was developed by the Institute of International Law several years after the adoption of a series of Hague Conventions relating to naval warfare. The manual addresses multiple aspects of LOAC and its application to naval warfare, including prohibition of treachery; the use of poisoned weapons, asphyxiating gas, or other weapons calculated to cause unnecessary suffering; and the use of mines that fail to deactivate once unmoored.
- Hague Rules of Air Warfare (1923). These rules were proposed as a Hague convention after World War I, which saw the emergence of strategic bombing of population centers by airplanes and zeppelins. Though not adopted, the rules had “considerable impact on the development of the customary law of armed conflict.”
- San Remo Manual on International Law Applicable to Armed Conflicts at Sea (1994). The San Remo Manual is considered a modern equivalent of the Oxford Manual of the Laws of Naval War. Developed under the auspices of the International Institute of Humanitarian Law, the San Remo Manual reaffirmed and updated LOAC in regard to naval warfare and applied it to new developments, including submarine warfare.
- Manual on International Law Applicable to Air and Missile Warfare (2009). Developed under the International Humanitarian Law Research Initiative, the air and missile warfare (AMW) manual is seen as a successor to the 1923 Hague Rules of Air Warfare, and was meant to address “the exponential changes brought about in air and missile technology” that “have transformed the face of the modern battlefield, revolutionized military strategy, and created a series of distinct challenges to the protection of civilians in time of armed conflict.”
- Tallinn Manual on the International Law Applicable to Cyber Warfare (2013). The Tallinn Manual was prepared over a three-year period by an international group of experts at the invitation of the NATO Cooperative Cyber Defense Centre of Excellence, and applied international legal norms to the concept of cyber warfare.
These manuals are not historical anomalies. Indeed, efforts to explicate LOAC through an “informal” vehicle rather than an international convention stretch back to the 19th century, including the 1863 Lieber Code regarding the conduct of U.S. forces in the American Civil War, the 1874 draft International Declaration Concerning the Laws and Customs of War, and the 1880 Oxford Manual on Land Warfare. A LAWS manual would continue the tradition of developing comprehensive LOAC codes to address evolving methods and means of warfare.
II. Conceptual Challenges to Developing a LAWS Manual
The central challenge to developing a widely accepted LAWS manual is the fact that some legal experts insist that it is impossible to apply LOAC to LAWS. If a LAWS manual cannot overcome the objections raised by these experts, it is possible that some nations will not accept the manual as a definitive source.
Over the past few years, two camps—“optimists” and “pessimists”—have emerged among lawyers, roboticists, and ethicists regarding whether LAWS may comply with LOAC. Prominent optimists (such as the aforementioned Anderson, Reisner, and Waxman) maintain that while LAWS pose challenges to the existing legal framework, LOAC may be adapted to account for their particular features. Pessimists, such as Peter Asaro, Noel Sharkey, and Human Rights Watch, insist that the nature of LAWS is such that they may never be designed or deployed in a way that complies with LOAC.
Certain LOAC principles may be applied to LAWS without objection from the pessimists. For instance, all would agree that nations have an obligation when developing or acquiring a new weapon to determine whether its use would violate international law. But the focus of the legal debate between optimists and pessimists has centered on other core principles of LOAC: distinction, proportionality, individual criminal responsibility, and command responsibility.
The legitimacy of a LAWS manual may rise or fall on how well the optimist viewpoint addresses and rebuts the pessimist viewpoint on these core principles.
Distinction, Proportionality, and Targeting. Perhaps the most debated and contentious aspect regarding the use of LAWS concerns the fundamental LOAC principles of distinction and proportionality. Achieving consensus on the proper application of these law of war (jus in bello) principles is the raison d’être for a LAWS manual.
Distinction and Proportionality. The principle of distinction requires that parties to a conflict distinguish at all times between civilians and combatants, and between civilian objects and military objectives. The principle of proportionality prohibits the launching of an attack “which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated.” LAWS pessimists maintain that no design or program presently exists, or will exist in the future, that would permit LAWS to comply with these fundamental principles.
In regard to distinction, the pessimists maintain that LAWS lack the programming, sensors, or other capabilities to distinguish combatants from civilians, or to identify combatants that are wounded, surrendering, or otherwise hors de combat. The alleged inability of LAWS to distinguish combatants from civilians is described by the pessimists in stark terms:
For example, a frightened mother may run after her two children and yell at them to stop playing with toy guns near a soldier. A human soldier could identify with the mother’s fear and the children’s game and thus recognize their intentions as harmless, while a fully autonomous weapon might see only a person running toward it and two armed individuals. The former would hold fire, and the latter might launch an attack.
In contrast, LAWS optimists—though not going so far as to say that LAWS can undoubtedly be designed to comport their actions with the principle of distinction—maintain that technology has progressed to a point where LAWS do not constitute a per se violation of LOAC. One commentator, for example, notes that LAWS may in fact be capable of complying with the principle of distinction because “[m]odern sensors can, inter alia, assess the shape and size of objects, determine their speed, identify the type of propulsion being used, determine the material of which they are made, listen to the object and its environs, and intercept associated communications or other electronic emissions.”
Pessimists are also dubious that LAWS can comply with the principle of proportionality. Human Rights Watch, for example, states that “[i]t is highly unlikely that a robot could be pre-programmed to handle the infinite number of scenarios it might face so it would have to interpret a situation in real time.” Optimists, on the other hand, contemplate situations where LAWS may comport with the principle of proportionality, particularly in scenarios where civilians are not in the line of fire.
Convening recognized experts on robotics, advanced weapons systems, and international law in a neutral, non-adversarial series of meetings would significantly advance the debate on the manner in which LAWS may be used lawfully in the modern battlespace, including how they will be expected to comply with the principles of distinction and proportionality.
One benefit of convening experts to produce a LAWS manual is that the debate regarding distinction and proportionality would expose the “worst case scenario” fallacies repeatedly raised by pessimists—such as two children armed with toy guns being chased by their mother—as narrow and emotionally charged hypotheticals that shed little or no light on complex legal issues. The pessimists give short shrift to scenarios in which LAWS may be used lawfully without threat of violating the principles of distinction and proportionality. In their view, the only LAWS that will be developed are Skynet-style Terminator cybernetic “killer robots” patrolling urban settings where combatants and civilians are intermixed.
By contrast, a LAWS manual process would address the application of LOAC to LAWS in realistic combat situations, including in permissive environments, such as an attack “on a tank formation in a remote area of the desert or from warships in areas of the high seas far from maritime navigation routes.” Rather than focusing on the pessimists’ narrow and unrealistic hypotheticals, a LAWS manual could help to establish the norms that would apply to LAWS in all the ways that they may be deployed. Such a manual may in fact conclude that, based on current technology, the deployment of LAWS in a non-permissive environment where civilians and combatants are intermixed would violate LOAC. That does not mean, however, that the deployment of LAWS in any and all other environments would similarly violate the law.
A LAWS manual would affirm that LAWS, like any other weapons system, must at all times distinguish between civilians, civilian objects, combatants, and military objectives. Furthermore, a LAWS manual would affirm that LAWS may not be used in an attack where the expected loss of civilian life or damage to civilian objects would be excessive in relation to the direct military advantage. Central to the distinction and proportionality analyses is a temporal issue that a LAWS manual could address and resolve.
Targeting and Temporality. Pursuant to LOAC, decisions regarding distinction and proportionality are usually made at or near the moment in time that a weapon is fired or deployed—when a sniper’s bullet is put downrange, when a Tomahawk cruise missile is launched from a warship, or when an artillery barrage is ordered. The same necessarily holds true in regard to LAWS. From a normative standpoint, a LAWS manual could clarify that (1) decisions regarding distinction and proportionality are made by commanders and individual combatants employing LAWS, and (2) such decisions are made at the moment in time when LAWS are deployed.
LAWS pessimists have attempted to skew the targeting and temporal debate by framing it in a misleading manner. Specifically, pessimists want the debate to turn on whether it is legal to permit LAWS to make the “final decision” on the use of lethal force. One pessimist states, “It is my view that autonomous weapon systems represent a qualitative shift in military technology, precisely because they eliminate human judgement in the initiation of lethal force.” Likewise, the International Committee for Robot Arms Control states, “We believe…that it is unacceptable for machines to control, determine, or decide upon the application of force or violence in conflict or war,” including “the decision to kill or use lethal force against a human being.” Christof Heyns, the U.N. special rapporteur on extrajudicial, summary, or arbitrary executions, asks rhetorically “whether it is not inherently wrong to let autonomous machines decide who and when to kill.”
In the view of the pessimists, LAWS must be banned because robots should not be permitted to make “decisions” about life and death. But under LOAC, military commanders and soldiers in the field—not weapons systems—are responsible for decisions to use lethal force. In their current state of development (putting aside for now the possibility of an emergence of true artificial intelligence), LAWS do not make decisions about whether to use lethal force any more than toasters make decisions about when a slice of bread is adequately toasted. LAWS, like other advanced weapon systems, merely follow the commands made by their operators, restrained by the algorithms embedded in their programming.
The Dual Mode Brimstone anti-armor missile currently in use by the Royal Air Force is a good example. The Brimstone uses advanced semi-active laser and millimeter wave radar guidance systems to search out targets autonomously while in flight and strike only those vehicles that match a programmed signature, while ignoring other vehicles, such as cars and buses. Several Brimstone missiles may be fired at once in a salvo, permitting each munition to target individual armored vehicles travelling in a column. While the Brimstone’s programming permits it to seek out, select, and use lethal force on a target, the missile is not deciding to kill, nor is it conducting an analysis regarding distinction and proportionality. Those decisions and analyses were made at the mission planning stage, or by the pilot who actually fired the missiles.
Assuming, for the sake of argument, that LAWS pessimists deem Brimstone missiles to be “precision-guided munitions” and not LAWS, the objection merely moves the question to the next level of autonomy. Consider the possibility that an MQ-9 Reaper unmanned aerial vehicle is equipped with the same sensory and guidance systems as the Brimstone, and is deployed in a combat environment armed with several Brimstone missiles. The Reaper could autonomously loiter over a vast battlespace until it identifies a column of enemy armored vehicles and, without any additional human intervention, fire one or more missiles. Under this hypothetical, all “decisions” regarding distinction and proportionality are still made by the commander who launched the Reaper—not by the Reaper itself, nor by the Brimstone missiles.
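The logic of this argument can be illustrated with a deliberately simplified, purely hypothetical sketch; it depicts no real weapon's programming, and every name in it (`EngagementCriteria`, `should_engage`, the signature strings) is invented for illustration. The point is that the machine mechanically applies criteria a human fixed in advance at the planning stage:

```python
# Purely illustrative, hypothetical sketch: an "autonomous" seeker does not
# decide anything; it checks criteria a human operator fixed beforehand.
from dataclasses import dataclass

@dataclass(frozen=True)
class EngagementCriteria:
    # Set by the human commander or operator at mission planning.
    target_signature: str          # e.g., the sensor profile of armored vehicles
    permitted_area: tuple          # bounding box (x_min, y_min, x_max, y_max)

@dataclass
class Contact:
    signature: str
    position: tuple                # (x, y)

def should_engage(contact: Contact, criteria: EngagementCriteria) -> bool:
    """Apply the operator's pre-set criteria; the machine exercises no
    judgment of its own beyond this mechanical check."""
    x_min, y_min, x_max, y_max = criteria.permitted_area
    in_area = (x_min <= contact.position[0] <= x_max
               and y_min <= contact.position[1] <= y_max)
    return in_area and contact.signature == criteria.target_signature

# The "decision" resides in these human-authored criteria:
criteria = EngagementCriteria(target_signature="armored_vehicle",
                              permitted_area=(0, 0, 100, 100))

print(should_engage(Contact("armored_vehicle", (50, 50)), criteria))   # True
print(should_engage(Contact("civilian_vehicle", (50, 50)), criteria))  # False
print(should_engage(Contact("armored_vehicle", (500, 50)), criteria))  # False
```

In this toy model, responsibility for what is engaged rests entirely on whoever wrote `criteria`, which is the report's point about where distinction and proportionality judgments are actually made.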
A LAWS manual could make it clear, in contrast to the views of the pessimists, that only human commanders and operators utilizing LAWS make decisions to kill. Gaining consensus on the temporal targeting issue would represent a significant development in the application of LOAC to LAWS, and remove from the debate the pessimists’ specter of “killer robots” making decisions regarding life and death.
Meaningful Human Control. In March 2014, the International Committee of the Red Cross (ICRC) held an “expert meeting” on LAWS in Geneva, attended by representatives from 21 countries, including the United States, at which a wide range of issues regarding LAWS was discussed. In addition, the parties to the CCW held an informal meeting on the same topic in May of that year. From these meetings, the term “meaningful human control” (MHC) emerged to describe the operational standard by which LAWS should be utilized.
However, no consensus has emerged on the definition of MHC. For example, U.S. policy states that LAWS shall be designed “to allow commanders and operators to exercise appropriate levels of human judgment over the use of force,” while the United Kingdom has stated that “the operation of weapons systems will always…be under human control.” (Emphasis added.) A LAWS manual, applying LOAC principles, could provide a workable definition for MHC that would enable LAWS to be designed, programmed, and deployed in a lawful manner.
Existing LOAC principles provide sufficient basis for determining whether a commander or operator exerts MHC over LAWS. Under LOAC, parties to a conflict must do “everything feasible” to verify that targets are military objectives, and must cancel or suspend an attack “if it becomes apparent that the objective is not a military one” or if the attack would violate the principle of proportionality.
It stands to reason that if a commander utilizing LAWS is incapable of cancelling or suspending an attack once it becomes apparent that the attack will violate the principles of distinction and/or proportionality, it is likely that the deployment of LAWS in that scenario would violate LOAC. A commander who does not do “everything feasible” to verify that his target is a military objective before deploying LAWS to attack the target also violates LOAC. In the context of LAWS, a commander may not rely solely on the programming of LAWS, however advanced the programming may be, to satisfy these requirements. In this manner, the commander may be said to exercise MHC over the LAWS at his command.
Accountability and Responsibility for LOAC Violations. LAWS pessimists raise the LOAC principles of individual accountability, command responsibility, and state responsibility as obstacles to the lawful use of LAWS. Traditionally, individual combatants are criminally responsible for the war crimes they commit, and commanders and other superiors are criminally responsible for war crimes committed pursuant to their orders, and for failure to prevent, repress, or report war crimes. According to the pessimists, the nature of LAWS skews the traditional application of LOAC in situations where war crimes have been committed.
Individual Accountability and Command Responsibility. The pessimists maintain that the traditional LOAC framework cannot be applied to LAWS in regard to accountability for war crimes. U.N. special rapporteur Heyns poses the question: “Robots have no moral agency and as a result cannot be held responsible in any recognizable way if they cause deprivation of life that would normally require accountability if humans had made the decisions. Who, then, is to bear the responsibility?” Rather than individual combatants or commanders, pessimists raise the specter that software programmers or arms manufacturers may be held criminally liable for the actions of LAWS.
A LAWS manual would help clarify the principles of LOAC relating to accountability when LAWS are employed unlawfully or when grievous errors occur during their employment. Specifically, a LAWS manual would reaffirm that the existing rules regarding individual accountability and command responsibility apply to the use of LAWS in combat. That is to say that individual combatants are criminally responsible for committing war crimes if they use LAWS to commit them, and commanders are responsible if they order the criminal acts or fail to prevent them.
LAWS pessimists object to this traditional application of LOAC on the grounds that it “would not be fair since it could be the fault of the person who programmed the mission, the manufacturer who made the robot, or the senior staff or policymakers who decided to employ it.” The objection rests on the notion that individual combatants and commanders would lack the requisite criminal intent (mens rea) because they do not understand the complex algorithms and programming of LAWS, and therefore should not be held criminally responsible for the weapons’ actions. In other words, if LAWS are fully autonomous, it would be unjust to hold combatants and commanders responsible for deaths caused by weapons over which they have no control.
This creates, in the pessimists’ view, an accountability gap or vacuum. But this “gap” presupposes that future combatants and commanders are incapable of obtaining the requisite knowledge and training to fully understand the capabilities of the LAWS in their arsenal. U.N. special rapporteur Heyns concedes this point, noting that it “will be important to establish, inter alia, whether military commanders will be in a position to understand the complex programming of [lethal autonomous robots] sufficiently well to warrant criminal liability.” A LAWS manual could clarify what level of knowledge a combatant or commander must have regarding the programming and operation of LAWS.
But a lack of knowledge of the program code of LAWS cannot serve as a barrier to fielding LAWS in combat. After all, it is unlikely that present-day combatants and commanders are experts on the complex programming of weapons already in use, some of which, like the aforementioned Brimstone missile, operate with significant levels of autonomy. LOAC does not require combatants to hold advanced degrees in software engineering or robotics. Rather, LOAC requires combatants and commanders to meet a standard of reasonableness in their decisions regarding which weapons to employ and how to employ them.
The decision-making process for a “reasonable” commander contemplating the use of LAWS would require the commander to “make a complicated and subjective decision, one that requires a judgment about the capabilities of the system, the circumstances in which it is to be deployed, and the nature and type of operations in which the system can be expected to function appropriately.” This clarification places accountability and responsibility where they belong—on the combatants and commanders, not on the LAWS themselves.
That said, a LAWS manual could develop new norms for the decision to use LAWS, acknowledging that they represent a special challenge to traditional notions of individual accountability. A combatant using LAWS may be required to have considerably more familiarity with the behavior of LAWS, above and beyond his familiarity with non-autonomous weapons systems. Combatants may be responsible for special training in the operation of LAWS in order to fully understand their capabilities, risks, and limits. In this manner, if a combatant willfully uses LAWS in a manner inconsistent with LOAC, he may be found to possess the requisite level of mens rea to support a charge of war crimes.
State Responsibility. Long-standing LOAC principles hold that a nation is responsible for violations of international humanitarian law committed by its armed forces. The scope of the law on “state responsibility” is broad enough to encompass every possible LOAC violation that may occur during the utilization of LAWS, since the state is responsible for violations committed by persons it empowers to exercise elements of governmental authority, by persons acting under its direction or control, and even by private persons whose conduct the state acknowledges and adopts as its own. The remedy for state responsibility for violations of LOAC is to “make full reparation” for the loss or injury caused.
A LAWS manual could reaffirm that traditional LOAC principles regarding state responsibility apply to violations of LOAC caused by a nation’s use of LAWS. This particular application of LOAC to LAWS should be uncontroversial. Even in the event that the actions of LAWS are unpredictable, or in instances where LAWS malfunction, it still may be said that any loss caused by LAWS is “attributable” to the state employing them.
III. Political and Philosophical Challenges to Developing a LAWS Manual
Of equal significance to the conceptual challenges facing the successful development of a widely accepted LAWS manual are the political and philosophical challenges facing such an effort. Unlike previous LOAC manuals, a LAWS manual would be developed and published in an environment that is exceptionally hostile to the weapons at issue. There is currently an international political campaign being waged by pessimists to ban LAWS—an effort which is supported not only by international NGOs and U.N. officials, but also by some nations. Moreover, some of the objections raised by the pessimists—objections based not in law but in philosophy and ethics—cannot be adequately addressed in a LAWS manual.
Political Challenges. Unlike past LOAC manual efforts, an effort to develop a LAWS manual lacks a large political constituency. The San Remo, AMW, and Tallinn manuals had in common (1) a significant coterie of experts and nations committed to applying LOAC norms to a particular means or method of warfare, and (2) the absence of organized NGO resistance or campaigns dedicated to abolishing the means or method of warfare at issue.
In the case of LAWS, however, international NGOs, U.N. officials, and even some nations have voiced a strong preference for regulating or banning LAWS. This raises a significant political challenge to broad acceptance of any norms developed by the experts convened to draft a LAWS manual. While unanimity among stakeholders is not a mandatory condition precedent for successfully developing a credible LOAC manual, the current international political environment is suboptimal for developing a widely accepted LAWS manual.
The Momentum Is to Ban, Not Normalize, LAWS. It may fairly be said that the amount of attention currently being paid to LAWS is primarily a product of objections raised by the international human rights and arms control communities. The impetus for the current debate over LAWS is in large part a result of a report published by Human Rights Watch (HRW) in November 2012, “Losing Humanity: The Case Against Killer Robots.” “Losing Humanity” raises the pessimists’ objections discussed above in regard to the alleged inability of LAWS to comply with LOAC, and concludes that “fully autonomous weapons should be banned and…governments should urgently pursue that end.”
HRW, along with several other NGOs, launched the Campaign to Stop Killer Robots in April 2013. As its name implies, the goal of the campaign is not normalization or mere regulation of LAWS, but rather to have them treated as illegal per se and prohibit the development of any weapon that falls within the campaign’s broad definition of “autonomous.”
The active opposition to the development of LAWS by international NGOs should not be discounted when calculating whether to initiate a process to draft a LAWS manual. Although sovereign states have the final say at meetings of the CCW, past NGO campaigns—organized, funded, and led by some of the same groups seeking to ban LAWS—have played a significant role in efforts to ban other controversial weapons.
Indeed, the current NGO campaign against LAWS is built on the success of past campaigns conducted by the same NGOs. Efforts to ban anti-personnel landmines (APLs) and cluster munitions are two major successes that serve as a template for the ongoing effort to ban LAWS.
CCW Hostility, U.N. Skepticism, and ICRC Reticence. In 1992, HRW and a handful of other NGOs founded the International Campaign to Ban Landmines, and within a mere five years helped develop a treaty that banned all anti-personnel landmines outright, which was adopted in Oslo by 89 states in September 1997. Currently, 162 nations are party to the Ottawa Convention. Thus in this manner have APLs been banned from the battlefield by the vast majority of the international community despite the fact that the CCW had already successfully regulated APLs via a protocol in 1980 and an amended protocol in 1996.
In 2003 HRW, Amnesty International, and other NGOs formed the Cluster Munition Coalition (CMC), which led to the successful completion five years later of the Convention on Cluster Munitions (CCM). The 98 nations that are party to the CCM have pledged to ban the development and use of cluster munitions, and to destroy their existing stockpiles. In this case, the CCW was actually in the process of negotiating a protocol that would regulate the design and use of cluster munitions. The CMC—impatient with the pace of the CCW negotiations and unhappy with merely regulating cluster munitions—simply convened its own international conference and drafted a treaty that bans the weapon outright. At a CCW meeting in 2011, CCM state parties, bolstered by the CMC, blocked consensus on the ongoing negotiations, successfully scuttling the CCW effort to “merely” regulate cluster munitions via a CCW protocol.
If, in 1992 or 2003, an effort was made to convene experts to develop a LOAC manual for APLs or cluster munitions, respectively, such would likely have been met with strenuous objections and organized resistance by key NGOs and nations inclined to ban those weapons. A similar environment would certainly confront any attempt today to develop a LAWS manual.
At past CCW meetings on LAWS, NGOs such as HRW, Amnesty International, and Article 36, uniformly advocated that LAWS be banned. Representatives from at least two NGO campaigns dedicated to banning LAWS spoke at the CCW meetings—the aforementioned Campaign to Stop Killer Robots and the International Committee for Robot Arms Control.
In addition to the NGO campaigns, some U.N. officials have expressed concern and even opposition to the development of LAWS:
- Angela Kane, the current U.N. High Representative for Disarmament Affairs, recently spoke out against LAWS, stating that she believes “that there cannot be a weapon that can be fired without human intervention” because such weapons would bring forth “a faceless war and I think that’s really terrible and so to my mind I think it should be outlawed.”
- U.N. special rapporteur Heyns envisions dire consequences in the event LAWS are developed: “[T]he introduction of such powerful yet controversial new weapons systems has the potential to pose new threats to the right to life. It could also create serious international division and weaken the role and rule of international law—and in the process undermine the international security system.” Unsurprisingly, Heyns calls on all nations to place an immediate moratorium “on at least the testing, production, assembly, transfer, acquisition, deployment and use” of LAWS.
In addition to hostile NGOs and skeptical U.N. officials, the ICRC has embraced neither the inevitability nor legality of LAWS. It has expressed concern “over the potential human cost of autonomous weapon systems and whether they are capable of being used in accordance with international humanitarian law.” When asked whether it supports calls for a moratorium or a ban on LAWS, the ICRC responded that it “has not joined these calls for now.” As further evidence of its skepticism, the ICRC appears to have adopted some of the views of LAWS pessimists, including the view that LAWS would “make life-and-death decisions” without human control. According to the ICRC, the deployment of LAWS “would reflect a paradigm shift and a major qualitative change in the conduct of hostilities” and calls on states not to employ LAWS unless compliance with LOAC can be “guaranteed”—a high standard indeed.
ICRC reticence regarding LAWS is significant since it played a key role in the development of other LOAC manuals, including the San Remo Manual:
The ICRC played a major role throughout. Apart from coorganizing the meeting held in Geneva, it offered its advice to the [San Remo International Institute of International Law] throughout the process, coordinated the drafting work and helped contribute to the administrative and secretarial work. The ICRC also convened three meetings of the rapporteurs, whose reports were the basis of discussion in the annual meetings, in order to organize the drafting of the “Explanation.”
In addition, the Program on Humanitarian Policy and Conflict Research consulted with the ICRC in regard to drafting the AMW manual. The International Group of Experts that drafted the Tallinn Manual invited the ICRC to be one of only three official observers to their process, and the ICRC “participated fully in the discussions and drafting of the Manual.”
The air of legitimacy that ICRC participation would bring to the process of developing a LAWS manual cannot be underestimated. The presence of an independent, globally known, well-respected institution devoted to upholding LOAC would greatly increase the acceptance of the LAWS manual by nations and their militaries. This is particularly the case here because, in all likelihood, the vast majority of nations will not be represented by the group of experts convened to draft the LAWS manual. ICRC participation would be helpful to counter criticism of “geographical bias” that may be leveled at the LAWS manual’s group of experts. Indeed, the expert group that developed the Tallinn Manual has been criticized for being geographically insular.
Skepticism Among Nations. Perhaps most concerning to an effort to develop a LAWS manual is the fact that some nations have already expressed a desire to ban or impose a moratorium on LAWS.
Unlike the process that birthed the Tallinn Manual, there does not currently appear to be any appetite among nations to accept LAWS as a reality in the way that cyber weapons had been accepted as reality. Indeed, the San Remo, AMW, and Tallinn manuals share in common the fact that they address methods and means of warfare that are widely—perhaps universally—accepted as reality by nations. LAWS, however, do not share that benefit and some nations believe that LAWS should not become a reality.
Several European nations have expressed opinions regarding LAWS that range from skepticism to hostility. Germany, for example, stated in 2014 that it “does not intend to have any weapon systems that take away the decision about life and death from men.” Sweden expressed doubt whether LAWS were “a desirable development for a military force” and insisted that humans “should never be ‘out of the loop.’” Croatia has pledged that LAWS “will not be a part of our military and defense doctrine since we perceive them as being contrary to our comprehension of fundamental bases of the international humanitarian law.” Austria went so far as to call “on all currently engaged in the development of such weapon systems to freeze these programmes and those deliberating to start such development not to do so.”
Other nations have been more outspoken. Pakistan, for example, expressed all of the following points in a single statement at the CCW: LAWS “are by nature unethical,” “cannot be programmed to comply with International Humanitarian Law,” “lower the threshold of going to war,” “undermine international peace and security,” “create an accountability vacuum,” and “amount to a situation of one-sided killing.” Unsurprisingly, Pakistan called for LAWS to be banned:
We should not let the blind quest for the ultimate weapon, driven by commercial interests of the military-industrial complex, get the better of us. The introduction of LAWS would be illegal, unethical, inhumane and unaccountable as well as destabilizing for international peace and security with grave consequences. Therefore, their further development and use must be pre-emptively banned through a dedicated Protocol of the CCW. Pending the negotiations and conclusions of a legally binding Protocol, the states currently developing such weapons should place an immediate moratorium on their production and use.
It is possible and even likely that more nations will raise objections to LAWS and join in efforts to ban them, either through the CCW process or through an ad hoc convention similar to those that resulted in the Ottawa Convention and the Cluster Munitions Convention. This may be particularly true of nations that possess neither the technology nor resources to develop or acquire LAWS themselves.
These key actors—international NGOs, U.N. officials, skeptical nations, and the ICRC—already have a favored forum at the CCW to debate and scrutinize LAWS. Unless a significant number of key nations and the ICRC can be convinced to convene a new forum to develop a LAWS manual, it may be fruitless to initiate the project.
Philosophical Challenges. Like armed drones, LAWS are criticized on philosophical and ethical grounds that have little or nothing to do with LOAC. No matter how well a LAWS manual demonstrates that LAWS may operate lawfully and in full compliance with LOAC, the manual will likely be unable to overcome the non-legal objections that have been raised by the pessimists.
One such philosophical objection is that LAWS place too much distance on the battlefield between the armies that employ them and the combatants being targeted. This “distancing” argument gained currency in the debate over the use of armed drones, and has now been raised in regard to LAWS. U.N. special rapporteur Heyns analogizes the two weapons systems in his April 2013 report, where he criticizes drones for enabling “those who control lethal force not to be physically present when it is deployed, but rather to activate it while sitting behind computers in faraway places, and stay out of the line of fire.” According to Heyns, the use of LAWS “would add a new dimension to this distancing, in that targeting decisions could be taken by the robots themselves.”
It is dubious to claim that ethical war may be conducted only if done at a certain distance, requiring combatants from both sides to expose themselves to lethal force. Yet this claim has been made forcefully in connection to drones. Specifically, it has been argued that drones upset the traditional “relationship of reciprocal risk” associated with warfare—meaning that a combatant’s right to kill another during combat is linked to the combatant’s willingness to himself be killed. To this line of thought, war waged when risk is only experienced by one party sitting at a safe distance is unethical. As one pessimist states: “[A]n innovation, technological or otherwise, that promises riskless warfare threatens the ethos of reciprocity. Without the possibility of a reciprocal response, the moral situation changes. An insurmountable tactical asymmetry takes us beyond the ethos of warfare.”
The “distancing” debate leads to a related objection: that removing human combatants from the battlefield and replacing them with LAWS will “lower the thresholds for nations to start wars.” Like drones, LAWS have been characterized as a technology that replaces humans, and thereby allegedly removes the fear of casualties during armed conflict. This substitution will, the argument goes, make nations less reluctant to go to war than they otherwise would be. Since LAWS “will offer less harm to us and them, then it is more likely that we’ll reach for them early, rather than spending weeks and months slogging at diplomacy.”
But LAWS are the latest weapon developed in the history of military technology that has resulted in increased distance between combatants. As conventional weapons have evolved over time—from the bow and arrow to the rifle, the cannon, the bomber, the cruise missile, and the drone—the distance between combatants has greatly expanded. LAWS are merely the latest development along this technological continuum.
These claims and counterclaims concerning ethics and the evolution of military technology are not well-suited to the development of a LOAC manual, which necessarily focuses on the law, not on philosophy or history. A LAWS manual will do little or nothing to resolve these debates. Knowing that, pessimists will continue to hold the upper hand within academia, the CCW process, and U.N. disarmament circles, and will continue to raise these objections which transcend the law.
Since a LAWS manual is ill-suited for addressing and rebutting these objections, it calls into question whether developing such a manual is worth the significant time and expense. A LAWS manual that is not widely accepted, or is deemed illegitimate due to its failure to address philosophical objections to LAWS, may not be worth the effort.
An influential group of NGOs, academics, and activists have their sights set on achieving an international ban on LAWS. They will pursue that goal either within traditional arms control mechanisms such as the CCW, or outside those mechanisms, as they have successfully done in past campaigns to ban APLs and cluster munitions. These efforts should not be ignored, but neither should they be viewed as a prohibitive obstacle to developing a LAWS manual. If it is true that LAWS are an inevitable feature of future battlefields, objections from human rights and arms control groups, however sincere, cannot forestall the day when combatants require guidance regarding their development and use.
A LAWS manual, properly conceived, would provide that guidance. Building on the precedent of LOAC manuals, the United States should institute and lead an effort to develop a LAWS manual as soon as practicable. The United States and like-minded nations should serve as the convener for a group of experts drawn from advanced militaries, legal academia, robotics engineers, computer programmers, and ethicists.
The option of equal representation in the group of experts should be extended to African, Asian, and South American nations, but it is more likely than not that most of the experts in robotics, weapons development, and international law will be drawn from institutions based in developed nations. As such, it behooves the convener to enlist the ICRC in its efforts. Time is of the essence in this regard. Although the ICRC has not taken a firm position on the legality of LAWS, its statements indicate a nascent skepticism. While ICRC participation is not indispensable, the organization played a central role in the development of past LOAC manuals, and its ability to bring legitimacy to the development process of a LAWS manual should not be discounted.
To be sure, the momentum in U.N. forums and among human rights and arms control activists is to ban LAWS, not normalize them. An effort to develop a LAWS manual would be swimming against the current of that momentum. But states that believe in the inevitability of LAWS, that LAWS may be used lawfully, and that LAWS may result in fewer civilian deaths, should support such an effort.
No modern military force is interested in waging war with inherently unlawful weapons. Military commanders and combat soldiers require guidance on when they may be held criminally responsible for the use or misuse of LAWS in battle. A LAWS manual will help clarify the application of LOAC to this particular class of weapon, and will provide designers, programmers, military planners, procurers, and combatants some measure of certainty regarding the lawful development and use of LAWS.
—Steven Groves is Bernard and Barbara Lomas Senior Research Fellow in the Margaret Thatcher Center for Freedom, of the Kathryn and Shelby Cullom Davis Institute for National Security and Foreign Policy, at The Heritage Foundation.