The U.S. Should Oppose the U.N.’s Attempt to Ban Autonomous Weapons


March 5, 2015
Steven Groves
Bernard and Barbara Lomas Senior Research Fellow
Steven works to protect and preserve American sovereignty, self-governance and independence as a leader of The Freedom Project.

As many as 40 nations are currently developing military robotics.[1] Indeed, some weapons already in use may be considered “autonomous” (or may be easily modified to be autonomous). These include Raytheon’s Phalanx Close-In Weapon System (CIWS), a “rapid-fire, computer-controlled, radar-guided gun system” designed to destroy incoming anti-ship missiles;[2] Israel Aerospace Industries’ Harpy and Harpy-2 missiles, described as a “fire and forget” autonomous weapon designed to destroy enemy radar stations;[3] MBDA’s Dual Mode Brimstone anti-armor missile;[4] and the Samsung Techwin SGR-A1 sentry gun.[5]

These lethal autonomous weapons systems (LAWS) are classified as such since “once activated, [they] can select and engage targets without further intervention by a human operator.”[6] LAWS are distinguished from precision-guided munitions (PGM), such as laser-guided bombs, in that a human operator generally selects a target and directly engages it with a PGM, while LAWS, once deployed by a human operator, select and engage targets on their own, based on their programming.

As the prevalence of LAWS has grown, so has a concerted effort to ban them. United Nations officials, nongovernmental organizations (NGOs), and even some nations have coalesced to ensure that LAWS are no longer developed or deployed by any nation. The impetus for the attempt to ban LAWS is in large part a result of a report published by Human Rights Watch in November 2012, “Losing Humanity: The Case against Killer Robots.” In its report, Human Rights Watch raised certain objections regarding the alleged inability of LAWS to comply with international law, and concluded that “fully autonomous weapons should be banned and that governments should urgently pursue that end.”[7]

Anti-LAWS NGO campaigns, such as the International Committee for Robot Arms Control (ICRAC) and the Campaign to Stop Killer Robots, have joined with Human Rights Watch and other prominent human rights NGOs to spearhead the effort to ban LAWS. The principal forum for their efforts is the Convention on Certain Conventional Weapons (CCW), which meets regularly in Geneva. The CCW held a meeting of experts in May 2014 to address LAWS, and a second meeting is scheduled for April 2015.

The United States should attend this meeting and make clear that it has no intention of banning LAWS or supporting a moratorium on their development.

LAWS Comply with the Law of Armed Conflict

Human Rights Watch, ICRAC, and other opponents of LAWS maintain that autonomous weapons are incapable of complying with the law of armed conflict (LOAC), the laws and principles regulating the lawful waging of war. The main legal objections raised by LAWS opponents focus on the core LOAC principles of distinction and proportionality.[8]

Distinction. The principle of distinction requires that parties to an armed conflict distinguish at all times between civilians and combatants and between civilian objects and military targets.[9] Parties may attack only targets of military value, such as tanks, warships, warplanes, and enemy combatants. Targeting civilians and civilian objects, such as schools, hospitals, and private homes, is prohibited.[10] Moreover, enemy combatants may not be targeted if they are hors de combat (literally “outside the fight,” meaning that they have surrendered or are defenseless due to wounds or sickness).[11]

Opponents maintain that LAWS, no matter how well designed, lack the sensors and other capabilities required to distinguish combatants from civilians, and from combatants who are hors de combat.[12] Opponents describe in stark terms the supposed gap between the abilities of human combatants and of LAWS to make such distinctions. For example, from Human Rights Watch’s “Losing Humanity”:

[A] frightened mother may run after her two children and yell at them to stop playing with toy guns near a soldier. A human soldier could identify with the mother’s fear and the children’s game and thus recognize their intentions as harmless, while a fully autonomous weapon might see only a person running toward it and two armed individuals. The former would hold fire, and the latter might launch an attack.[13]

LAWS opponents regularly place hypothetical LAWS in worst-case-scenario environments: Terminator-style cybernetic “killer robots” patrolling urban settings where combatants and civilians intermix.

In reality, once fully developed, LAWS are likely to be deployed in a wide variety of combat situations, including permissive environments where LAWS may attack a tank formation in a remote area (such as a desert) or a warship sailing on the high seas far from busy commercial navigation routes.[14] These scenarios are relevant to the principle of distinction. If there are no civilians or civilian objects present in the combat zone, LAWS cannot violate the principle.

Moreover, sensor technology has likely progressed (or will progress) to a point where LAWS are capable of complying with the principle of distinction, since “[m]odern sensors can, inter alia, assess the shape and size of objects, determine their speed, identify the type of propulsion being used, determine the material of which they are made, listen to the object and its environs, and intercept associated communications or other electronic emissions.”[15] Indeed, some weapons systems are already capable of making such assessments. The Dual Mode Brimstone anti-armor missile, currently in use by the United Kingdom’s Royal Air Force, uses semi-active laser and millimeter-wave radar guidance to search for targets autonomously while in flight. It strikes only armored vehicles that match a programmed signature, ignoring other vehicles such as cars and buses.[16] In this manner, the Brimstone’s technology permits it to distinguish between military targets and civilian objects, thereby satisfying the principle of distinction.

Proportionality. The principle of proportionality prohibits the launching of an attack “which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated.”[17] The principle of proportionality recognizes the fact that, no matter how carefully a combatant adheres to the principle of distinction, civilians at or near the location of an attack may be injured or killed. While civilian deaths and damage to civilian objects are a recognized and terrible consequence of armed conflict, combatants are not required to guarantee zero civilian casualties, only to reduce such damage as much as practicable. Proportionality therefore prohibits combatants from launching attacks against military targets if the “collateral” damage that will foreseeably occur to civilians and civilian objects is “excessive” compared to the military advantage that will be gained by the attack.

LAWS opponents contend that LAWS are incapable of adhering to the principle of proportionality. For that to be true, opponents must establish that there are no circumstances in which LAWS could be used in an attack without causing excessive civilian casualties. In other words, they must prove that every use of LAWS would result in civilian casualties excessive in relation to the military advantage gained by the attack. Such a notion is unsupportable on its face.

For example, LAWS could be used to patrol a no-fly zone from the air, hunting for enemy aircraft where no civilian aircraft are permitted to fly. The weapon may be programmed to recognize the profiles of enemy aircraft, their heat signatures, airspeed thresholds, and any number of other criteria that distinguish them from civilian aircraft. Attacking an enemy aircraft under such circumstances would adhere to the principle of proportionality, since the advantage of attacking the enemy aircraft could not be outweighed by a risk of excessive civilian casualties that is, by definition, zero. Likewise, similarly capable LAWS could operate underwater, patrolling for and attacking enemy submarines without risk of causing excessive collateral damage.
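The multi-criteria screening described above can be illustrated in code. The sketch below is purely hypothetical: the track fields, profile names, and thresholds are invented for the example and do not describe any fielded system. The point is only that engagement logic can be written so that every programmed criterion must independently indicate a military target before a weapon may engage.

```python
from dataclasses import dataclass

# Hypothetical sensor track; field names and thresholds are illustrative
# assumptions, not a description of any actual weapon system.
@dataclass
class Track:
    radar_profile: str     # airframe classification from the sensor suite
    heat_signature: float  # infrared intensity, arbitrary units
    airspeed_kts: float    # measured airspeed in knots

# Illustrative engagement criteria for a no-fly-zone patrol.
HOSTILE_PROFILES = {"fighter", "attack-helicopter"}
MIN_HEAT = 0.8        # afterburner-class signature
MIN_AIRSPEED = 400.0  # well above typical civilian traffic in the zone

def is_valid_target(track: Track) -> bool:
    """Engage only if ALL programmed criteria match; any doubt means hold fire."""
    return (track.radar_profile in HOSTILE_PROFILES
            and track.heat_signature >= MIN_HEAT
            and track.airspeed_kts >= MIN_AIRSPEED)

# A civilian airliner fails multiple criteria and is never engaged;
# a hostile fighter matches every criterion.
airliner = Track("airliner", 0.3, 450.0)
fighter = Track("fighter", 0.95, 600.0)
```

Conjunctive criteria of this kind fail safe: a target that is ambiguous on any single axis is simply not engaged, which is the behavior the distinction and proportionality principles demand in a zone closed to civilian traffic.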

Existing weapons that are arguably autonomous have a demonstrated ability to comply with the principle of proportionality. Israel’s Harpy-2 unmanned combat air vehicle, also known as the Harop, is a “fire and forget autonomous weapon” designed to destroy enemy radar stations.[18] The Harpy-2 autonomously loiters over the battlefield, automatically searches for and detects mobile or static anti-aircraft missile radar systems, and attacks by colliding with them and detonating.[19] LAWS opponents cannot credibly assert that the Harpy-2 constitutes a per se violation of LOAC. Even in a situation where an anti-aircraft radar is intentionally placed in a civilian area, attacks on those radar stations by the Harpy-2 do not automatically violate the principle of proportionality. Each such strike would require evaluation on a case-by-case basis.

Attempts to Ban LAWS

Although LAWS do not present a per se violation of LOAC, international NGOs, U.N. officials, and even some nations are actively seeking to ban them preemptively. Indeed, much of the attention currently paid to LAWS is a direct result of the objections raised by international human rights and arms control NGOs.

After the release of “Losing Humanity,” Human Rights Watch, along with several other NGOs, launched the Campaign to Stop Killer Robots in April 2013. As the name implies, the goal of the effort is not mere regulation of LAWS, but to have them treated as illegal per se and prohibit the development of any weapon that falls within the campaign’s broad definition of “autonomous.” Such efforts must not be ignored, as similar campaigns have been successful in the past.

Past NGO Campaigns. Opposition by international NGOs to the development of LAWS should not be discounted. Although sovereign states have the final say at meetings of the CCW and in ad hoc treaty negotiating forums, previous NGO campaigns—organized, funded, and led by some of the same NGOs now seeking to ban LAWS—played a significant role in efforts to ban other “controversial” weapons, such as anti-personnel landmines (APLs) and cluster munitions. Indeed, the current campaign against LAWS is modeled on the success of those earlier efforts.

In 1992, Human Rights Watch and a handful of other NGOs created the International Campaign to Ban Landmines, and within a mere five years helped develop the Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on their Destruction, also known as the Ottawa Convention, which banned APLs outright. The Ottawa Convention was adopted by 89 nations in September 1997 and currently boasts 162 parties.[20] This occurred despite the fact that at the time of its adoption two protocols, adopted in 1980 and 1996 through the CCW process, already placed comprehensive regulations on the design and use of APLs.[21] In other words, the CCW protocols did not go far enough for the NGOs and some nations, so they simply invented their own international treaty negotiating forum outside of the established CCW process and concocted their own treaty.[22]

In 2003, following up on their Ottawa Convention success, Human Rights Watch, Amnesty International, and other NGOs formed the Cluster Munition Coalition (CMC) and spearheaded the successful completion five years later of the Convention on Cluster Munitions (CCM).[23] The 88 parties to the CCM have pledged to ban the development and use of cluster munitions and to destroy their own existing stockpiles.[24] In this case, the CCW was still in the process of negotiating a protocol that would regulate the design and use of cluster munitions. The CMC and some nations—apparently impatient with the pace of the CCW negotiations and dissatisfied with merely regulating cluster munitions—simply convened another ad hoc international conference and drafted a treaty that banned cluster munitions.[25] Moreover, at a CCW meeting in 2011, the CCM parties, bolstered by the CMC, scuttled efforts to “merely” regulate cluster munitions via a CCW protocol.[26] In other words, the CCM parties and the CMC would countenance only a complete ban on cluster munitions.

Human rights and arms control NGOs hope to build on these successes by banning LAWS through the CCW process. If that cannot be achieved, these NGOs will likely convene yet another ad hoc treaty conference to draft a ban on LAWS regardless of what happens at the CCW. Debates regarding LAWS are already underway at the CCW, where NGOs such as Human Rights Watch and Amnesty International have uniformly expressed their desire that LAWS be banned. Human Rights Watch called for “a preemptive prohibition on fully autonomous weapon systems.”[27] Amnesty International called for “a prohibition on the use of LAWS, including such weapons when they are ‘less lethal’ and can result in death and serious injury.” Short of a ban, Amnesty urged all governments “to impose a moratorium on the development, transfer, deployment and use of LAWS.”[28] The Campaign to Stop Killer Robots and ICRAC also called for LAWS to be banned at the CCW meeting.

The United Nations and the International Committee of the Red Cross. In addition to NGOs, U.N. officials have expressed opposition to LAWS:

  • Angela Kane, the U.N. High Representative for Disarmament Affairs, recently spoke out against LAWS, stating her belief “that there cannot be a weapon that can be fired without human intervention” because such weapons would bring forth “a faceless war and I think that’s really terrible and so to my mind I think it should be outlawed.”[29]
  • Christof Heyns, the U.N. special rapporteur on extrajudicial, summary or arbitrary executions, envisions dire consequences if LAWS are developed: “[T]he introduction of such powerful yet controversial new weapons systems has the potential to pose new threats to the right to life. It could also create serious international division and weaken the role and rule of international law—and in the process undermine the international security system.”[30] Unsurprisingly, Heyns calls on all states to place an immediate moratorium “on at least the testing, production, assembly, transfer, acquisition, deployment and use” of LAWS.[31]

For its part, the International Committee of the Red Cross (ICRC) has been less than enthusiastic about LAWS. It has expressed concern “over the potential human cost of autonomous weapon systems and whether they are capable of being used in accordance with international humanitarian law.” When asked whether it supports calls for a moratorium or a ban on LAWS, the ICRC responded that it “has not joined these calls for now” (emphasis added).[32] As further evidence of its skepticism, the ICRC appears to have adopted some of the views of LAWS opponents, including the view that LAWS would “make life-and-death decisions” without human control.[33] The ICRC has stated that the deployment of LAWS “would reflect a paradigm shift and a major qualitative change in the conduct of hostilities,” and it has called on nations not to employ LAWS unless compliance with LOAC can be “guaranteed”—a high standard indeed.[34]

Skepticism Among Foreign Governments. Perhaps most troubling for the future viability of LAWS is that some nations have already expressed a desire to ban them or to impose a moratorium on their development.

During meetings at the CCW in May 2014, several European nations expressed opinions regarding LAWS that ranged from skepticism to hostility. Germany, for example, stated that it “does not intend to have any weapon systems that take away the decision about life and death from men.”[35] Sweden expressed doubt whether LAWS were “a desirable development for a military force” and insisted that humans “should never be ‘out of the loop.’”[36] Croatia pledged that LAWS “will not be a part of our military and defense doctrine since we perceive them as being contrary to our comprehension of fundamental bases of the international humanitarian law.”[37] Austria went so far as to call on “all currently engaged in the development of such weapon systems to freeze these programmes and those deliberating to start such development not to do so.”[38]

Other nations have been more outspoken. Pakistan expressed all of the following in a single statement at the CCW meeting: LAWS “are by nature unethical,” “cannot be programmed to comply with International Humanitarian Law,” “lower the threshold of going to war,” “undermine international peace and security,” “create an accountability vacuum,” and “amount to a situation of one-sided killing.” Unsurprisingly, Pakistan called for LAWS to be banned:

We should not let the blind quest for the ultimate weapon, driven by commercial interests of the military-industrial complex, get the better of us. The introduction of LAWS would be illegal, unethical, inhumane and unaccountable as well as destabilizing for international peace and security with grave consequences. Therefore, their further development and use must be pre-emptively banned through a dedicated Protocol of the CCW. Pending the negotiations and conclusions of a legally binding Protocol, the states currently developing such weapons should place an immediate moratorium on their production and use.[39]

It is possible and even likely that more nations will raise objections to LAWS and join in efforts to ban them, either through the CCW process or through an ad hoc convention process similar to those that resulted in the Ottawa Convention and the Convention on Cluster Munitions. This may be particularly true of nations that have neither the technology nor the resources to develop or acquire LAWS.

What Would a CCW Protocol on LAWS Look Like?

In 2013, ICRAC produced a white paper that proposed an “Autonomous Weapons Convention” (AWC) and described its principal components. Nations that join the AWC would pledge, inter alia, “not to develop, test, produce, stockpile, deploy, transfer, broker transfer, or use weapon systems capable of autonomous target selection and engagement.”[40] Compliance with that pledge and other obligations would be enforced by a treaty-implementing organization (TIO), which would “implement technical safeguards, and conduct inquiries and investigations when so mandated.”[41]

In order to verify that a party to the AWC has not used LAWS, the ICRAC white paper proposes that treaty parties create an “evidence trail” every time a weapon with potential autonomous capabilities is used in combat. Specifically, the white paper proposes the following to ensure that every weapon used during an engagement was fired by a human:

Proving that the command to select and to engage a particular target was the action of a particular person is difficult, but an evidence trail that such a command was given can be generated and made difficult to forge or tamper with. Such evidence could include video and other data that was presented to the operator, plus a video view that includes the operator’s hands on controls and the operator’s view of the console. The data record would also include the commands as received by the console and codes for the identities of the accountable operator and accountable commander, which might be encrypted in physical keys which they are personally accountable for controlling at all times, and which are needed in order to operate the weapon.[42]

Each time a weapon is used in an engagement, a unique “use of force identifier” (UFI) would be generated, time-stamped, and recorded in a tamper-proof manner. Such UFI records would be stored by the state party and “would be periodically downloaded during on-site inspections by the TIO,” which would hold the records in a repository.[43]
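One conceivable way to make such records “difficult to forge or tamper with” is a hash chain, in which each UFI record commits cryptographically to its predecessor, so that altering any earlier entry invalidates every later hash. The sketch below is a minimal illustration of that idea under the assumption of a simple append-only log; it is not ICRAC’s specification, and the field names are invented for the example.

```python
import hashlib
import json
import time

def append_ufi(log, operator_id, commander_id, command):
    """Append a use-of-force record whose hash chains to the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    record = {
        "ufi": len(log),          # sequential use-of-force identifier
        "timestamp": time.time(),
        "operator": operator_id,  # accountable operator (hypothetical field)
        "commander": commander_id,
        "command": command,
        "prev_hash": prev_hash,
    }
    # The hash covers every field plus the previous entry's hash, so
    # altering any earlier record invalidates all subsequent hashes.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    log.append(record)
    return record

def verify_chain(log):
    """Recompute every hash; return False if any record was altered."""
    prev = "0" * 64
    for rec in log:
        body = {k: v for k, v in rec.items() if k != "hash"}
        if rec["prev_hash"] != prev:
            return False
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != rec["hash"]:
            return False
        prev = rec["hash"]
    return True
```

An inspector holding only the final hash could detect any retroactive edit to the log, which is roughly the property the white paper demands—though, as argued below, persuading militaries to generate and surrender such records is another matter entirely.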

On its face, the ICRAC proposal is unworkable and overly intrusive. The idea that advanced militaries, such as that of the United States, would consent to collect and keep electronic data and video evidence of every instance when a potentially autonomous weapon was used during combat strains credulity. That such militaries would then permit a treaty organization to “download” data from every use of force for analysis borders on the absurd.

The U.S. Should Oppose Attempts to Ban LAWS

The United States should never ratify a treaty along the lines proposed by ICRAC, not only because of its intrusive nature, but primarily because the United States will likely develop autonomous weapons systems that strengthen U.S. national security. The Phalanx CIWS, the Dual Mode Brimstone, the Harpy, and the Samsung Techwin sentry gun are only the tip of the proverbial iceberg of autonomous or near-autonomous weapons that will shape the future battlefield.

The U.S. Department of Defense has already developed a responsible policy for the development of LAWS.[44] Directive 3000.09 requires that LAWS “be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” U.S. military personnel who operate LAWS must do so in accordance with LOAC and the applicable rules of engagement.

Although the United States is obligated to develop weapons that comply with LOAC, it is not required to preemptively prohibit the development of weapons that have not been proven to violate LOAC. Despite the arguments of LAWS opponents, there is no evidence that LAWS constitute per se violations of the principles of distinction and proportionality. To the contrary, weapons that qualify as autonomous—such as the Brimstone and the Harpy—have already been successfully deployed in combat with no reported violations of LOAC.

Moreover, LAWS have the potential to increase U.S. effectiveness on the battlefield while decreasing collateral damage and loss of human life.[45] Advanced sensors may be more precise in targeting a military objective than a manned system, and LAWS may perform better than humans in dangerous environments where a human combatant may act out of fear or rage.[46]

Preemptively banning a weapon is a questionable practice and a very rare one. CCW Protocol IV, for example, preemptively banned laser weapons specifically designed to cause permanent blindness. But even there, incidental or collateral blinding caused by the legitimate military use of a laser is not prohibited by the protocol. LAWS opponents want to preemptively ban any weapon that can acquire and engage a target without being commanded to do so by a human operator, regardless of whether the weapon is capable of adhering to LOAC. That is an unworkable standard.

For these reasons:

  • The United States should use its voice within the CCW process in Geneva to oppose any CCW protocol that would ban or regulate the development or use of LAWS by U.S. armed forces. The U.S. should make its position clear at the next meeting of the CCW regarding LAWS, scheduled for April 2015.
  • Congress should fund the research and development of autonomous technology. The capabilities of LAWS to increase U.S. national security have yet to be fully explored, and a preemptive ban or moratorium on such research is against U.S. interests.
  • The U.S. delegation to the CCW should take particular care to identify nations that are inclined to support a ban or moratorium on LAWS, and persuade those nations against that course of action. Moreover, the U.S. should align itself with nations at the CCW who are committed to the responsible development and use of LAWS.

The United States should oppose attempts at the CCW to ban LAWS, and should continue to develop LAWS in a responsible manner in order to keep U.S. armed forces at the leading edge of military technology.

—Steven Groves is Bernard and Barbara Lomas Senior Research Fellow in the Margaret Thatcher Center for Freedom, of the Kathryn and Shelby Cullom Davis Institute for National Security and Foreign Policy, at The Heritage Foundation.

[1] P. W. Singer, “Robots at War: The New Battlefield,” The Wilson Quarterly, Vol. 30 (Winter 2009).

[2] “Phalanx Close-In Weapon System (CIWS),” Raytheon, (accessed December 17, 2014).

[3] Israel Aerospace Industries, “Harpy,” (accessed December 17, 2014).

[4] MBDA Missile Systems, “Brimstone,” (accessed December 17, 2014).

[5] “Samsung Techwin SGR-A1 Sentry Guard Robot,” (accessed December 17, 2014).

[6] U.S. Department of Defense, “Autonomy in Weapon Systems,” Directive No. 3000.09, November 21, 2012, p. 13, (accessed February 11, 2014). Although there is no consensus definition for LAWS, the U.S. definition contains the elements common to most other definitions. See, e.g., International Committee of the Red Cross, “Report of the ICRC Expert Meeting on ‘Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian Aspects,’ 26–28 March 2014, Geneva: Meeting Highlights,” May 9, 2014, (accessed February 11, 2015). “There is no internationally agreed definition of autonomous weapon systems. For the purposes of the meeting, ‘autonomous weapon systems’ were defined as weapons that can independently select and attack targets, i.e. with autonomy in the ‘critical functions’ of acquiring, tracking, selecting and attacking targets.”

[7] Human Rights Watch, “Losing Humanity: The Case Against Killer Robots,” 2012, p. 1.

[8] The principles of distinction and proportionality are the central (but not only) areas of dispute regarding the ability of LAWS to comply with LOAC. Other contentious issues not addressed in this paper include, but are not limited to, individual criminal accountability, command responsibility, and state responsibility.

[9] Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I), June 8, 1977, arts. 48, 51(2), and 52(2), (accessed February 11, 2015).

[10] Civilians may lose their protected status, however, if they directly participate in hostilities. In such cases, they may be targeted as combatants. Likewise, civilian objects may lose their protected status if combatants use them, for instance, as a headquarters or as cover during combat.

[11] ICRC, “Customary International Humanitarian Law: Rule 47. Attacks Against Persons Hors de Combat,” (accessed February 11, 2015).

[12] See, for example, Noel E. Sharkey, “The Evitability of Autonomous Robot Warfare,” International Review of the Red Cross, Vol. 94 (Summer 2012), pp. 787 and 788–790. See also, Christof Heyns, “Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions,” U.N. Human Rights Council, April 9, 2013, ¶ 67, (accessed February 13, 2015). “There are several factors that will likely impede the ability of [lethal autonomous robots] to operate according to these rules in this regard, including the technological inadequacy of existing sensors, a robot’s inability to understand context, and the difficulty of applying of IHL language in defining non-combatant status in practice, which must be translated into a computer programme. It would be difficult for robots to establish, for example, whether someone is wounded and hors de combat, and also whether soldiers are in the process of surrendering” (footnotes omitted).

[13] Human Rights Watch, “Losing Humanity,” pp. 31–32.

[14] Michael N. Schmitt, “Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics,” Harvard National Security Journal, Vol. 11 (2013).

[15] Ibid.

[16] MBDA Missile Systems, “Brimstone,” and “Brimstone Advanced Anti-Armour Missile, United Kingdom,” (accessed February 11, 2015).

[17] Protocol I, art. 51(5)(b).

[18] Israel Aerospace Industries, “Harpy.” The “Harop” (Harpy-2) is the next generation of the Harpy missile.

[19] “Harop Loitering Munitions, UCAV, System, Israel,” (accessed February 11, 2014). According to some sources, Harpy-2 has the capability to have a human “in the loop” during the target selection and engagement phase. See Paul Scharre, “Autonomy, ‘Killer Robots,’ and Human Control in the Use of Force—Part II,” Just Security, July 9, 2014, (accessed February 11, 2015).

[20] Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on their Destruction, September 18, 1997, (accessed February 11, 2015).

[21] “Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices,” October 10, 1980, and “Protocol on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices as Amended on 3 May 1996,” May 3, 1996.

[22] Steven Groves and Ted R. Bromund, “The Ottawa Mine Ban Convention: Unacceptable on Substance and Process,” Heritage Foundation Backgrounder No. 2496, December 13, 2010.

[23] Cluster Munition Coalition, (accessed February 11, 2015).

[24] Diplomatic Conference for the Adoption of a Convention on Cluster Munitions, Dublin, May 19–30, 2008, (accessed February 11, 2015).

[25] Steven Groves and Ted R. Bromund, “The United States Should Not Join the Convention on Cluster Munitions,” Heritage Foundation Backgrounder No. 2550, April 28, 2011, pp. 19–21.

[26] Daryl G. Kimball, “CCW Review Conference Fails to Reach Consensus on Weak Cluster Munitions Protocol,” Arms Control Now blog, November 25, 2011, (accessed February 11, 2015).

[27] “Statement by Human Rights Watch to the Convention on Conventional Weapons, Informal Meeting of Experts on Lethal Autonomous Weapons Systems,” May 13, 2014, $file/NGOHRW_LAWS_GenStatement_2014.pdf (accessed February 11, 2015).

[28] Amnesty International, “CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS),” statement by Brian Wood, May 13, 2014,$file/NGOAmnesty_MX_LAWS_2014.pdf (accessed February 11, 2015).

[29] Ben Farmer, “Killer Robots a Small Step Away and Must Be Outlawed, Says Top UN Official,” The Telegraph, August 27, 2014, (accessed February 11, 2015).

[30] Heyns, “Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions,” ¶ 30.

[31] Ibid., ¶¶ 113 and 118.

[32] International Committee of the Red Cross, “Autonomous Weapon Systems–Q & A,” November 12, 2014, (accessed February 11, 2015).

[33] Ibid.

[34] Ibid.

[35] General Statement by Federal Republic of Germany, CCW Expert Meeting on Lethal Autonomous Weapon Systems, Geneva, May 13–16, 2014,$file/Germany+LAWS+2014.pdf (accessed February 11, 2015).

[36] “Remarks by Sweden at the Expert Meeting on LAWS in the CCW,” May 13, 2014,$file/Sweden+LAWS+2014.pdf (accessed February 11, 2015).

[37] “Closing Statement of the Republic of Croatia,” CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, May 16, 2014,$file/Croatia_MX_LAWS_FinalStatement_2014.pdf (accessed February 11, 2015).

[38] “Closing Statement of Austria,” CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems, May 16, 2014,$file/Austria_LAWS_FinalStatement_2014.pdf (accessed February 11, 2015).

[39] Statement by Pakistani Ambassador Zamir Akram at the Informal Meeting of Experts on Lethal Autonomous Weapon Systems (LAWS) in the framework of the Convention on Certain Conventional Weapons, May 13, 2014,$file/Pakistan+LAWS+2014.pdf (accessed February 11, 2015).

[40] Mark Gubrud and Jürgen Altmann, “Compliance Measures for an Autonomous Weapons Convention,” International Committee for Robot Arms Control, Working Paper No. 2, May 2013, p. 3, (accessed February 11, 2015).

[41] Ibid., pp. 3–4.

[42] Ibid., p. 6.

[43] Ibid.

[44] U.S. Department of Defense, “Autonomy in Weapon Systems,” Directive No. 3000.09, November 21, 2012, (accessed February 11, 2015).

[45] James Jay Carafano, “Autonomous Military Technology: Opportunities and Challenges for Policy and Law,” Heritage Foundation Backgrounder No. 2932, August 6, 2014.

[46] Schmitt, “Autonomous Weapon Systems and International Humanitarian Law,” p. 25.
