
Framing robot arms control

  • Original Paper
  • Published in: Ethics and Information Technology

Abstract

The development of autonomous, robotic weaponry is progressing rapidly. Many observers agree that banning the initiation of lethal activity by autonomous weapons is a worthy goal. Some disagree, on the grounds that robots may come to equal or exceed the ethical conduct of human soldiers on the battlefield. Those who seek arms-control agreements limiting the use of military robots face practical difficulties. One such difficulty concerns defining the notion of an autonomous action by a robot. Another concerns how to verify and monitor the capabilities of rapidly changing technologies. In this article we describe concepts from our previous work on autonomy and ethics for robots and apply them to military robots and robot arms control. We conclude with a proposal for a first step toward limiting the deployment of autonomous weapons capable of initiating lethal force.


Notes

  1. The U.S. Army Science Board (2002) describes a scale of ten levels of autonomous behavior beyond Manual—Remote Control (0). At the lowest level is Simple Automation (1), followed by Automated Tasks and Functions (2), Scripted Missions (3), Semi-Automated Missions/Simple Decision Making (4), Complex Missions Specific Reasoning (5), Dynamically Mission Adaptable (6), Synergistic Multi-Mission Reasoning (7), Human-Like Autonomy in a Mixed Team (8), Autonomous Teams with Unmanned Leader/Mission Manager (9), and Autonomous—Conglomerate (10). Most of these levels of autonomous behavior are based upon projected future technological capabilities. The discussion in this paper is directed at the development of systems capable of initiating lethal force at levels (4) and (5).

  2. Arkin does propose that a human supervisor could override the ethical governor.

  3. The U.S. military will wish to permit autonomous defensive systems such as anti-ballistic missile systems (e.g., Patriot) and ship defense systems (e.g., Phalanx).

  4. We are grateful to Jürgen Altmann and an anonymous reviewer for pointing this out.

References

  • Altmann, J. (2009). Preventive arms control for uninhabited military vehicles. In R. Capurro & M. Nagenborg (Eds.), Ethics for robotics. Heidelberg: AKA Verlag.

  • Arkin, R. (2009). Governing lethal behavior in autonomous robots. Boca Raton: Chapman and Hall/CRC.

  • Arkin, R. (2012). Presentations at the EPIIC international symposium on conflict in the 21st century. Tufts University, February 22–23.

  • Asaro, P. (2008). How just could a robot war be? In P. Brey, A. Briggle, & K. Waelbers (Eds.), Current issues in computing and philosophy (pp. 50–64). Amsterdam, The Netherlands: IOS Press.

  • Borenstein, J. (2008). The ethics of autonomous military robots. Studies in Ethics, Law, and Technology, 2(1), Article 2. doi:10.2202/1941-6008.1036. Available at: http://www.bepress.com/selt/vol2/iss1/art2.

  • Dahm, W. J. A. (2012). Killer robots are science fiction. The Wall Street Journal, February 16, 2011. Available online at http://online.wsj.com/article/SB10001424052970204883304577221590015475180.html. Accessed 13 October 2012.

  • Dancy, J. (2011). Contribution to discussion on “The Future of Moral Machines”, On the human. National Humanities Center. http://onthehuman.org/2011/12/the-future-of-moral-machines/. Accessed 1 May 2012.

  • Dennett, D. C. (1978). Brainstorms. Cambridge: MIT Press.

  • Finn, P. (2011). A future for drones: Automated killing. The Washington Post, September 19, 2011. Available online at http://www.washingtonpost.com/national/national-security/a-future-for-drones-automated-killing/2011/09/15/gIQAVy9mgK_story.html. Accessed 19 December 2011.

  • Fodor, J. A. (1983). The modularity of mind. Cambridge: MIT Press.

  • Gips, J. (1991). Towards the ethical robot. In K. G. Ford, C. Glymour, & P. J. Hayes (Eds.), Android epistemology (pp. 243–252). Cambridge: MIT Press.

  • Gormley, D. M. (2008). Missile contagion: Cruise missile proliferation and the threat to international security. London: Praeger.

  • Hollnagel, E., Woods, D. D., & Leveson, N. (Eds.). (2006). Resilience engineering: Concepts and precepts. Aldershot: Ashgate Publishing.

  • Kim, T.-G. (2010). Machine gun-armed robots to guard DMZ. The Korea Times, June 24, 2010. Available online at http://www.koreatimes.co.kr/www/news/biz/2010/06/123_68227.html. Accessed 19 December 2011.

  • Krishnan, A. (2009). Killer robots: Legality and ethicality of autonomous weapons. Burlington: Ashgate.

  • Lin, P. (2011). Drone-ethics briefing: What a leading robot expert told the CIA. The Atlantic, December 15, 2011. Available online at http://www.theatlantic.com/technology/archive/2011/12/drone-ethics-briefing-what-a-leading-robot-expert-told-the-cia/250060/. Accessed 19 December 2011.

  • Lokhorst, G., & van den Hoven, J. (2012). Responsibility for military robots. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics. Cambridge: MIT Press.

  • Matthias, A. (2011). Algorithmic moral control of war robots: Philosophical questions. Law, Innovation and Technology, 3(2), 279–301.

  • McCarthy, J., & Hayes, P. J. (1969). Some philosophical problems from the standpoint of artificial intelligence. In D. Michie, & B. Meltzer (Eds.), Machine Intelligence 4 (pp. 463–502). Edinburgh: Edinburgh University Press.

  • Sharkey, N. (2011). The automation and proliferation of military drones and the protection of civilians. Law, Innovation and Technology, 3(2), 229–240.

  • Sharkey, N. (2012). Killing made easy: From joysticks to politics. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics. Cambridge: MIT Press.

  • Singer, P. W. (2009). Wired for war. New York: Penguin Press.

  • Sparrow, R. (2009). Predators or plowshares? Arms control of robotic weapons. IEEE Technology and Society Magazine, 28(1), 25–29.

  • Sparrow, R. (2011). Robotic weapons and the future of war. In J. Wolfendale, & P. Tripodi (Eds.), New wars and new soldiers: Military ethics in the contemporary world (pp. 117–133). Surrey, UK & Burlington, VT: Ashgate.

  • Stahl, B. C. (2002). Can a computer adhere to the categorical imperative? A contemplation of the limits of transcendental ethics in IT. Paper presented at the international conference on systems research, informatics and cybernetics, Baden-Baden, Germany.

  • Taleb, N. N. (2007). The black swan: The impact of the highly improbable. New York: Random House.

  • U.S. Army Science Board. (2002). Ad Hoc study on human robot interface issues. Available online at http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA411834. Accessed 19 April 2012.

  • U.S. Army Medical Department. (2008). MHAT-IV. http://www.armymedicine.army.mil/reports/mhat/mhat_iv/mhat-iv.cfm. Accessed 20 December 2011.

  • U.S. Air Force. (2009). Unmanned Aircraft Systems Flight Plan 2009–2047. Available at http://www.govexec.com/pdfs/072309kp1.pdf. Accessed 19 April 2012.

  • U.S. Department of Defense. (2009). FY2009–2034 unmanned systems integrated roadmap. http://www.acq.osd.mil/psa/docs/UMSIntegratedRoadmap2009.pdf. Accessed 20 December 2011.

  • U.S. Department of Defense. (2012). Task force report: The role of autonomy in DoD systems. http://www.fas.org/irp/agency/dod/dsb/autonomy.pdf. Accessed 22 September 2012.

  • Wallach, W., & Allen, C. (2009). Moral machines: Teaching robots right from wrong. Oxford: Oxford University Press.

  • Woods, D. D., & Hollnagel, E. (2006). Joint cognitive systems: Patterns in cognitive systems engineering. Boca Raton: CRC Press.

Author information

Corresponding author

Correspondence to Wendell Wallach.

About this article

Cite this article

Wallach, W., Allen, C. Framing robot arms control. Ethics Inf Technol 15, 125–135 (2013). https://doi.org/10.1007/s10676-012-9303-0
