Military Robotics & Relationality: Criteria for Ethical Decision-Making

Chapter in Responsible Innovation 1

Abstract

In this article, we argue that the implementation of military robots must be preceded by careful reflection on the ethics of warfare: warfare must be regarded as a strictly human activity, for which human beings must remain responsible and in control, and ethical decision-making cannot be transferred to autonomous robots in the foreseeable future, since such robots are not capable of making ethical decisions. Non-autonomous robots require that humans authorize any decision to use lethal force, i.e., they require a ‘man-in-the-loop’. We propose a model of relationality for the moral attitude that is needed to confront the moral questions and dilemmas that future military operations using robots will face. This model provides two minimal criteria for ethical decision-making: non-binary thinking and reflexivity by means of rooting and shifting. In the second part of this article, we apply these criteria first to today’s human operators of non-autonomous military robots and then to tomorrow’s autonomous military robots, asking whether robots are capable of relationality and to what degree human operators make decisions on the basis of relationality. We conclude with what we take to be a possible, albeit limited, role for robotics in the military, with regard to both its current and its foreseeable future use.

Notes

  1.

    The Predator is an unmanned airplane that can remain airborne for 24 hours and is currently employed extensively in Afghanistan. Predator drones can fire Hellfire missiles and are flown by pilots located at a military base in the Nevada desert, thousands of miles from the battlefield. Their successor, the Reaper, which may eventually replace the F-16, has already been spotted in Afghanistan. This machine can carry 5,000 pounds of explosive devices, Hellfire missiles, or laser-guided bombs, and uses day-and-night cameras to navigate through cloud cover. This unmanned combat aerial vehicle is operated by two pilots at a ground control station, behind a computer at a safe distance from the war zone.

  2.

    To ensure that military robots perform reliably and lawfully, they could be required to pass a military Turing Test: “That means an autonomous system should be no worse than a human at taking decisions [about valid targets]” (Mick 2008). The test entails a comparative assessment of paired descriptions of morally significant behaviour, where one description covers the actions of a human being in an ethical dilemma and the other the actions of a machine faced with the same dilemma. If the machine is not identified as the less moral of the pair significantly more often than chance, it has passed the test (Mick 2008); a sketch of how this pass criterion could be scored follows below.
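    As an illustration only (this sketch is ours, not part of the chapter or of Mick 2008), the pass criterion can be read as a one-sided binomial test: each evaluator judgment is a Bernoulli trial, and the machine passes if it is not identified as the less moral actor significantly more often than chance. The function name, the example counts, and the 5% significance level below are all assumptions.

    ```python
    # Minimal sketch (assumed framing, not from the chapter): scoring the
    # "military Turing Test". Evaluators see pairs of behaviour descriptions,
    # one human and one machine facing the same ethical dilemma, and pick the
    # one they judge less moral.
    from scipy.stats import binomtest  # SciPy >= 1.7

    def passes_moral_turing_test(machine_judged_worse: int, n_pairs: int,
                                 alpha: float = 0.05) -> bool:
        # Null hypothesis: the machine is morally indistinguishable from the
        # human, so each "less moral" pick is a fair coin flip (p = 0.5).
        result = binomtest(machine_judged_worse, n_pairs,
                           p=0.5, alternative="greater")
        # Pass = no significant evidence that the machine acts less morally.
        return result.pvalue >= alpha

    # Hypothetical example: judged less moral in 58 of 100 pairs;
    # P(X >= 58 | p = 0.5) is about 0.07, so the machine would (just) pass.
    print(passes_moral_turing_test(58, 100))  # True
    ```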

  3.

    Singapore has announced its goal of building a military robot, named Urban Warrior, that can fight autonomously in urban environments like a human soldier, and conducted a contest to that end in 2008. South Korea and Israel have already deployed autonomous armed robot border guards. The South Korean system, the Samsung Techwin SGR-A1 stationary sentry robot, is capable of interrogating suspects, identifying potential enemy intruders, and firing its weapon autonomously (Krishnan 2009). The unmanned ground vehicle commissioned by the Israeli military, the Guardium, is an autonomous observation and target-intercept system.

  4.

    Although we can state that the robot is causally responsible, it is off the hook regarding moral responsibility. Some authors claim that fully autonomous robots can be considered moral agents (Dennett 1997), but this discussion is beyond the scope of this article, and such an attribution unnecessarily complicates the ascription of responsibility for immoral actions.

  5.

    If ethical theories do not provide moral principles that can be straightforwardly applied to get the right answer, what then is their role, if any, in applied ethics? Their role is, first, instrumental in discovering the ethical aspects of a problem or situation. Different ethical theories stress different aspects of a situation: consequentialism, for example, draws attention to how the consequences of actions may be morally relevant; deontological theories might draw attention to the moral importance of promises, rights, and obligations; and virtue ethics may remind us that certain character traits can be morally relevant. Ethical theories also suggest certain arguments or reasons that can play a role in moral judgments.

  6.

    Besides this substantive cause, military robots are also an expression of, for example, the ‘culture of fear’ and the increasing risk-averseness of our society.

  7.

    One of the most frequently cited arguments in support of drones is that they help avoid the practice of moral disengagement known as dehumanization, which is a common cause of violence, murder, and potentially genocide.

  8.

    Recent interviews with former members of the Taliban have also revealed an increase in PTSD and other war-related mental disorders suffered by their fighters as well as by the local population (Newsweek, 6 December 2010, www.newsweek.com/2010/12/06/do-the-taliban-get-ptsd.html). Setting standards for ethical decision-making in the military is thus a shared responsibility, as these standards affect not only those who are killed but also those doing the killing and those supporting them.

  9.

    As General Stanley McChrystal (the former commander of the ISAF mission) shared in an interview, while computers are able to process a wealth of information, they are not capable of understanding it. 3/11/10, www.idga.org/podcenter.cfm?externalid=826&mac=IDGA_OI_Featured_2010&utm_source=idga.org&utm_medium=email&utm_campaign=IDGAOptIn&utm_content=11/4/10.

  10.

    See also Levenson (1981) and Rotter (1966).

  11.

    However, military robots that are still controlled (as opposed to merely monitored) by human beings are indeed able to effectuate ethical decisions made by a remote operator.

References

  • Anderson, M., and S.L. Anderson. 2007. Machine ethics: Creating an ethical intelligent agent. AI Magazine 28(4): 15–26.

  • Aquino, K., A. Reed, S. Thau, and D. Freeman. 2007. A grotesque and dark beauty: How moral identity and mechanisms of moral disengagement influence cognitive and emotional reactions to war. Journal of Experimental Social Psychology 43: 385–392.

  • Arendt, H. 1968. The human condition. New York: Schocken.

  • Arendt, H. 2005. Responsibility and judgment. New York: Schocken.

  • Arendt, H. 2006. Eichmann in Jerusalem: A report on the banality of evil. New York: Penguin Classics.

  • Arkin, R.C. 2007. Governing lethal behavior: Embedding ethics in a hybrid deliberative/reactive robot architecture. Technical report GIT-GVU-07-11. Atlanta: Georgia Institute of Technology.

  • Bandura, A. 1999. Moral disengagement in the perpetration of inhumanities. Personality and Social Psychology Review 3: 193–209.

  • Bandura, A. 2002. Selective moral disengagement in the exercise of moral agency. Journal of Moral Education 31(2): 101–119.

  • Clarke, R. 1994. Asimov’s laws of robotics: Implications for information technology. Computer 27(1): 57–66.

  • Dennett, D.C. 1997. When HAL kills, who’s to blame? In HAL’s legacy: 2001’s computer as dream and reality, ed. D. Stork, 351–365. Cambridge: MIT Press.

  • Detert, J.R., L.K. Treviño, and V.L. Sweitzer. 2008. Moral disengagement in ethical decision making: A study of antecedents and outcomes. Journal of Applied Psychology 93(2): 374–391.

  • Donnelly, S.B. 2005. Long-distance warriors. Time Magazine, 4 December 2005.

  • Fitzsimonds, J.R., and T.G. Mahnken. 2007. Military officer attitudes towards UAV adoption: Exploring institutional impediments to innovation. Joint Force Quarterly 46: 96–103.

  • Grossman, D. 1996. On killing: The psychological cost of learning to kill in war and society. New York: Little, Brown and Company.

  • Gulam, H., and S.W. Lee. 2006. Uninhabited combat aerial vehicles and the law of armed conflicts. Australian Army Journal 3(2): 123–136.

  • Kaag, J., and W. Kaufman. 2009. Military frameworks: Technological know-how and the legitimization of warfare. Cambridge Review of International Affairs 22(4): 585–606.

  • Kaldor, M. 2004. New & old wars: Organized violence in a global era. Cambridge: Polity.

  • Kaldor, M., and B. Vashee (eds.). 1997. New wars: Restructuring the global military sector. London: Pinter.

  • Kenyon, H.S. 2006. Israel deploys robot guardians. Signal 60(7): 41–44.

  • Krishnan, A. 2009. Killer robots: Legality and ethicality of autonomous weapons. Farnham: Ashgate Publishing Limited.

  • Levenson, H. 1981. Differentiating among internality, powerful others, and chance. In Research with the locus of control construct: Vol. 1. Assessment methods, ed. H.M. Lefcourt, 15–63. New York: Academic Press.

  • Mastroianni, G.R. 2011. The person–situation debate: Implications for military leadership and civilian–military relations. Journal of Military Ethics 10(1): 2–16.

  • Matthias, A. 2004. The responsibility gap: Ascribing responsibility for the actions of learning automata. Ethics and Information Technology 6: 175–183.

  • McAlister, A.L. 2001. Moral disengagement: Measurement and modification. Journal of Peace Research 38: 87–99.

  • Mick, J. 2008. Can robots commit war crimes? Daily Tech, 29 February 2008. http://www.dailytech.com/Can+Robots+Commit+War+Crimes/article10917.htm

  • Moller, A.C., and E.L. Deci. 2010. Interpersonal control, dehumanization, and violence: A self-determination theory perspective. Group Processes and Intergroup Relations 13(1): 41–53.

  • Moshman, D. 2007. Us and them: Identity and genocide. Identity 7(2): 115–135.

  • Rotter, J.B. 1966. Generalized expectancies for internal versus external control of reinforcement. Psychological Monographs: General and Applied 80(1): 1–28. Washington: American Psychological Association.

  • Royakkers, L.M.M., and Q. van Est. 2010. The cubicle warrior: The marionette of digitalized warfare. Ethics and Information Technology 12: 289–296.

  • Shalev, A.Y. 2002. Acute stress reactions in adults. Biological Psychiatry 51: 532–543.

  • Sharkey, N. 2008. Cassandra or false prophet of doom: AI robots and war. IEEE Intelligent Systems 23(4): 14–17.

  • Singer, P.W. 2009. Wired for war: The robotics revolution and conflict in the twenty-first century. New York: The Penguin Press.

  • Slim, H. 2007. Killing civilians: Method, madness and morality in war. London: Hurst & Company.

  • Sparrow, R. 2007. Killer robots. Journal of Applied Philosophy 24(1): 62–77.

  • Sullins, J. 2010. RoboWarfare: Can robots be more ethical than humans on the battlefield? Ethics and Information Technology 12(3): 263–275.

  • Tanielian, T., and L.H. Jaycox (eds.). 2008. Invisible wounds of war: Psychological and cognitive injuries, their consequences, and services to assist recovery. Santa Monica: RAND Corporation.

  • Topolski, A. 2010. Peacekeeping without banisters: The need for new practices that go beyond just war. In At war for peace, ed. M. Forough, 49–56. Oxford: Inter-Disciplinary Press.

  • Treviño, L.K., and S.A. Youngblood. 1990. Bad apples in bad barrels: A causal analysis of ethical decision-making behavior. Journal of Applied Psychology 74: 378–385.

  • United States Air Force. 2009. Unmanned aircraft systems flight plan 2009–2047. Washington, DC. http://www.unmanned.co.uk/unmanned-systems-special-reports/usaf-unmanned-aircraft-systems-flight-plan-2009-2047/.

  • Van de Poel, I.R., and L.M.M. Royakkers. 2007. The ethical cycle. Journal of Business Ethics 71(1): 1–13.

  • Veruggio, G., and F. Operto. 2008. Roboethics: Social and ethical implications of robotics. In Springer handbook of robotics, ed. B. Siciliano and O. Khatib, 1499–1524. Berlin: Springer.

  • Wallach, W., and C. Allen. 2008. Moral machines: Teaching robots right from wrong. New York: Oxford University Press.

  • Yuval-Davis, N. 1998. What is transversal politics? Soundings 12(Summer): 94–98.

Acknowledgments

This research is part of the research program ‘Moral fitness of military personnel in a networked operation environment’, which is supported by the Netherlands Organization for Scientific Research (NWO) under grant number 313-99-110.

Author information

Correspondence to Lambèr Royakkers.

Copyright information

© 2014 Springer Science+Business Media Dordrecht

Cite this chapter

Royakkers, L., Topolski, A. (2014). Military Robotics & Relationality: Criteria for Ethical Decision-Making. In: van den Hoven, J., Doorn, N., Swierstra, T., Koops, B.J., Romijn, H. (eds) Responsible Innovation 1. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-8956-1_20
