
Technology with No Human Responsibility?

Notes

  1. A more extended analysis of the law of agency as it might apply to artificial agents is found in Chopra and White (2012).

  2. (Accessed at http://www.amazon.com/Robot-Futures-Illah-Reza-Nourbakhsh/dp/0262018624/ref=sr_1_1?s=books&ie=UTF8&qid=1375020104&sr=1-1&keywords=future+of+robots, July 28, 2013).

  3. Elsewhere I have explored the varying conceptions of autonomy that are being used in this discourse; see M. Noorman and D.G. Johnson, “Negotiating Autonomy and Responsibility in Military Robots”, Ethics and Information Technology, forthcoming.

  4. Google has succeeded in convincing several municipalities to allow its so-called autonomous cars to operate in their areas, but these cars are not unmanned.

  5. Although the idea will not be taken up here, it is worth noting that the notion of an incomprehensible and uncontrollable technology needs to be unpacked, for many current technologies are incomprehensible and uncontrollable to some but not to others.

References

  • Allen, C., Smit, I., & Wallach, W. (2005). Artificial morality: Top-down, bottom-up, and hybrid approaches. Ethics and Information Technology, 7(3), 149–155.

  • Anderson, S. L. (2011). Machine metaethics. In M. Anderson & S. L. Anderson (Eds.), Machine ethics (pp. 21–27). New York: Cambridge University Press.

  • Anderson, M., & Anderson, S. L. (Eds.). (2011). Machine ethics. New York: Cambridge University Press.

  • Arkin, R. C. (2008). Governing lethal behavior: Embedding ethics in a hybrid deliberative/reactive robot architecture part I: Motivation and philosophy. In Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 121–128). IEEE.

  • Arkin, R. C. (2009). Ethical robots in warfare. IEEE Technology and Society Magazine, 28(1), 30–33.

  • Arkin, R. C. (2010). The case for ethical autonomy in unmanned systems. Journal of Military Ethics, 9(4), 332–341.

  • Asaro, P. M. (2012). A body to kick, but still no soul to damn: Legal perspectives on robotics. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: The ethical and social implications of robotics. Cambridge, MA: MIT Press.

  • Asaro, P. (2007). Robots and responsibility from a legal perspective. Proceedings of the IEEE.

  • Bijker, W. E., Hughes, T. P., & Pinch, T. (Eds.). (1987). The social construction of technological systems: New directions in the sociology and history of technology. Cambridge, MA: The MIT Press.

  • Cummings, M. L. (2004). Creating moral buffers in weapon control interface design. IEEE Technology and Society Magazine, 23(3), 28–33.

  • Cummings, M. L. (2006). Automation and accountability in decision support system interface design. Journal of Technology Studies, 32(1), 23–31.

  • De George, R. T. (2003). The ethics of information technology and business. Malden: Blackwell Publishing.

  • Hellström, T. (2013). On the moral responsibility of military robots. Ethics and Information Technology, 15(2), 99–107.

  • Johnson, D. G. (2005). The social construction of technology. In C. Mitcham (Ed.), The encyclopedia of science, technology, and ethics. Farmington Hills: Gale Group Publishing.

  • Johnson, D. G. (2006). Computer systems: Moral entities but not moral agents. Ethics and Information Technology, 8(4), 195–204.

  • Johnson, M., Bradshaw, J. M., Feltovich, P. J., Jonker, C. M., van Riemsdijk, B., & Sierhuis, M. (2011). The fundamental principle of coactive design: Interdependence must shape autonomy. In Coordination, organizations, institutions, and norms in agent systems VI. Heidelberg: Springer.

  • MacKenzie, D., & Wajcman, J. (1996). The social shaping of technology (2nd ed.). Buckingham: Open University Press.

  • Matthias, A. (2004). The responsibility gap: Ascribing responsibility for the actions of learning automata. Ethics and Information Technology, 6(3), 175–183.

  • Nagenborg, M., Capurro, R., Weber, J., & Pingel, C. (2008). Ethical regulations on robotics in Europe. AI & Society, 22(3), 349–366.

  • Nourbakhsh, I. R. (2013). Robot futures. Cambridge, MA: MIT Press.

  • Petersen, S. (2007). The ethics of robot servitude. Journal of Experimental & Theoretical Artificial Intelligence, 19(1), 43–54.

  • Santoro, M., Marino, D., & Tamburrini, G. (2008). Learning robots interacting with humans: From epistemic risk to responsibility. AI & Society, 22(3), 301–314.

  • Sparrow, R. (2007). Killer robots. Journal of Applied Philosophy, 24(1), 62–77.

  • Sullins, J. P. (2006). When is a robot a moral agent? International Review of Information Ethics, 6, 23–30.

  • Sullins, J. P. (2009). Artificial moral agency in technoethics. In R. Luppicini & R. Adell (Eds.), Handbook of research on technoethics (pp. 205–221). New York: IGI Global.

  • U.S. Department of Defense (2012). Task force report: The role of autonomy in DoD systems. http://www.fas.org/irp/agency/dod/dsb/autonomy.pdf. Accessed July 16, 2013.

  • Whitby, B. (2008). Sometimes it’s hard to be a robot: A call for action on the ethics of abusing artificial agents. Interacting with Computers, 20(3), 326–333.


Acknowledgments

Research for this paper was supported by the National Science Foundation under Grant No. 1058457. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation. This article was greatly improved by comments from Norm Bowie and Keith Miller on an earlier version.

Author information

Corresponding author

Correspondence to Deborah G. Johnson.


About this article


Cite this article

Johnson, D.G. Technology with No Human Responsibility?. J Bus Ethics 127, 707–715 (2015). https://doi.org/10.1007/s10551-014-2180-1
