
Engineering responsibility

  • Original Paper
  • Published in Ethics and Information Technology

Abstract

Many optimistic responses have been proposed to bridge the responsibility gaps that artificial systems threaten to create. This paper identifies a question that arises if this optimistic project proves successful. On a response-dependent understanding of responsibility, our responsibility practices themselves at least partially determine who counts as a responsible agent. On this basis, if AI or robot technology advances such that AI or robot agents become fitting participants in responsibility exchanges, then responsibility itself might be engineered. If we have good reason to think such technological advances are likely, then we should take steps to address the potential for engineering responsibility.


Notes

  1. See Nyholm (2020) for extensive discussion of the many ethical issues that current robot technology raises.

  2. Shoemaker’s pluralistic approach relies on distinguishing particular sentiments which correspond with each type of responsibility. I’m not convinced by that aspect of the view, so in what follows I give what I take to be a broader and less controversial description of each type.

  3. There is a natural connection between answerability and legal responsibility (Coeckelbergh, 2020); however, to reiterate, this paper is concerned solely with moral responsibility.

  4. See McKenna (2012) for a “conversational” approach to moral responsibility which is based on the natural structure of these kinds of attitudinal exchange.

  5. The contrast is sometimes expressed in terms of a “being responsible” vs. “holding responsible” distinction which can be seen in the following characteristic biconditional: “S is morally responsible (for action x) if and only if it would be appropriate to hold S morally responsible (for action x)” (Wallace, 1994, 91). Response-independent theorists give explanatory priority to the left side—the “being” side—of the biconditional, while response-dependent theorists give explanatory priority to the right side—the “holding” side—of the biconditional.

  6. In emphasizing the reactive attitudes, my argument thus focuses on accountability; however, the response-dependent approach to responsibility encompasses attributability and answerability as well. For example, a discussion of response-dependent attributability might focus on responses such as admiration or disdain.

  7. For an extensive argument in favor of reading Strawson’s view as response-dependent, see Shoemaker (2022).

  8. For readers skeptical of the notion of robots responding with resentment, again consider Star Wars: most viewers easily recognize when R2-D2’s bleeps and bloops are indignant.

  9. A referee correctly points out that Coeckelbergh’s and Danaher’s points need not entail moral agency but possibly only concern whether robots or artificial systems should be considered moral patients. As I discuss in the next section, the response-dependent approach might support a more radical shift which turns away from these kinds of moral status claims.

  10. Compare Joel Krueger and Lucy Osler’s (2019) discussion of external means of influencing our affective lives.

  11. Note that Strawson’s view has appealed to many as a response to the problem of determinism. Frischmann and Selinger intentionally elide nuances within the free will debate (2018, 301–3); thus, though my purpose here is not to confront directly their arguments, the present discussion suggests that a Strawsonian response to their idea of “engineered determinism” might be compelling.

  12. See McGeer (2014) for an interpretation of Strawson’s approach to responsibility in terms of a two-tiered indirect consequentialism.

  13. The “moral” in “moral ecology” derives from Vargas’s agency cultivation model, which includes an element of aiming toward moral improvement; however, the idea need not be moralized. We might instead think in terms of an “interpersonal” or “social” ecology.

  14. Interestingly, a relational ethics might undermine the (indirect) consequentialist aspects of Vargas’s and McGeer’s approaches to our responsibility practices. A closer examination of the relationship between relational ethics and indirect consequentialism would seem warranted because the latter arguably can avoid the two problems Coeckelbergh uses to motivate the former.

  15. This suggests an addendum to John-Stewart Gordon and David Gunkel’s (2022) recent discussion of technological advances and moral status. Consistent with Coeckelbergh, Gordon and Gunkel argue that innovations in intelligent and autonomous machines will allow us to reconsider our moral theories and concepts. Although Gordon and Gunkel’s discussion is in terms of technology prompting changes in theories, I think the idea would be put better in terms of seeking a reflective equilibrium between our intuitions about (and attitudes toward) the innovations, on one hand, and our moral (and moral responsibility) theories and concepts, on the other hand. I take this point to be consistent with Gordon and Gunkel’s view.

  16. A referee aptly wonders why the focus should be on engineers. Should not all members of the moral community be concerned to improve our moral responsibility practices? In response, I suggest that the broader moral question cannot be addressed until further considerations such as those in footnote 14 are pursued.

  17. See Frigo et al. (2021) for discussion of a similar program at Karlsruhe Institute of Technology. For an alternative model which does not prioritize character-based ethics but does seek to embed ethics throughout a related curriculum, see the discussion of Embedded EthiCS at Harvard University in Grosz et al. (2018).

  18. On the role phronesis can play within philosophy and technology more broadly, see Vallor (2016).


Author information

Correspondence to Nicholas Sars.




Cite this article

Sars, N. Engineering responsibility. Ethics Inf Technol 24, 32 (2022). https://doi.org/10.1007/s10676-022-09660-z
