Tell Me, What Are You Most Afraid Of? Exploring the Effects of Agent Representation on Information Disclosure in Human-Chatbot Interaction

  • Conference paper
Artificial Intelligence in HCI (HCII 2023)

Abstract

Self-disclosure is a key factor in successful health treatment, particularly when it comes to building a functioning patient-therapist relationship. To this end, chatbots may be a promising means of fostering such information provision. Several studies have shown that people disclose more information when interacting with a chatbot than when interacting with another human being. Whether and how the chatbot is embodied, however, seems to play an important role in the extent to which information is disclosed. Research shows that people disclose less when a chatbot is embodied with a human avatar than when it has no embodiment at all. Still, there is little evidence as to whether it is the embodiment with a human face that inhibits disclosure, or whether any type of face reduces the amount of shared information. The study presented in this paper thus investigates how the type of chatbot embodiment influences self-disclosure in human-chatbot interaction. We conducted a quasi-experimental study in which \(n=178\) participants interacted with one of three settings of a chatbot app, each differing in the humanness of the chatbot's embodiment (i.e., human vs. robot vs. disembodied). A subsequent discourse analysis explored differences in the breadth and depth of self-disclosure. Results show that non-human embodiment seems to have little effect on self-disclosure. Yet our data also show that, contrary to previous work, human embodiment may have a positive effect on the breadth and depth of self-disclosure.
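
To make the comparison concrete, the sketch below shows one way coded disclosure scores from such a three-condition design could be compared statistically. It is an illustration only, not the authors' discourse-analysis procedure: the per-condition group sizes, the coding of breadth (topic counts) and depth (intimacy ratings, in the spirit of social penetration theory [3]), and the data itself are all assumed for demonstration.

```python
# Illustrative sketch only -- NOT the authors' analysis pipeline.
# Assumes transcripts have already been coded for breadth (number of
# distinct topics disclosed) and depth (mean intimacy rating).
# All data below are synthetic placeholders; the split of the n=178
# participants across conditions is assumed.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)
groups = {"human": 60, "robot": 59, "disembodied": 59}  # assumed split

df = pd.DataFrame({
    "condition": np.repeat(list(groups), list(groups.values())),
    "breadth": rng.poisson(lam=4, size=178),   # topics disclosed per participant
    "depth": rng.uniform(1, 5, size=178),      # mean intimacy rating (1-5)
})

# Kruskal-Wallis compares the three embodiment conditions without
# assuming normally distributed disclosure scores.
for measure in ("breadth", "depth"):
    samples = [g[measure].to_numpy() for _, g in df.groupby("condition")]
    h, p = stats.kruskal(*samples)
    print(f"{measure}: H = {h:.2f}, p = {p:.3f}")
```

A rank-based test is chosen here because coded disclosure scores are typically ordinal and rarely normally distributed; in the paper itself, breadth and depth were examined via discourse analysis rather than such a quantitative comparison.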

Notes

  1. Online: https://venturebeat.com/2018/05/01/facebook-messenger-passes-300000-bots/ [accessed: February 10th 2023].

  2. Note: strictly speaking, the visual representation of a chatbot should be called an ‘agent’, since it is controlled by an algorithm, whereas a human-controlled visual appearance should be referred to as an ‘avatar’ [16].

  3. Online: https://www.headspace.com/ [accessed: February 10th 2023].

  4. Online: https://www.calm.com/ [accessed: February 10th 2023].

  5. Online: https://www.recoveryrecord.eu/ [accessed: February 10th 2023].

  6. Online: https://www.betterhelp.com/ [accessed: February 10th 2023].

  7. Online: https://woebothealth.com/ [accessed: February 10th 2023].

  8. Online: https://www.wysa.io/ [accessed: February 10th 2023].

  9. Online: https://www.x2ai.com/individuals [accessed: February 10th 2023].

References

  1. Adamopoulou, E., Moussiades, L.: Chatbots: History, technology, and applications. Mach. Learn. Appl. 2, 100006 (2020)

  2. Afifi, T., Steuber, K.: The revelation risk model (RRM): Factors that predict the revelation of secrets and the strategies used to reveal them. Commun. Monogr. 76(2), 144–176 (2009). https://doi.org/10.1080/03637750902828412

  3. Altman, I., Taylor, D.A.: Social Penetration: The Development of Interpersonal Relationships. Holt, Rinehart & Winston (1973)

  4. Appel, J., von der Pütten, A., Krämer, N.C., Gratch, J.: Does humanity matter? Analyzing the importance of social cues and perceived agency of a computer system for the emergence of social reactions during human-computer interaction. Adv. Hum. Comput. Interact. 2012 (2012)

  5. Araujo, T.: Living up to the chatbot hype: The influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Comput. Hum. Behav. 85, 183–189 (2018). https://doi.org/10.1016/j.chb.2018.03.051

  6. Astrid, M., Krämer, N.C., Gratch, J., Kang, S.H.: “It doesn’t matter what you are!” Explaining social effects of agents and avatars. Comput. Hum. Behav. 26(6), 1641–1650 (2010)

  7. Bailenson, J.N., Yee, N., Merget, D., Schroeder, R.: The effect of behavioral realism and form realism of real-time avatar faces on verbal disclosure, nonverbal disclosure, emotion recognition, and copresence in dyadic interaction. Presence: Teleoper. Virt. Environ. 15(4), 359–372 (2006)

  8. Bickmore, T.W., Picard, R.W.: Establishing and maintaining long-term human-computer relationships. ACM Trans. Comput. Hum. Interact. 12(2), 293–327 (2005)

  9. Chaves, A.P., Gerosa, M.A.: How should my chatbot interact? A survey on social characteristics in human-chatbot interaction design. Int. J. Hum. Comput. Interact. 37(8), 729–758 (2021)

  10. Ciechanowski, L., Przegalinska, A., Magnuski, M., Gloor, P.: In the shades of the uncanny valley: An experimental study of human-chatbot interaction. Future Gen. Comput. Syst. 92, 539–548 (2019)

  11. Cozby, P.C.: Self-disclosure: A literature review. Psychol. Bull. 79(2), 73 (1973)

  12. D’Alfonso, S.: AI in mental health. Curr. Opin. Psychol. 36, 112–117 (2020)

  13. De Visser, E.J., et al.: Almost human: Anthropomorphism increases trust resilience in cognitive agents. J. Exp. Psychol. Appl. 22(3), 331 (2016)

  14. Diederich, S., Brendel, A.B., Kolbe, L.M.: On conversational agents in information systems research: Analyzing the past to guide future work. In: Proceedings of WI, pp. 1550–1564. AIS (2019)

  15. Fitzpatrick, K.K., Darcy, A., Vierhile, M.: Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health 4(2), e7785 (2017)

  16. Fox, J., Ahn, S.J., Janssen, J.H., Yeykelis, L., Segovia, K.Y., Bailenson, J.N.: Avatars versus agents: A meta-analysis quantifying the effect of agency on social influence. Hum. Comput. Interact. 30(5), 401–432 (2015)

  17. Gambino, A., Fox, J., Ratan, R.A.: Building a stronger CASA: Extending the computers are social actors paradigm. Hum. Mach. Commun. 1, 71–85 (2020)

  18. Gardiner, P.M., et al.: Engaging women with an embodied conversational agent to deliver mindfulness and lifestyle recommendations: A feasibility randomized control trial. Patient Educ. Counsel. 100(9), 1720–1729 (2017)

  19. Gnewuch, U., Morana, S., Adam, M.T., Maedche, A.: Faster is not always better: Understanding the effect of dynamic response delays in human-chatbot interaction. In: Frank, U. (ed.) 26th European Conference on Information Systems: Beyond Digitization - Facets of Socio-Technical Change, ECIS 2018, Portsmouth, UK, 23–28 June 2018, p. 143975 (2018)

  20. Greene, K.: An integrated model of health disclosure decision-making. In: Uncertainty, Information Management, and Disclosure Decisions, pp. 242–269. Routledge (2015)

  21. Greene, K., Magsamen-Conrad, K., Venetis, M.K., Checton, M.G., Bagdasarov, Z., Banerjee, S.C.: Assessing health diagnosis disclosure decisions in relationships: Testing the disclosure decision-making model. Health Commun. 27(4), 356–368 (2012)

  22. Griffin, E.A.: A First Look at Communication Theory. McGraw-Hill (2003)

  23. Hill, J., Ford, W.R., Farreras, I.G.: Real conversations with artificial intelligence: A comparison between human-human online conversations and human-chatbot conversations. Comput. Hum. Behav. 49, 245–250 (2015)

  24. Ho, A., Hancock, J., Miner, A.S.: Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. J. Commun. 68(4), 712–733 (2018)

  25. Inkster, B., Sarda, S., Subramanian, V., et al.: An empathy-driven, conversational artificial intelligence agent (WYSA) for digital mental well-being: Real-world data evaluation mixed-methods study. JMIR mHealth uHealth 6(11), e12106 (2018)

  26. Joinson, A.N.: Knowing me, knowing you: Reciprocal self-disclosure in internet-based surveys. Cyber Psychol. Behav. 4(5), 587–591 (2001)

  27. Kang, S.H., Gratch, J.: Virtual humans elicit socially anxious interactants’ verbal self-disclosure. Comput. Anim. Virt. Worlds 21(3–4), 473–482 (2010)

  28. Knapp, M.L., Hall, J.A., Horgan, T.G.: Nonverbal Communication in Human Interaction. Cengage Learning (2013)

  29. Kreuter, F., Presser, S., Tourangeau, R.: Social desirability bias in CATI, IVR, and web surveys: The effects of mode and question sensitivity. Publ. Opin. Quart. 72(5), 847–865 (2008)

  30. Li, Z., Rau, P.L.P., Huang, D.: Self-disclosure to an IoT conversational agent: Effects of space and user context on users’ willingness to self-disclose personal information. Appl. Sci. 9(9), 1887 (2019)

  31. Lind, L.H., Schober, M.F., Conrad, F.G., Reichert, H.: Why do survey respondents disclose more when computers ask the questions? Publ. Opin. Quart. 77(4), 888–935 (2013)

  32. Lucas, G.M., Gratch, J., King, A., Morency, L.P.: It’s only a computer: Virtual humans increase willingness to disclose. Comput. Hum. Behav. 37, 94–100 (2014). https://doi.org/10.1016/j.chb.2014.04.043

  33. Lucas, G.M., et al.: Reporting mental health symptoms: Breaking down barriers to care with virtual human interviewers. Front. Robot. AI 4, 51 (2017)

  34. MacDorman, K.F., Green, R.D., Ho, C.C., Koch, C.T.: Too real for comfort? Uncanny responses to computer generated faces. Comput. Hum. Behav. 25(3), 695–710 (2009)

  35. MacDorman, K.F., Ishiguro, H.: The uncanny advantage of using androids in cognitive and social science research. Interact. Stud. 7(3), 297–337 (2006)

  36. Monnier, D.: Woebot: A continuation of and an end to psychotherapy? Psychotherapies 40(2), 71–78 (2020)

  37. Mori, M.: The uncanny valley: The original essay by Masahiro Mori. IEEE Spectrum (1970)

  38. Mori, M., MacDorman, K.F., Kageki, N.: The uncanny valley [from the field]. IEEE Robot. Automat. Magaz. 19(2), 98–100 (2012)

  39. Nass, C., Moon, Y., Green, N.: Are machines gender neutral? Gender-stereotypic responses to computers with voices. J. Appl. Soc. Psychol. 27(10), 864–876 (1997)

  40. Nimavat, K., Champaneria, T.: Chatbots: An overview. Types, architecture, tools and future possibilities. Int. J. Sci. Res. Dev. 5(7), 1019–1024 (2017)

  41. Nowak, K.L., Rauh, C.: The influence of the avatar on online perceptions of anthropomorphism, androgyny, credibility, homophily, and attraction. J. Comput. Mediat. Commun. 11(1), 153–178 (2005)

  42. Oh, J., Jang, S., Kim, H., Kim, J.J.: Efficacy of mobile app-based interactive cognitive behavioral therapy using a chatbot for panic disorder. Int. J. Med. Inf. 140, 104171 (2020)

  43. Omarzu, J.: A disclosure decision model: Determining how and when individuals will self-disclose. Personal. Soc. Psychol. Rev. 4(2), 174–185 (2000)

  44. Pickard, M.D., Roster, C.A.: Using computer automated systems to conduct personal interviews: Does the mere presence of a human face inhibit disclosure? Comput. Hum. Behav. 105, 106197 (2020)

  45. Pickard, M.D., Roster, C.A., Chen, Y.: Revealing sensitive information in personal interviews: Is self-disclosure easier with humans or avatars and under what conditions? Comput. Hum. Behav. 65, 23–30 (2016). https://doi.org/10.1016/j.chb.2016.08.004

  46. Rosenthal-von der Pütten, A.M., Krämer, N.C., Hoffmann, L., Sobieraj, S., Eimler, S.C.: An experimental study on emotional reactions towards a robot. Int. J. Soc. Robot. 5(1), 17–34 (2013). https://doi.org/10.1007/s12369-012-0173-8

  47. Ruane, E., Birhane, A., Ventresque, A.: Conversational AI: Social and ethical considerations. In: AICS, pp. 104–115 (2019)

  48. Sah, Y.J., Peng, W.: Effects of visual and linguistic anthropomorphic cues on social perception, self-awareness, and information disclosure in a health website. Comput. Hum. Behav. 45, 392–401 (2015)

  49. Thaler, M., Schlögl, S., Groth, A.: Agent vs. avatar: Comparing embodied conversational agents concerning characteristics of the uncanny valley. In: 2020 IEEE International Conference on Human-Machine Systems (ICHMS), pp. 1–6. IEEE (2020)

  50. Vaidyam, A.N., Wisniewski, H., Halamka, J.D., Kashavan, M.S., Torous, J.B.: Chatbots and conversational agents in mental health: A review of the psychiatric landscape. Canadian J. Psychiat. 64(7), 456–464 (2019)

  51. Verhagen, T., Van Nes, J., Feldberg, F., Van Dolen, W.: Virtual customer service agents: Using social presence and personalization to shape online service encounters. J. Comput. Mediat. Commun. 19(3), 529–545 (2014)

Author information

Corresponding author

Correspondence to Stephan Schlögl.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Stock, A., Schlögl, S., Groth, A. (2023). Tell Me, What Are You Most Afraid Of? Exploring the Effects of Agent Representation on Information Disclosure in Human-Chatbot Interaction. In: Degen, H., Ntoa, S. (eds) Artificial Intelligence in HCI. HCII 2023. Lecture Notes in Computer Science, vol 14051. Springer, Cham. https://doi.org/10.1007/978-3-031-35894-4_13

  • DOI: https://doi.org/10.1007/978-3-031-35894-4_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35893-7

  • Online ISBN: 978-3-031-35894-4

  • eBook Packages: Computer Science, Computer Science (R0)
