
JRM Vol.36 No.1 pp. 158-167 (2024)
doi: 10.20965/jrm.2024.p0158

Paper:

babypapa: Multiple Communication Robots to Enrich Relationship Between Parents and Child – Design and Evaluation of KANSEI Model to Control Closeness –

Satoru Suzuki, Noriaki Imaoka, and Takeshi Ando

Panasonic Holdings Corporation
2-7 Matsuba, Kadoma, Osaka 571-8502, Japan

Received: April 24, 2023
Accepted: October 4, 2023
Published: February 20, 2024
Keywords: communication robot, social robot, human–robot interaction, KANSEI, well-being
Abstract

There is a need to create a well-being-oriented society that improves people’s lives by enhancing their mental satisfaction. In this study, we examined changes in human emotions arising from human–robot interaction, using a communication robot called babypapa. We defined KANSEI as the sensitivity of emotional change to the robot’s behavior and established a KANSEI model. Specifically, to clarify the behaviors the robot should exhibit to make children feel close to it, we conducted play experiments between the robot and 3- to 4-year-old children and investigated the relationship between the robot’s behavior and the feeling of closeness. The results showed that both contact and noncontact behaviors of the robot contributed to the feeling of closeness, and we demonstrated a certain tendency relating the robot’s behavior to closeness.
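The abstract defines KANSEI as the sensitivity of emotional change to the robot’s behavior. As a rough illustration only, the sketch below fits a simple linear closeness model from per-session behavior counts; the abstract does not specify the paper’s model form, so the linear structure, the behavior categories, and all data values here are hypothetical assumptions, not the authors’ method.

    # Minimal sketch of a KANSEI-style sensitivity model (hypothetical;
    # the model form and all values are assumptions, not the paper's method).
    import numpy as np

    # Hypothetical play-session data: each row counts how often the robot
    # showed [contact behaviors, noncontact behaviors] in one session.
    X = np.array([[5, 2],
                  [1, 7],
                  [4, 4],
                  [0, 1],
                  [6, 3]], dtype=float)
    # Hypothetical closeness rating observed for each session.
    y = np.array([4.5, 3.0, 4.0, 1.5, 5.0])

    # Fit closeness ~= w_contact * contact + w_noncontact * noncontact + bias.
    # The fitted weights play the role of "sensitivity": how strongly each
    # behavior class moves the child's felt closeness toward the robot.
    A = np.hstack([X, np.ones((X.shape[0], 1))])  # append a bias column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)

    print("sensitivity to contact behavior:   ", w[0])
    print("sensitivity to noncontact behavior:", w[1])
    print("baseline closeness (bias):         ", w[2])

Under this assumed linear form, positive weights for both columns would be one concrete reading of the abstract’s finding that both contact and noncontact behaviors contribute to the feeling of closeness.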


Cite this article as:
S. Suzuki, N. Imaoka, and T. Ando, “babypapa: Multiple Communication Robots to Enrich Relationship Between Parents and Child – Design and Evaluation of KANSEI Model to Control Closeness –,” J. Robot. Mechatron., Vol.36 No.1, pp. 158-167, 2024.
