ABSTRACT
We demonstrate an interactive conversation with the android ERICA. In this demonstration, the user can converse with ERICA on a number of topics. We demonstrate both ERICA's dialog management system and her eye gaze behavior, which is used to indicate attention and manage turn-taking.
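To make the role of gaze in turn-taking concrete, the following is an illustrative sketch only, not ERICA's actual implementation: a minimal finite-state policy in which the agent averts its gaze while holding the conversational floor and looks at the user to signal attention or turn release. The state names and the `gaze_target` function are hypothetical.

```python
# Hypothetical sketch of a gaze policy for turn-taking; not ERICA's real system.
from enum import Enum, auto

class Turn(Enum):
    ROBOT_SPEAKING = auto()   # robot holds the floor
    YIELDING = auto()         # robot is finishing its utterance, offering the turn
    USER_SPEAKING = auto()    # user holds the floor

def gaze_target(state: Turn) -> str:
    """Map an assumed turn-taking state to a gaze behavior."""
    if state is Turn.ROBOT_SPEAKING:
        return "avert"        # gaze aversion while planning and speaking
    # Direct gaze signals turn release (YIELDING) or attentive listening.
    return "user_face"
```

Such a mapping is deliberately simple; a deployed system would additionally condition gaze on speech activity detection and multiparty addressee estimation.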