ABSTRACT
For humans and robots to work together effectively, they need to be able to converse about abilities, goals, and achievements. Thus, we are developing an interaction infrastructure called the "Human-Robot Interaction Operating System" (HRI/OS). The HRI/OS provides a structured software framework for building human-robot teams, supports a variety of user interfaces, enables humans and robots to engage in task-oriented dialogue, and facilitates integration of robots through an extensible API.
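The abstract mentions registering heterogeneous team members (humans and robots) so they can converse about abilities and goals. As a rough illustration only, the sketch below shows one way such an ability registry might look; every class and method name here is a hypothetical assumption for illustration, not the actual HRI/OS API.

```python
# Hypothetical sketch of an ability registry for a human-robot team.
# All names (TaskManager, register, who_can, agent names) are illustrative
# assumptions, not the real HRI/OS interface.

class TaskManager:
    """Tracks which agents (human or robot) can perform which tasks."""

    def __init__(self):
        # Maps a task name to the list of agents able to perform it.
        self._abilities = {}

    def register(self, agent, tasks):
        """Declare that `agent` can perform each task in `tasks`."""
        for task in tasks:
            self._abilities.setdefault(task, []).append(agent)

    def who_can(self, task):
        """Return the agents able to perform `task` (empty list if none)."""
        return self._abilities.get(task, [])


tm = TaskManager()
tm.register("robonaut", ["inspect_weld", "grasp_panel"])
tm.register("astronaut_eva1", ["inspect_weld"])
print(tm.who_can("inspect_weld"))  # -> ['robonaut', 'astronaut_eva1']
```

In a real system the registry would sit behind middleware and support querying during dialogue ("who can inspect this weld?"); the sketch only captures the bookkeeping idea.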