Abstract
This paper makes the case for designing interactive robots with their expressive movement in mind. Because people are highly sensitive to physical movement and spatiotemporal affordances, well-designed robot motion can communicate, engage, and offer dynamic possibilities beyond a machine's surface appearance or pragmatic motion paths. We present techniques for movement-centric design, including character animation sketches, video prototyping, interactive movement explorations, Wizard of Oz studies, and skeletal prototypes. To illustrate our design approach, we discuss four case studies: a social head for a robotic musician, a robotic speaker-dock listening companion, a desktop telepresence robot, and a service robot performing assistive and communicative tasks. We then relate our approach to the design of non-anthropomorphic robots and robotic objects, a design strategy that could make real-world human-robot interaction more feasible.