Research article (Open Access)

Designing robots with movement in mind

Published: 28 February 2014

Abstract

This paper makes the case for designing interactive robots with their expressive movement in mind. As people are highly sensitive to physical movement and spatiotemporal affordances, well-designed robot motion can communicate, engage, and offer dynamic possibilities beyond the machines' surface appearance or pragmatic motion paths. We present techniques for movement-centric design, including character animation sketches, video prototyping, interactive movement explorations, Wizard of Oz studies, and skeletal prototypes. To illustrate our design approach, we discuss four case studies: a social head for a robotic musician, a robotic speaker dock listening companion, a desktop telepresence robot, and a service robot performing assistive and communicative tasks. We then relate our approach to the design of non-anthropomorphic robots and robotic objects, a design strategy that could make real-world human-robot interaction more feasible.
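Several of the techniques named above center on motion timing rather than surface appearance. As a minimal, hypothetical sketch (not code from the paper), the snippet below shows one way the classic animation principle of slow in, slow out could be applied to a single robot joint command; all names and parameters here are illustrative assumptions.

```python
# Illustrative sketch: easing a robot joint trajectory with a
# slow-in/slow-out profile, a timing idea borrowed from character
# animation. Hypothetical example; not the paper's implementation.
import math

def linear(t: float) -> float:
    """Constant-velocity profile: efficient, but reads as mechanical."""
    return t

def ease_in_out(t: float) -> float:
    """Sinusoidal slow-in/slow-out over normalized time t in [0, 1]:
    velocity tapers at both ends, which viewers tend to read as
    deliberate, lifelike motion."""
    return 0.5 - 0.5 * math.cos(math.pi * t)

def joint_trajectory(start_deg, goal_deg, duration_s, hz=50, easing=ease_in_out):
    """Sample a joint-angle trajectory at a fixed control rate (hz)."""
    steps = max(1, int(duration_s * hz))
    for i in range(steps + 1):
        t = i / steps  # normalized time in [0, 1]
        yield start_deg + (goal_deg - start_deg) * easing(t)

# Example: a 90-degree head turn over 1.2 s. With ease_in_out the head
# accelerates and decelerates gently instead of snapping between
# velocities as the linear profile would.
if __name__ == "__main__":
    for angle in joint_trajectory(0.0, 90.0, 1.2):
        pass  # send `angle` to the servo controller here
```

Compared with a constant-velocity ramp, the eased profile starts and stops gently, so the same gesture tends to be perceived as intentional rather than merely functional.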

