
Signing on the tactile line: A multimodal system for teaching handwriting to blind children

Published: 08 August 2011

Abstract

We present McSig, a multimodal system for teaching blind children cursive handwriting so that they can create a personal signature. For blind people, handwriting is very difficult to learn: it is a near-zero-feedback activity that is needed only occasionally, yet in important situations, for example, to make an attractive and repeatable signature for legal contracts. McSig aids the teaching of signatures by translating digital ink from the teacher's stylus gestures into three non-visual forms: (1) audio pan and pitch represent the x and y movement of the stylus; (2) kinaesthetic information is provided to the student through a force-feedback haptic pen that mimics the teacher's stylus movement; and (3) a physical tactile line is created on the writing sheet by the haptic pen.
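The abstract specifies only that pan tracks horizontal movement and pitch tracks vertical movement, not the actual ranges or scales McSig uses. The sketch below is a minimal, hypothetical illustration of such a mapping; the function name, frequency bounds, and logarithmic pitch interpolation are all assumptions, not McSig's implementation.

```python
def ink_to_audio(x, y, width, height, f_low=220.0, f_high=880.0):
    """Map a stylus position on a (width x height) canvas to
    (stereo_pan, pitch_hz).

    Illustrative sketch only: frequency range and scaling are
    assumed, not taken from the McSig system.
    """
    if width <= 0 or height <= 0:
        raise ValueError("canvas dimensions must be positive")
    # Horizontal position -> stereo pan in [-1.0 (left), 1.0 (right)].
    pan = 2.0 * (x / width) - 1.0
    # Vertical position -> pitch; tablet y grows downward, so invert
    # it so that higher on the page sounds higher in pitch.
    t = 1.0 - (y / height)
    # Interpolate on a log scale so equal distances on the page
    # correspond to equal musical intervals.
    pitch_hz = f_low * (f_high / f_low) ** t
    return pan, pitch_hz
```

For example, the centre of the canvas maps to pan 0.0 and the geometric mean of the two frequency bounds, while the top-left corner maps to hard-left pan at the highest pitch.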

McSig has been developed over two major iterations of design, usability testing, and evaluation. The final step of the first iteration was a short evaluation with eight visually impaired children. The results suggested that McSig had the highest potential benefit for congenitally and totally blind children, and also indicated some areas where McSig could be enhanced. The second prototype incorporated significant modifications to the system, improving the audio, tactile, and force feedback. We then ran a detailed, longitudinal evaluation over 14 weeks with three of the congenitally blind children to assess McSig's effectiveness in teaching the creation of signatures. The results demonstrated McSig's effectiveness: all three children made considerable progress in learning to create a recognizable signature. By the end of ten lessons, two of the children could form a complete, repeatable signature unaided; the third could do so with a little verbal prompting. Furthermore, during this project we have learnt valuable lessons about providing consistent feedback across the different communication channels (manual interaction, the haptic device, and pen correction) that will be of interest to others developing multimodal systems.


Supplemental Material

tochi_mcsig_final.mpg (MPEG video, 105.8 MB)



• Published in

  ACM Transactions on Computer-Human Interaction, Volume 18, Issue 3
  July 2011
  208 pages
  ISSN: 1073-0516
  EISSN: 1557-7325
  DOI: 10.1145/1993060

      Copyright © 2011 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 8 August 2011
      • Accepted: 1 March 2011
      • Revised: 1 January 2011
      • Received: 1 July 2010
Published in TOCHI Volume 18, Issue 3


      Qualifiers

      • research-article
      • Research
      • Refereed
