Abstract
We present McSig, a multimodal system for teaching blind children cursive handwriting so that they can create a personal signature. For blind people, handwriting is very difficult to learn as it is a near-zero feedback activity that is needed only occasionally, yet in important situations; for example, to make an attractive and repeatable signature for legal contracts. McSig aids the teaching of signatures by translating digital ink from the teacher's stylus gestures into three non-visual forms: (1) audio pan and pitch represent the x and y movement of the stylus; (2) kinaesthetic information is provided to the student through a force-feedback haptic pen that mimics the teacher's stylus movement; and (3) the haptic pen creates a physical tactile line on the writing sheet.
McSig has been developed over two major iterations of design, usability testing, and evaluation. The final step of the first iteration was a short evaluation with eight visually impaired children. The results suggested that McSig had the highest potential benefit for congenitally and totally blind children and also indicated some areas where McSig could be enhanced. The second prototype incorporated significant modifications to the system, improving the audio, tactile, and force feedback. We then ran a detailed, longitudinal evaluation over 14 weeks with three of the congenitally blind children to assess McSig's effectiveness in teaching the creation of signatures. The results demonstrated the effectiveness of McSig: all three children made considerable progress in learning to create a recognizable signature. By the end of ten lessons, two of the children could form a complete, repeatable signature unaided; the third could do so with a little verbal prompting. Furthermore, during this project, we have learnt valuable lessons about providing consistent feedback across different communication channels (manual interaction, the haptic device, and pen correction) that will be of interest to others developing multimodal systems.
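The audio mapping described in the abstract pairs stylus x-position with stereo pan and y-position with pitch. A minimal sketch of such a mapping is below; the tablet extents and the frequency range are illustrative assumptions, not values from the McSig system itself.

```python
# Hypothetical x -> pan, y -> pitch sonification sketch.
# TABLET_W/TABLET_H and the A3..A5 frequency range are assumed values
# for illustration only, not parameters of the actual McSig system.

TABLET_W, TABLET_H = 1000.0, 1000.0   # assumed writing-area extents
F_LOW, F_HIGH = 220.0, 880.0          # assumed pitch range in Hz (A3..A5)

def sonify(x, y):
    """Map a stylus position to (stereo pan, tone frequency).

    pan:  -1.0 (full left) .. +1.0 (full right), tracking x.
    freq: rises linearly with y, so upward strokes sound higher.
    """
    pan = 2.0 * (x / TABLET_W) - 1.0
    freq = F_LOW + (y / TABLET_H) * (F_HIGH - F_LOW)
    return pan, freq

# Example: the centre of the tablet maps to centred pan, mid-range pitch.
print(sonify(500.0, 500.0))  # -> (0.0, 550.0)
```

A real implementation would feed these values to a synthesis engine in the stylus-sampling loop, updating pan and pitch continuously as the pen moves.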