
A Wearable Walking-Aid System For Visual-Impaired People Based on Monocular Top-View Transform

  • Conference paper
  • In: Transactions on Engineering Technologies

Abstract

This paper presents a wearable system that provides walking aid for visually impaired people in outdoor environments. Unlike many existing systems that rely on stereo cameras or a combination of other sensors, the proposed system uses only a single camera mounted at the user’s belly. One of the main difficulties of using a single camera for outdoor navigation is discriminating obstacles from a cluttered background. To solve this problem, this paper exploits the inhomogeneous re-sampling property of the top-view transform. By mapping the original image onto a virtual top-view plane, background edges in the near field are sub-sampled while obstacle edges in the far field are over-sampled. Morphological filters combined with connected-component analysis then enhance obstacle edges into large edge-blobs, whereas sparse background edges are filtered out. Based on the identified obstacles, a safe path is estimated by tracking a polar edge-blob histogram in the top-view domain, and the safe direction is delivered to the user through an audio message interface. The system has been tested in various outdoor scenes with complex road conditions, and its effectiveness has been confirmed.
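To make the processing chain above concrete, the following is a minimal sketch in Python with OpenCV of a monocular top-view pipeline of this kind. It is an illustration under stated assumptions, not the authors' implementation: the function name top_view_edge_blobs, the ground-plane calibration points, the Canny thresholds, the closing kernel, the blob-area threshold, and the 18-bin polar histogram are assumed values chosen only to show the structure (top-view warp, edge-blob extraction, polar edge-blob histogram).

    # Illustrative sketch only; calibration points and thresholds are assumptions.
    import cv2
    import numpy as np

    def top_view_edge_blobs(frame_bgr):
        h, w = frame_bgr.shape[:2]

        # 1. Top-view (inverse perspective) transform: map an assumed ground-plane
        #    trapezoid to a rectangular bird's-eye view. Near-field pixels are
        #    sub-sampled, far-field pixels are over-sampled.
        src = np.float32([[0.1 * w, h], [0.9 * w, h],
                          [0.6 * w, 0.55 * h], [0.4 * w, 0.55 * h]])
        dst = np.float32([[0, h], [w, h], [w, 0], [0, 0]])
        M = cv2.getPerspectiveTransform(src, dst)
        top_view = cv2.warpPerspective(frame_bgr, M, (w, h))

        # 2. Edge detection on the top-view image.
        gray = cv2.cvtColor(top_view, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)

        # 3. Morphological closing merges dense obstacle edges into blobs,
        #    while sparse background edges remain small.
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (7, 7))
        closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)

        # 4. Connected-component analysis keeps only large edge-blobs (obstacles).
        n, labels, stats, _ = cv2.connectedComponentsWithStats(closed)
        obstacle_mask = np.zeros_like(closed)
        for i in range(1, n):
            if stats[i, cv2.CC_STAT_AREA] > 400:      # assumed area threshold
                obstacle_mask[labels == i] = 255

        # 5. Polar edge-blob histogram: count obstacle pixels per 10-degree sector
        #    around the camera position (bottom centre of the top view) and pick
        #    the emptiest sector as the safe walking direction.
        ys, xs = np.nonzero(obstacle_mask)
        angles = np.degrees(np.arctan2(h - ys, xs - 0.5 * w))   # 0 = right, 90 = ahead
        hist, bin_edges = np.histogram(angles, bins=18, range=(0.0, 180.0))
        safe_bin = int(np.argmin(hist))
        safe_direction_deg = 0.5 * (bin_edges[safe_bin] + bin_edges[safe_bin + 1])
        return obstacle_mask, safe_direction_deg

In the complete system the returned angle would be translated into a short audio message for the user; here it is simply returned to the caller.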

Acknowledgment

This research was supported by the Next-Generation Information Computing Development Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education, Science and Technology (No. 2012M3C4A7032182). This work (Grant No. C0119991) was also supported by the Business for Cooperative R&D between Industry, Academy, and Research Institute, funded by the Korea Small and Medium Business Administration in 2013.

Author information


Corresponding author

Correspondence to Youngjoon Han.



Copyright information

© 2014 Springer Science+Business Media Dordrecht

About this paper

Cite this paper

Lin, Q., Han, Y. (2014). A Wearable Walking-Aid System For Visual-Impaired People Based on Monocular Top-View Transform. In: Kim, H., Ao, SI., Amouzegar, M. (eds) Transactions on Engineering Technologies. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-9115-1_38

  • DOI: https://doi.org/10.1007/978-94-017-9115-1_38

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-017-9114-4

  • Online ISBN: 978-94-017-9115-1

  • eBook Packages: Engineering, Engineering (R0)
