A Robotic Augmented Reality Virtual Window for Law Enforcement Operations

Conference paper

Part of Virtual, Augmented and Mixed Reality. Design and Interaction (HCII 2020), Lecture Notes in Computer Science, volume 12190.
Abstract

In room-clearing tasks, SWAT team members suffer from a lack of initial environmental information: knowledge of what is in a room and what relevance or threat level it represents for mission parameters. Normally this gap in situation awareness is rectified only upon room entry, forcing SWAT team members to rely on quick responses and near-instinctual reactions. This can lead to dangerously escalating situations or missed critical information which, in turn, can increase the likelihood of injury and even death. We therefore present an x-ray vision system for the dynamic scanning and display of room content, using a robotic platform to mitigate operator risk. The system maps a room with a robot-mounted stereo depth camera and, using an augmented reality (AR) system, presents the resulting geometric information from the perspective of each officer. This intervention has the potential to notably lower risk and increase officer situation awareness, all while team members remain in the relative safety of cover. Given these stakes, it is important to test the viability of this system both in isolation and in an operational SWAT team context.
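The per-officer display described above amounts to re-expressing the robot's room map in each officer's reference frame: points captured in the robot's (world) coordinates are transformed by the officer's tracked pose before rendering. A minimal sketch of that transform step in numpy follows; all names, poses, and points here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def transform_points(points_world, world_to_viewer):
    """Re-express Nx3 world-frame points in a viewer's (officer's) frame."""
    n = points_world.shape[0]
    homogeneous = np.hstack([points_world, np.ones((n, 1))])  # Nx4
    return (world_to_viewer @ homogeneous.T).T[:, :3]

# Hypothetical room points mapped by the robot, in world coordinates (metres)
room_points = np.array([[2.0, 0.0, 5.0],
                        [2.5, 1.0, 5.0]])

# Hypothetical officer pose: offset 1 m along z from the world origin, no rotation
officer_pose = make_pose(np.eye(3), np.array([0.0, 0.0, 1.0]))
officer_view = transform_points(room_points, officer_pose)
```

In a real pipeline the officer's pose would come from the AR headset's tracking system and update every frame, so each team member sees the same mapped geometry from their own vantage point.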


Change history

  • 10 July 2020

The original version of this chapter was revised: the acknowledgement had been inadvertently omitted and has since been added.


Acknowledgements

This material is based upon work supported by the National Science Foundation, under awards IIS-1937565, to J.E. Swan II and C.L. Bethel, and IIS-1320909, to J.E. Swan II. We acknowledge a productive collaboration with Mark Ballard, Chief of Police, Police Department, City of Starkville, MS, USA. We also acknowledge the contributions of Mohammed Safayet Arefin.

Author information

Correspondence to Cindy L. Bethel.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Phillips, N., Kruse, B., Khan, F.A., Swan II, J.E., Bethel, C.L. (2020). A Robotic Augmented Reality Virtual Window for Law Enforcement Operations. In: Chen, J.Y.C., Fragomeni, G. (eds.) Virtual, Augmented and Mixed Reality. Design and Interaction. HCII 2020. Lecture Notes in Computer Science, vol. 12190. Springer, Cham. https://doi.org/10.1007/978-3-030-49695-1_40


  • DOI: https://doi.org/10.1007/978-3-030-49695-1_40

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-49694-4

  • Online ISBN: 978-3-030-49695-1

  • eBook Packages: Computer Science, Computer Science (R0)
