Abstract
Simultaneous localisation and mapping (SLAM) relies on low-cost on-board sensors such as cameras and inertial measurement units, so the accuracy of the system depends critically on how well the surroundings are visible to the cameras. A visibility-estimation strategy is proposed to augment ORB-SLAM2, considering the feature-extraction capability, the distribution of the extracted features across the image frame, and the ability of the algorithm to track features over time. The method is tested on challenging datasets and its output is evaluated against different visibility conditions. The proposed method is shown to react appropriately and consistently to ‘less visible’ conditions such as fog, sunlight, and rapid motion in real time, with minimal computational load.
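The abstract names three visibility cues: feature-extraction capability, spatial distribution of the extracted features, and feature-tracking ability. A minimal sketch of how such cues could be combined into a single score is shown below; the function name, the grid-occupancy measure of distribution, and the equal weighting are illustrative assumptions, not the paper's actual metric.

```python
import numpy as np

def visibility_score(keypoints, n_requested, n_tracked, frame_shape, grid=(4, 4)):
    """Heuristic scene-visibility score in [0, 1] (illustrative only).

    keypoints   : (N, 2) array of (x, y) pixel coordinates of extracted features
    n_requested : number of features the extractor was asked to find
    n_tracked   : number of features tracked from the previous frame
    frame_shape : (height, width) of the image in pixels
    """
    keypoints = np.asarray(keypoints, dtype=float).reshape(-1, 2)
    n = len(keypoints)
    # Cue 1 -- extraction capability: fraction of requested features found.
    extraction = min(n / n_requested, 1.0) if n_requested else 0.0
    # Cue 2 -- distribution: fraction of image-grid cells holding a feature.
    if n:
        h, w = frame_shape
        rows = np.clip((keypoints[:, 1] / h * grid[0]).astype(int), 0, grid[0] - 1)
        cols = np.clip((keypoints[:, 0] / w * grid[1]).astype(int), 0, grid[1] - 1)
        occupied = len(set(zip(rows.tolist(), cols.tolist())))
        distribution = occupied / (grid[0] * grid[1])
    else:
        distribution = 0.0
    # Cue 3 -- tracking ability: fraction of extracted features tracked.
    tracking = min(n_tracked / n, 1.0) if n else 0.0
    # Equal-weight average; the weights are a free design choice.
    return (extraction + distribution + tracking) / 3.0
```

With features spread evenly across the frame and all of them tracked, the score approaches 1; fog or motion blur would reduce all three cues at once, which is the behaviour the paper exploits.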
Supported by the Department of Aerospace Engineering and the Department of Automatic Control and Systems Engineering at the University of Sheffield, and by the UK Engineering and Physical Sciences Research Council (EPSRC) Programme Grant EP/S016813/1.
References
Blanco, J.L., Moreno, F.A., Gonzalez-Jimenez, J.: The Málaga urban dataset: high-rate stereo and LiDAR in a realistic urban scenario. Int. J. Robot. Res. 33(2), 207–214 (2014)
European Patent Office: Patents and self-driving vehicles. Technical report, European Patent Office (2018)
Fonder, M., Van Droogenbroeck, M.: Mid-Air: a multi-modal dataset for extremely low altitude drone flights. In: Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) (2019)
Li, W., et al.: InteriorNet: mega-scale multi-sensor photo-realistic indoor scenes dataset. In: British Machine Vision Conference (BMVC) (2018)
Mur-Artal, R., Tardós, J.D.: ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Trans. Rob. 33(5), 1255–1262 (2017)
Narasimhan, S.G., Nayar, S.K.: Shedding light on the weather. In: 2003 Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1, p. I (2003)
Pomerleau, D.: Visibility estimation from a moving vehicle using the RALPH vision system. In: Proceedings of Conference on Intelligent Transportation Systems, pp. 906–911 (1997)
Qin, T., Li, P., Shen, S.: VINS-Mono: a robust and versatile monocular visual-inertial state estimator. IEEE Trans. Rob. 34(4), 1004–1020 (2018)
© 2021 Springer Nature Switzerland AG
Cite this paper
Haggart, R., Aitken, J.M. (2021). Online Scene Visibility Estimation as a Complement to SLAM in UAVs. In: Fox, C., Gao, J., Ghalamzan Esfahani, A., Saaj, M., Hanheide, M., Parsons, S. (eds) Towards Autonomous Robotic Systems. TAROS 2021. Lecture Notes in Computer Science(), vol 13054. Springer, Cham. https://doi.org/10.1007/978-3-030-89177-0_38
Print ISBN: 978-3-030-89176-3
Online ISBN: 978-3-030-89177-0