Research on Positioning Accuracy of Mobile Robot in Indoor Environment Based on Improved RTABMAP Algorithm
Abstract
1. Introduction
- This paper presents an improved RTABMAP algorithm. The main idea is to use an extended Kalman filter (EKF) to fuse IMU measurements with wheel odometry, providing a new state-estimation model that yields more accurate pose estimates.
- The improved method is validated both on public indoor datasets and in real environments, including enclosed rooms and open corridors.
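In a ROS setup, this kind of IMU plus wheel-odometry EKF fusion is commonly configured through the robot_localization package cited in the references. The fragment below is an illustrative sketch only, not the authors' actual configuration: the topic names are placeholders, and the sensor-config flags (which fuse planar velocities from the wheels and yaw, yaw rate, and forward acceleration from the IMU) are one typical choice.

```yaml
# Illustrative robot_localization ekf node configuration (placeholder topics).
# Each 15-element flag list selects, in order:
# [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az]
frequency: 30
two_d_mode: true

odom0: /wheel/odometry          # wheel odometry: fuse vx, vy, yaw rate
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]

imu0: /imu/data                 # IMU: fuse yaw, yaw rate, forward accel
imu0_config: [false, false, false,
              false, false, true,
              false, false, false,
              false, false, true,
              true,  false, false]
```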
2. Related Work
2.1. IMU State Estimation Model
- 1. Acquire and group the data.
- 2. Calculate the average value of the data.
- 3. Obtain the Allan variance.
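The steps above can be sketched as follows. This is a generic overlapping Allan-variance computation for a gyroscope rate signal, not the authors' code; the function and variable names are illustrative.

```python
import numpy as np

def allan_variance(omega, fs, taus):
    """Overlapping Allan variance of an angular-rate signal.

    omega: 1-D array of rate samples (rad/s)
    fs:    sampling frequency (Hz)
    taus:  averaging times (s) at which to evaluate the variance
    """
    # Step 1-2: integrate the grouped rate samples into angle increments
    theta = np.cumsum(omega) / fs
    avars = []
    for tau in taus:
        m = int(tau * fs)  # number of samples per averaging cluster
        # Step 3: second difference of the integrated angle gives the
        # Allan variance at this tau
        d = theta[2 * m:] - 2.0 * theta[m:-m] + theta[:-2 * m]
        avars.append(np.sum(d ** 2) / (2.0 * tau ** 2 * len(d)))
    return np.array(avars)
```

On a log-log plot of the resulting deviation versus tau, the -1/2 slope segment identifies white measurement noise and the flat floor identifies bias instability, which is how the noise parameters in the table below are typically read off.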
2.2. IMU Fusion Method Based on Extended Kalman Filter
- Define the prior estimate, as shown in Equation (14).
- Define the prior estimation error covariance matrix, as shown in Equation (15).
- Define the posterior estimate, as shown in Equation (16).
- Define the posterior estimation error covariance matrix, as shown in Equation (18).
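The definitions above follow the standard EKF cycle: prior estimate, prior error covariance, Kalman gain, posterior estimate, posterior error covariance. A generic sketch of one cycle is given below; the paper's specific process and measurement models from Equations (14)-(18) are left as caller-supplied functions, and all names here are illustrative.

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F, H, Q, R):
    """One EKF cycle.

    x, P : previous posterior state estimate and error covariance
    u, z : control input and measurement
    f, h : process and measurement models (callables)
    F, H : their Jacobians, evaluated at the current estimate (callables)
    Q, R : process and measurement noise covariances
    """
    # Prior (predicted) estimate and its error covariance
    x_prior = f(x, u)
    Fk = F(x, u)
    P_prior = Fk @ P @ Fk.T + Q

    # Kalman gain
    Hk = H(x_prior)
    K = P_prior @ Hk.T @ np.linalg.inv(Hk @ P_prior @ Hk.T + R)

    # Posterior estimate and its error covariance
    x_post = x_prior + K @ (z - h(x_prior))
    P_post = (np.eye(len(x)) - K @ Hk) @ P_prior
    return x_post, P_post
```

In the paper's setting, f would propagate the pose with the IMU, and h would relate the state to the wheel-odometry measurement.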
2.3. RTABMAP Improvement Method
3. Experiment
3.1. Evaluation and Analysis of EuRoC Dataset
3.1.1. Running in Dataset V1_01_easy
3.1.2. Running in Dataset V1_03_difficult
3.2. Real Environment Verification and Analysis
3.2.1. Running in Closed Room Scenario 1
3.2.2. Running in Closed Room Scenario 2
3.2.3. Testing in Indoor Corridors
4. Results and Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Belter, D.; Nowicki, M.R. Optimization-based legged odometry and sensor fusion for legged robot continuous localization. Robot. Auton. Syst. 2019, 111, 110–124.
- Garcia-Fidalgo, E.; Ortiz, A. Vision-based topological mapping and localization methods: A survey. Robot. Auton. Syst. 2015, 64, 1–20.
- Angladon, V.; Gasparini, S.; Charvillat, V. An evaluation of real-time RGB-D visual odometry algorithms on mobile devices. J. Real-Time Image Process. 2019, 16, 1643–1660.
- Han, Y.; Tang, C.; Xiao, K. RGB-D dense map construction based on improved ORB-SLAM2 algorithm. J. Hunan Univ. 2023, 2, 52–62.
- Guclu, O.; Can, A.B. Fast and effective loop closure detection to improve SLAM performance. J. Intell. Robot. Syst. 2019, 93, 495–517.
- Wang, Z.; Wu, Y.; Niu, Q. Multi-sensor fusion in automated driving: A survey. IEEE Access 2020, 8, 2847–2868.
- Mourikis, A.I.; Roumeliotis, S.I. A multi-state constraint Kalman filter for vision-aided inertial navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 3565–3572.
- Li, Y.; Tang, X.; Li, Z. Multi-sensor information fusion for mobile robots. J. Northwestern Polytech. Univ. 2021, 39 (Suppl. 1), 59–65.
- Shi, J.; Zha, F.; Sun, L. A survey of visual-inertial SLAM for mobile robots. Robot 2020, 42, 734–748.
- Bloesch, M.; Omari, S.; Hutter, M.; Siegwart, R. Robust visual inertial odometry using a direct EKF-based approach. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 298–304.
- Bloesch, M.; Burri, M.; Omari, S.; Hutter, M.; Siegwart, R. Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback. Int. J. Robot. Res. 2017, 36, 1053–1072.
- Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe-based visual–inertial odometry using nonlinear optimization. Int. J. Robot. Res. 2014, 34, 314–334.
- Qin, T.; Li, P.; Shen, S. VINS-Mono: A robust and versatile monocular visual-inertial state estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
- Mur-Artal, R.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM: A versatile and accurate monocular SLAM system. IEEE Trans. Robot. 2015, 31, 1147–1163.
- Mur-Artal, R.; Tardós, J.D. ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Trans. Robot. 2017, 33, 1255–1262.
- Stumberg, L.V.; Cremers, D. DM-VIO: Delayed marginalization visual-inertial odometry. IEEE Robot. Autom. Lett. 2022, 7, 1408–1415.
- Liu, M.; Tao, Y.; Wang, Z. Research on simultaneous localization and mapping algorithm based on lidar and IMU. Math. Biosci. Eng. 2023, 20, 8954–8974.
- Labbé, M.; Michaud, F. RTAB-Map as an open-source lidar and visual simultaneous localization and mapping library for large-scale and long-term online operation. J. Field Robot. 2019, 36, 416–446.
- Ban, C.; Ren, G.; Wang, B. Research on self-adaptive EKF algorithm for robot attitude measurement based on IMU. Chin. J. Sci. Instrum. 2020, 41, 33–39.
- Wang, J.; Xu, S.; Cheng, N.; You, Y.; Zhang, X.; Tang, Z.; Yang, X. Orientation estimation algorithm for motion based on multi-sensor. Comput. Syst. Appl. 2015, 24, 134–139.
- Duan, X.; Jiang, W.; Yang, G. Research on calibration testing method of ADIS16488 MEMS IMU. J. Test Meas. Technol. 2018, 32, 19–25.
- Teng, Z.; Han, B.; Cao, J. PLI-SLAM: A tightly-coupled stereo visual-inertial SLAM system with point and line features. Remote Sens. 2023, 15, 4678.
- Jiang, C.; Chen, S.; Chen, Y. A MEMS IMU de-noising method using long short term memory recurrent neural networks (LSTM-RNN). Sensors 2018, 18, 3470.
- Gao, Y.; Shi, D.; Li, R. Gyro-Net: IMU gyroscopes random errors compensation method based on deep learning. IEEE Robot. Autom. Lett. 2022, 8, 1471–1478.
- Xu, Q.; Gao, Z.; Yang, C. High-accuracy positioning in GNSS-blocked areas by using the MSCKF-based SF-RTK/IMU/camera tight integration. Remote Sens. 2023, 15, 3005.
- Jiang, X.; Zhang, X.; Li, M. Random error analysis method for MEMS gyroscope based on Allan variance. J. Test Meas. Technol. 2017, 3, 190–195.
- Song, H.; Yang, P.; Xu, L. Analysis and processing on stochastic error of MEMS sensor. Chin. J. Sens. Actuators 2013, 12, 1719–1723.
- The Kalibr Visual-Inertial Calibration Toolbox. Available online: https://github.com/ethz-asl/kalibr (accessed on 27 June 2023).
- Imu_Utils: A ROS Package Tool to Analyze the IMU Performance. Available online: https://github.com/gaowenliang/imu_utils (accessed on 3 July 2023).
- Colonnier, F.; Della Vedova, L.; Orchard, G. ESPEE: Event-based sensor pose estimation using an extended Kalman filter. Sensors 2021, 21, 7840.
- Mallios, A.; Ridao, P.; Ribas, D.; Maurelli, F.; Pétillot, Y.R. EKF-SLAM for AUV navigation under probabilistic sonar scan-matching. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 4404–4411.
- Yan, Y.; Zhang, B.; Zhou, J.; Zhang, Y.; Liu, X. Real-time localization and mapping utilizing multi-sensor fusion and visual–IMU–wheel odometry for agricultural robots in unstructured, dynamic and GPS-denied greenhouse environments. Agronomy 2022, 12, 1740.
- Robot_Localization: A Package of Nonlinear State Estimation Nodes. Available online: https://github.com/cra-ros-pkg/robot_localization (accessed on 20 July 2023).
- Huai, Z.; Huang, G. Robocentric visual–inertial odometry. Int. J. Robot. Res. 2022, 41, 667–689.
- Jung, J.H.; Cha, J.; Chung, J.Y. Monocular visual-inertial-wheel odometry using low-grade IMU in urban areas. IEEE Trans. Intell. Transp. Syst. 2020, 23, 925–938.
- Zhan, H.; Weerasekera, C.S.; Bian, J.W.; Reid, I. Visual odometry revisited: What should be learnt? In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 4203–4210.
- EVO: Python Package for the Evaluation of Odometry and SLAM. Available online: https://github.com/MichaelGrupp/evo (accessed on 8 August 2023).
- Burri, M.; Nikolic, J.; Gohl, P. The EuRoC micro aerial vehicle datasets. Int. J. Robot. Res. 2016, 35, 1157–1163.
Method | Camera (Mono) | Camera (Stereo) | Camera (RGB-D) | IMU | Lidar (2D) | Lidar (3D) | Wheel | Pose (out) | Occupancy 2D (out) | Occupancy 3D (out) | Point Cloud (out)
---|---|---|---|---|---|---|---|---|---|---|---
VINS-Mono | ✓ | | | ✓ | | | | ✓ | | | Sparse
ORB-SLAM2 | ✓ | ✓ | ✓ | | | | | ✓ | | | Sparse
RTABMAP | | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Dense
DM-VIO | ✓ | | | ✓ | | | | ✓ | | | Dense
VIWO | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Dense
Parameter | Value
---|---
Sampling frequency of IMU sensor | 200 Hz
Gyroscope measurement noise | 0.00109 rad/s
Accelerometer measurement noise | 0.0234 m/s²
Gyroscope bias instability | 0.00000859 rad/s
Accelerometer bias instability | 0.000697 m/s²
Dataset ID | Method | Std | RMSE | Min | Median | Mean | Max
---|---|---|---|---|---|---|---
V101 | RTABMAP-VIWO | 0.04894 | 0.08539 | 0.00757 | 0.06353 | 0.06997 | 0.20492
V101 | RTABMAP | 0.11488 | 0.16415 | 0.01657 | 0.06311 | 0.11725 | 0.47280
Dataset ID | Method | Std | RMSE | Min | Median | Mean | Max
---|---|---|---|---|---|---|---
V103 | RTABMAP-VIWO | 0.06715 | 0.11993 | 0.02001 | 0.08238 | 0.09937 | 0.56543
V103 | RTABMAP | 0.16965 | 0.27017 | 0.01908 | 0.14518 | 0.21027 | 0.73483
V103 | ORB-SLAM2 | 0.10144 | 0.15709 | 0.01212 | 0.08991 | 0.11994 | 0.49687
Test ID | Method | Std | RMSE | Min | Median | Mean | Max
---|---|---|---|---|---|---|---
Test 1 | RTABMAP-VIWO | 0.03236 | 0.05736 | 0.00430 | 0.03713 | 0.04736 | 0.21107
Test 1 | RTABMAP | 0.03411 | 0.11640 | 0.02933 | 0.11352 | 0.11129 | 0.19791
Test 1 | ORB-SLAM2 | 0.16599 | 0.39933 | 0.06341 | 0.36843 | 0.36320 | 0.98348
Test 2 | RTABMAP-VIWO | 0.03336 | 0.06411 | 0.00685 | 0.04551 | 0.05475 | 0.14472
Test 2 | RTABMAP | 0.09341 | 0.22949 | 0.02395 | 0.19688 | 0.20961 | 0.39448
Test 2 | ORB-SLAM2 | 0.17345 | 0.34326 | 0.01885 | 0.29120 | 0.29621 | 0.84962
Test 3 | RTABMAP-VIWO | 0.22203 | 0.36744 | 0.04565 | 0.22955 | 0.29277 | 0.78441
Test 3 | RTABMAP | 0.37641 | 0.51839 | 0.01877 | 0.18227 | 0.35644 | 1.38872
Test 3 | ORB-SLAM2 | 0.88130 | 1.50840 | 0.25462 | 0.88784 | 1.22417 | 4.34613
Share and Cite
Zhou, S.; Li, Z.; Lv, Z.; Zhou, C.; Wu, P.; Zhu, C.; Liu, W. Research on Positioning Accuracy of Mobile Robot in Indoor Environment Based on Improved RTABMAP Algorithm. Sensors 2023, 23, 9468. https://doi.org/10.3390/s23239468