Article

Integrated Indoor Positioning System of Greenhouse Robot Based on UWB/IMU/ODOM/LIDAR

Zhenhuan Long, Yang Xiang, Xiangming Lei, Yajun Li, Zhengfang Hu and Xiufeng Dai
College of Mechanical and Electrical Engineering, Hunan Agricultural University, Changsha 410128, China
*
Author to whom correspondence should be addressed.
Sensors 2022, 22(13), 4819; https://doi.org/10.3390/s22134819
Submission received: 13 May 2022 / Revised: 17 June 2022 / Accepted: 23 June 2022 / Published: 25 June 2022
(This article belongs to the Section Remote Sensors)

Abstract
Conventional mobile robots employ LIDAR for indoor global positioning and navigation, which imposes strict requirements on the ground environment. Under the complicated ground conditions of a greenhouse, the odometer (ODOM) readily accumulates error caused by wheel slip during long-term operation, which decreases the accuracy of robot positioning and mapping. To solve this problem, an integrated positioning system based on UWB (ultra-wideband)/IMU (inertial measurement unit)/ODOM/LIDAR is proposed. First, UWB/IMU/ODOM data are integrated by the Extended Kalman Filter (EKF) algorithm to obtain estimated positioning information. Second, LIDAR is integrated with the established two-dimensional (2D) map by the Adaptive Monte Carlo Localization (AMCL) algorithm to achieve global positioning of the robot. Experiments indicate that the integrated positioning system based on UWB/IMU/ODOM/LIDAR effectively reduces the accumulated positioning error of the robot in the greenhouse environment. At three moving speeds (0.3 m/s, 0.5 m/s, and 0.7 m/s), the maximum lateral error is below 0.1 m, and the maximum lateral root mean square error (RMSE) is 0.04 m. For global positioning, the RMSEs of the x-axis direction, the y-axis direction, and the overall positioning are 0.092, 0.069, and 0.079 m, respectively, and the average positioning time of the system is 72.1 ms. This is sufficient for robot operation in greenhouse scenarios that require precise positioning and navigation.

1. Introduction

Stable and reliable absolute positioning information can be obtained from the global navigation satellite system (GNSS) in outdoor environments [1], whereas in the indoor conditions of greenhouses, GNSS cannot be applied for accurate positioning due to signal occlusion. At present, the positioning and navigation methods of greenhouse robots primarily consist of guide-rail navigation, machine vision navigation, ultrasonic navigation, and multi-sensor fusion navigation [2,3,4,5]. Feng Qingchun et al. [6] designed a greenhouse robot that walks autonomously and picks tomatoes on a preset track. This robot positions itself with high accuracy and reliability and can achieve centimeter-level positioning; however, specific tracks must be laid in advance, which is costly and inflexible [7]. Li Tianhua et al. [8] proposed controlling the viewing direction of a camera through a pan-tilt unit so that the visual axis of the camera always remains parallel to the road; machine vision was then adopted to extract the pixel coordinates of the horizontal center point at the end of the road to obtain navigation information. The maximum deviation when using the telephoto camera for straight-line driving in the greenhouse was lower than 0.15 m. Navigation based on machine vision is capable of accurate path tracking, but it depends on good lighting conditions, so it is difficult to maintain the stability of the positioning system in the greenhouse. Mosalanejad et al. [9] designed a spray robot that measures the distance to obstacles with ultrasonic sensors and can walk autonomously in the greenhouse; the lateral Root Mean Square Error (RMSE) was less than 0.08 m at different driving speeds. This method depends on a single ultrasonic sensor to locate and navigate the robot, so it is susceptible to environmental interference and has poor flexibility. LIDAR, a novel type of navigation and positioning sensor, has become the mainstream sensor for robot navigation owing to its high stability, high precision, and strong real-time performance [10,11]. Hou Jialin et al. [12] integrated front and rear dual LIDARs and wheel encoders with a Simultaneous Localization and Mapping (SLAM) algorithm to achieve the global positioning, mapping, and navigation of a robot in the greenhouse. The RMSE between the actual path and the target path during constant-speed navigation was less than 0.11 m, and the horizontal and vertical RMSEs of target-point navigation were less than 0.12 m. However, this method easily accumulates positioning error after long-term operation. In general, a global positioning method based on LIDAR should integrate multiple sources of positioning information for accurate positioning and navigation of the robot. At present, the IMU/ODOM/LIDAR integrated method is the most extensively used by indoor robots, but it easily produces accumulated error caused by wheel slip under the complicated ground conditions of a greenhouse [13,14]. In addition, the structured layout of the greenhouse and the mass planting of identical crops lead to highly similar environments, which may cause LIDAR scan-matching failure. As a result, the probability of the kidnapped-robot problem in global positioning increases [15], and the positioning and navigation accuracy of the greenhouse robot is degraded.
Therefore, introducing a sensor capable of providing absolute location information can effectively decrease the probability of the above problems. Ultra-wideband (UWB) technology, because of its relatively high absolute positioning accuracy, has become a positioning system employed by indoor mobile robots to determine absolute coordinates in place of outdoor GNSS positioning [16]. A UWB positioning platform built in an open environment is capable of decimeter-level positioning and is characterized by strong real-time performance, no accumulated error, and high positioning accuracy. It applies to the positioning and tracking of indoor static or dynamic objects, although the data of its ranging method based on signal arrival time fluctuates [17]. Lin Xiangze et al. [18] introduced UWB positioning into agricultural vehicles in greenhouses, corrected ranging error with a k-means algorithm and a truncation processing method, and obtained accurate positioning information. The average static positioning accuracy was 0.07 m, the probability of dynamic accuracy reaching 0.08 m was 31.3%, and the probability of error higher than 0.15 m was 35%. This method adopts UWB as the only source of location information, so it is difficult to provide positioning information with high continuity and stability.
To summarize, in the greenhouse environment, the conventional method of indoor global positioning and navigation for mobile robots using LIDAR suffers from low mapping and positioning accuracy caused by accumulated positioning error, whereas UWB positioning can provide global positioning information with high precision, no accumulated error, and strong real-time performance. This paper develops a positioning system based on UWB/IMU/ODOM/LIDAR that corrects the greenhouse robot's accumulated positioning error in global positioning. The goal of this research is a system that presents accurate and stable positioning information to the greenhouse robot.

2. Materials and Methods

2.1. Composition and Design of Positioning System

The integrated positioning system in this paper primarily consists of a UWB positioning system, a robot platform, and a remote monitoring platform, as presented in Figure 1. The UWB positioning system consists of four base stations (A, B, C, and D) and a positioning tag. The robot platform adopts four-wheel drive and differential steering. Each wheel is equipped with a shock absorber, which effectively reduces the odometer (ODOM) error arising from wheel slip under the complex ground conditions of the greenhouse. The robot platform carries a UWB tag, a LIDAR, an inertial measurement unit (IMU), and four photoelectric encoders. The remote monitoring platform employs a notebook computer to communicate with and remotely monitor the robot platform via Wi-Fi.
The UWB positioning system adopts the D-DWG-PG 2.5 positioning module designed by Guangzhou LENET Technology Co., Ltd. (Guangzhou, China). The maximum communication distance between modules is 130 m. In this paper, four modules serve as base stations, forming a rectangular positioning area. The distance between the UWB tag and the four base stations is measured by the time-of-flight (TOF) method, and the absolute position of the robot platform is then solved by a triangulation algorithm [17]. The platform also adopts the RPLIDAR-S2 2D LIDAR from Slamtec (Silan Technology) to acquire point cloud information of the environment. This LIDAR uses TOF ranging, with excellent resistance to ambient light and high measurement accuracy: a 360° horizontal scanning range, 0.12° horizontal angular resolution, 0.05–30 m measurement radius, ±3 cm measurement accuracy, and 15 Hz scanning frequency. It efficiently acquires the 2D environmental information of the greenhouse and is suitable for all-weather operation of the robot platform. The LIDAR is installed on top of the robot platform, where it scans 360° of environmental information. The IMU employs iFLYTEK's WHEELTEC-N100 9-axis attitude sensor to acquire yaw information, with an angular resolution of 0.1°; it is installed on top of the robot platform at the platform's center of rotation. Four photoelectric encoders are installed on the four motor-driven wheels to serve as an ODOM providing speed and mileage information. The robot platform adopts an STM32 module as the lower computer to control the DC motors that drive the wheels, and the driving speed of the robot platform is monitored and fed back via the photoelectric encoders. The robot platform adopts a Raspberry Pi 4B running Ubuntu 18.04 and the ROS Melodic system as the upper computer. On the basis of ROS Melodic, this platform runs the positioning and navigation algorithms, monitors sensor data, and sends speed control instructions to the lower computer [19].
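To illustrate how a tag position can be recovered from TOF ranges, the following is a minimal least-squares sketch in Python, assuming four anchors at the corners of the 14 m × 8 m test area described in Section 2.4. The function name and the example range values are hypothetical; the commercial module computes the position in firmware.

import numpy as np

def solve_tag_position(anchors, ranges):
    """Least-squares 2D tag position from ranges to n >= 3 known anchors."""
    # Subtract anchor 0's circle equation from the others to linearize:
    # ||p - a_i||^2 = r_i^2  becomes  A p = b
    a0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0 ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)   # overdetermined with 4 anchors
    return p

# Base stations A-D at the corners of the 14 m x 8 m test area (Section 2.4)
anchors = np.array([[0.0, 0.0], [14.0, 0.0], [14.0, 8.0], [0.0, 8.0]])
ranges = np.array([5.00, 10.44, 11.18, 6.40])  # hypothetical TOF ranges
print(solve_tag_position(anchors, ranges))     # -> approximately (4.0, 3.0)

Using all four anchors makes the linear system overdetermined, so a noisy range on any single link is averaged out rather than propagated directly into the solution.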

2.2. Integrated Positioning Method Based on UWB/IMU/ODOM/LIDAR

The map construction method based on IMU/ODOM/LIDAR mainly integrates the estimated pose information of the IMU/ODOM as the front-end input, while the back end scans the surrounding environment using LIDAR to obtain positioning information in the environment; the environment map is then generated by a SLAM algorithm [20,21,22]. This method is suitable for indoor environments with flat ground and is highly dependent on sensor accuracy. The ground conditions of greenhouses are relatively complex, and the working area is generally wide, so the accumulated error generated by the ODOM and IMU will affect the accuracy of mapping and positioning. The integrated positioning method shown in Figure 2 uses UWB, IMU, ODOM, and LIDAR to address these issues.
The motion model and measurement model of the greenhouse robot in actual motion are nonlinear. The Extended Kalman Filter (EKF) algorithm copes well with nonlinear systems, making it suitable for integrating multiple sensor inputs and estimating the relative pose of the robot [23]. Based on the particle filter algorithm, Adaptive Monte Carlo Localization (AMCL) is capable of matching LIDAR information against map information to track the global pose of the robot, and it exhibits a high degree of robustness, making it suitable for global positioning and navigation of the robot [24]. Combining EKF and AMCL allows them to complement each other for more accurate global positioning. The integrated positioning method of this paper is therefore divided into two stages. First, UWB/IMU/ODOM are integrated using the EKF algorithm to provide self-estimated pose information of the robot platform. The EKF covers two steps: prediction and measurement. The system uses the motion state information (V_{X,O}, V_{Y,O}, ω_O) provided by the ODOM as the input of the prediction stage, while applying the yaw angle θ_IMU provided by the IMU and the absolute coordinates (X_UWB, Y_UWB) provided by UWB as the input of the measurement stage, to obtain the estimated pose (X_EKF, Y_EKF, θ_EKF) of the robot platform. The AMCL algorithm likewise has prediction and measurement stages. At the second stage, the estimated pose obtained at the first stage is the input of the prediction, and the pose (X_L, Y_L, θ_L) obtained by matching the LIDAR against the pre-established 2D grid map of the greenhouse serves as the input of the measurement stage [25], so as to achieve global positioning of the robot in the greenhouse.

2.3. State Space Model

Assuming that the working area of the robot platform in the greenhouse is an ideal horizontal 2D environment, the state vector of the system is the pose of the robot; the state vector of the robot platform at time t is x_t = [X_t, Y_t, θ_t]^T, where X_t and Y_t denote the position of the geometric center of the robot platform and θ_t represents its yaw angle in the navigation coordinate system. Based on the filtering EKF and AMCL algorithms, the motion model and measurement model of the system are developed as follows [26]:
\begin{cases} x_t = g(u_t, x_{t-1}) + \varepsilon_t \\ z_t = h(x_t) + v_t \end{cases}
where x_t denotes the state of the system at time t; u_t represents the control input of the system; ε_t expresses the motion noise of the system; z_t is the measured value of the system at time t; and v_t denotes the measurement noise of the system.

2.3.1. EKF Algorithm Integrates UWB/IMU/ODOM Positioning Data

According to the dead-reckoning approach and the differential motion model of the mobile robot, the robot can be regarded as moving at a uniform speed over a short time [27]. The velocity information (V_{X,O}, V_{Y,O}, ω_O) from the ODOM serves as the control input of the prediction stage, and the predicted pose of the robot platform at time t is expressed as:
x_t^- = x_{t-1} + \begin{pmatrix} \cos\theta_{t-1} & -\sin\theta_{t-1} & 0 \\ \sin\theta_{t-1} & \cos\theta_{t-1} & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} V_{X,O} \\ V_{Y,O} \\ \omega_O \end{pmatrix} dt
where V_{X,O}, V_{Y,O}, and ω_O denote the velocity of the robot platform along the x-axis, its velocity along the y-axis, and its yaw rate in the navigation coordinate system, respectively.
The covariance matrix of the system state vector at time t in the prediction stage is written as:
P_t^- = f_x P_{t-1} f_x^T + f_w \varepsilon_t f_w^T
where P_t denotes the covariance matrix of the state x_t; ε_t represents the covariance matrix of the motion noise at time t; f_x and f_w denote the Jacobian matrices of the motion model with respect to the state and the motion noise, respectively.
For the greenhouse environment, the motion noise of the ODOM measured in reference [26] is written as:
\varepsilon_t = \begin{pmatrix} 0.01 & 0 & 0 \\ 0 & 0.01 & 0 \\ 0 & 0 & 0.5 \end{pmatrix}
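Before moving to the measurement step, the following is a minimal Python/NumPy sketch of this prediction stage under the definitions above. The variable names are ours, and we assume the motion noise enters additively in state space so that f_w is the identity:

import numpy as np

def ekf_predict(x, P, v_xo, v_yo, w_o, dt):
    """Propagate the pose x = [X, Y, theta] and covariance P by one ODOM step."""
    c, s = np.cos(x[2]), np.sin(x[2])
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])        # rotation into the navigation frame
    x_pred = x + R @ np.array([v_xo, v_yo, w_o]) * dt

    # Jacobian of the motion model with respect to the state (f_x)
    Fx = np.array([[1.0, 0.0, (-s * v_xo - c * v_yo) * dt],
                   [0.0, 1.0, ( c * v_xo - s * v_yo) * dt],
                   [0.0, 0.0, 1.0]])
    eps = np.diag([0.01, 0.01, 0.5])       # ODOM motion-noise covariance (above)
    Fw = np.eye(3)                         # assumed identity noise Jacobian f_w
    P_pred = Fx @ P @ Fx.T + Fw @ eps @ Fw.T
    return x_pred, P_pred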
In the measurement step of the EKF, the yaw angle θ_{t,IMU} from the IMU and the 2D position (X_{t,UWB}, Y_{t,UWB}) provided by UWB are taken as the measurement at time t, and the measurement model is expressed as:
z_t = \begin{pmatrix} X_{t,\mathrm{UWB}} \\ Y_{t,\mathrm{UWB}} \\ \theta_{t,\mathrm{IMU}} \end{pmatrix} + v_t
where z_t denotes the system measurement at time t; v_t represents the measurement noise at time t.
The Kalman gain is calculated as:
K_t = P_t^- H^T (H P_t^- H^T + v_t)^{-1}
where H denotes the Jacobian matrix of the measurement model, and v_t represents the covariance matrix of the measurement noise. Combined with the sensor accuracy given by the IMU and UWB manufacturers, the rosbag toolkit of the ROS system was adopted to record the positioning data of the robot platform. After the derived EKF formulas were analyzed in software simulation [28], the optimal measurement noise was set as:
v_t = \begin{pmatrix} 0.05 & 0 & 0 \\ 0 & 0.05 & 0 \\ 0 & 0 & 0.1 \end{pmatrix}
The corrected state quantity is calculated as:
x_t = x_t^- + K_t (z_t - H x_t^-)
The covariance matrix of the corrected state quantity is calculated as:
P_t = (I_3 - K_t H) P_t^-
where I_3 denotes the 3 × 3 identity matrix.
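A matching sketch of the measurement update, assuming (as the measurement model above implies) that the measurement observes the state directly so that H is the 3 × 3 identity:

import numpy as np

def ekf_update(x_pred, P_pred, x_uwb, y_uwb, yaw_imu):
    """Correct the predicted pose with the UWB position and IMU yaw."""
    z = np.array([x_uwb, y_uwb, yaw_imu])
    H = np.eye(3)                        # measurement observes the state directly
    R = np.diag([0.05, 0.05, 0.1])       # measurement-noise covariance (above)

    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain K_t
    x = x_pred + K @ (z - H @ x_pred)    # corrected state x_t
    P = (np.eye(3) - K @ H) @ P_pred     # corrected covariance P_t
    return x, P

Calling ekf_predict and then ekf_update once per control cycle yields the estimated pose (X_EKF, Y_EKF, θ_EKF) that feeds the AMCL stage below.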

2.3.2. AMCL Algorithm Fuses LIDAR Positioning Data

The AMCL algorithm comprises four stages: particle initialization, prediction, measurement, and resampling [29]. The implementation steps are elucidated below:
(1).
Particle initialization
Taking the estimated pose x_0 = [X_EKF, Y_EKF, θ_EKF]^T obtained from the EKF at the initial time as the initial pose estimate, n = 300 particles are randomly generated; the initial particle set is recorded as {(x_0^i, w_0^i)}_{i=1}^{300}, and the weight of each particle is initialized to w_0^i = 1/300.
(2).
Prediction stage
With the estimated pose x_{t-1,EKF} obtained by the EKF algorithm as the input of the prediction stage, the state of the particles at time t is updated as:
\bar{x}_t = x_{t-1,\mathrm{EKF}} + \varepsilon_t
(3).
Measurement stage
The measurement noise v_t of the LIDAR is known. The pose z_{t,L} = [X_L, Y_L, θ_L]^T obtained by matching the LIDAR against the pre-established 2D grid map of the greenhouse is employed as the input of the measurement stage, and the particle weight w_t^i at time t is obtained:
w_t^i = f_R(z_{t,L} - \bar{x}_t^i) \, w_{t-1}^i
where f_R(·) denotes the probability distribution function of the measurement noise v_t.
The particle set after the measurement update is {(x_t^i, w_t^i)}_{i=1}^{300}. The weights of all particles are updated and then normalized to obtain \bar{w}_t^i:
\bar{w}_t^i = w_t^i \Big/ \sum_{i=1}^{300} w_t^i
Through the weights and pose information of the latest particle set, the optimal pose estimate of the robot platform at time t is:
x_t = \sum_{i=1}^{300} \bar{w}_t^i \, x_t^i
where \bar{w}_t^i denotes the normalized weight of particle i at time t, and x_t^i represents the pose of particle i after updating at time t.
(4).
Resampling
The particles are filtered according to their weights; particles with higher weight are closer to the true pose. The threshold of the effective particle number is set as N_ff = 20, and the effective particle number N_eff is calculated as:
N_{eff} = 1 \Big/ \sum_{i=1}^{300} (\bar{w}_t^i)^2
When N_eff < N_ff, the particles are resampled by the KLD algorithm [30], and the updated particles are fed back into the prediction stage (2) for cyclic calculation. Accordingly, the longer the robot platform moves in the map and the more sensor positioning information is accumulated, the more accurate the positioning becomes.
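The four stages can be condensed into a short particle-filter sketch (Python/NumPy). This is an illustrative simplification, not the ROS AMCL implementation: stage (1) amounts to drawing the initial set around the EKF pose with uniform weights 1/300, plain multinomial resampling stands in for AMCL's KLD sampling, and scan_match_likelihood is a hypothetical stand-in for the LIDAR-map matching score.

import numpy as np

N, N_FF = 300, 20   # particle count n and resampling threshold N_ff

def amcl_step(particles, weights, pose_ekf, scan_match_likelihood):
    """One AMCL-style predict/measure/resample cycle (illustrative)."""
    # (2) Prediction: draw every particle around the EKF pose estimate,
    # perturbed by the ODOM motion noise (std devs from the matrix above)
    eps_std = np.sqrt([0.01, 0.01, 0.5])
    particles = pose_ekf + np.random.normal(0.0, eps_std, size=(N, 3))

    # (3) Measurement: reweight by the LIDAR-vs-map matching likelihood,
    # then normalize the weights
    weights = weights * np.array([scan_match_likelihood(p) for p in particles])
    weights = weights / weights.sum()

    # Weighted mean of the particle set is the pose estimate at time t
    pose = weights @ particles

    # (4) Resampling when the effective particle number drops below N_ff;
    # plain multinomial resampling stands in for AMCL's KLD sampling here
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < N_FF:
        idx = np.random.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights, pose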

2.4. Layout of Experiment Site

The experiment was performed in a vegetable greenhouse at the Yunyuan Scientific Research Base of Hunan Agricultural University, Changsha, Hunan Province, China. As depicted in Figure 3, the layout of the greenhouse consisted of four field ridges, three aisles between ridges, and a longitudinal aisle. The width of the field ridges is 1 m, and the width of the aisles between ridges is 0.9 m; the aisles were paved with cement. Vegetable crops with an average height of 0.4 m were planted on the field ridges. A rectangular area of 14 m × 8 m was selected as the test site, and a baffle was adopted to separate the test area in the aisle between ridges. Four UWB base stations were set at the four vertices of the test area and fixed at a height of 0.7 m above the ground with supports. A 2D positioning and navigation coordinate system was built (Figure 3) with base station A as the origin. The reference trajectory and four inflection points, a (1.80, 1.80) m, b (12.60, 1.80) m, c (12.60, 5.65) m, and d (1.80, 5.65) m (Figure 3), were preset in the test site. During the test, the remote monitoring platform was employed to remotely monitor and record the real-time data of the robot platform via Wi-Fi.
This paper employs the RMSE as the measure of positioning accuracy [31]. It is calculated as follows:
\mathrm{RMSE} = \sqrt{\frac{1}{N} \sum_{n=1}^{N} (m_n - m_{0n})^2}
where N denotes the total number of samples; n indexes the samples; m_n represents the actual value of sample n; and m_{0n} represents its target value.
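For reference, a direct NumPy implementation of this metric; the sample values in the usage line are made up:

import numpy as np

def rmse(actual, target):
    """RMSE between measured positions and their target values."""
    actual, target = np.asarray(actual), np.asarray(target)
    return np.sqrt(np.mean((actual - target) ** 2))

# hypothetical lateral samples against a 1.80 m reference line
print(rmse([1.82, 1.78, 1.85, 1.79], [1.80] * 4))   # ~0.029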
The Cartographer algorithm [12], an algorithm based on graph optimization, primarily creates a local map by fusing multi-sensor data and achieves loop detection using the local map. This algorithm can reduce a certain amount of accumulated error in the mapping process, so it is applied here to the map construction of the greenhouse.

3. Experiments and Results

3.1. Precision Comparison Experiment of Greenhouse Mapping and Positioning

In this paper, the navigation boundary was obtained by scanning the crop rows and the sidewalls of the greenhouse with LIDAR [32,33]. The robot platform was manually controlled to move in the cement aisle area of the greenhouse along the same moving track and completed the map construction. For the IMU/ODOM/LIDAR integrated positioning method, a combination of the EKF algorithm and the AMCL algorithm is likewise utilized to fuse the data. Maps were constructed by integrating IMU/ODOM/LIDAR and UWB/IMU/ODOM/LIDAR with the SLAM algorithm, respectively, as presented in Figure 4.
To compare the mapping accuracy of the two sensor combinations in the greenhouse environment, five feature areas in the environment were marked, as presented in Figure 4; the feature areas include the width of the greenhouse and the center width of the aisles between adjacent field ridges. The feature areas of the 2D map were measured using the RVIZ tool of the ROS system to determine the map measured value [34], which was compared with the actual measured value. The results are listed in Table 1. A maximum error of 0.11 m was detected in the feature areas of the IMU/ODOM/LIDAR combined mapping, suggesting that the map drifted and that the error accrues with time. The maximum error of the feature areas of the UWB/IMU/ODOM/LIDAR combined mapping was 0.03 m, significantly lower than that of the IMU/ODOM/LIDAR mapping method.
The constructed greenhouse grid map was loaded in RVIZ, and the positioning data of the robot in the map were obtained by the AMCL algorithm. After the robot's pose was initialized, the robot platform was manually controlled to track the reference trajectory at a speed of 0.5 m/s, and the trajectory of the robot in each map and the trajectory of single UWB positioning were recorded in real time, as depicted in Figure 5a. The lateral errors between the recorded trajectories and the reference trajectory are compared in Figure 5b.
The lateral errors of the various positioning methods were analyzed statistically, and the findings are summarized in Table 2. The lateral average error of UWB positioning is 0.047 m, the maximum lateral error is 0.157 m, and the RMSE is 0.051 m. The maximum lateral error of the IMU/ODOM/LIDAR integrated positioning method is more than 0.2 m, and the RMSE is 0.103 m. The lateral average error of the UWB/IMU/ODOM/LIDAR integrated positioning method of this paper is 0.027 m, and the maximum error is less than 0.1 m. The lateral RMSE is lowered by 33.3% and 67% compared with UWB positioning and the IMU/ODOM/LIDAR integrated positioning method, respectively.
Based on the multi-sensor integration method in reference [35], the EKF algorithm alone was also used to integrate UWB/IMU/ODOM/LIDAR so as to compare the positioning accuracy of the combined algorithm of this paper against the conventional EKF algorithm. In the experiment, the robot platform tracked the reference trajectory at 0.5 m/s using the single EKF algorithm and the EKF/AMCL combined algorithm, and the recorded lateral error results are presented in Figure 6. The lateral errors of the two integrated algorithms are analyzed in Table 3. The maximum lateral error of the single EKF algorithm exceeds 0.1 m, and the lateral RMSE of the EKF/AMCL combined algorithm is reduced by 22.7% compared with the single EKF algorithm.
To verify the positioning accuracy of the proposed method at different speeds, the lateral error of the UWB/IMU/ODOM/LIDAR integrated positioning method was recorded at three moving speeds, and the results are presented in Figure 7. The lateral errors at the different moving speeds are analyzed in Table 4. The lateral error increases slowly with speed: at 0.7 m/s, the average lateral error is 0.036 m, the maximum error is 0.095 m, and the lateral RMSE is 0.04 m.

3.2. Target Points Positioning Experiment

In this experiment, the performance of the proposed positioning system was compared with the IMU/ODOM/LIDAR integrated positioning method and single UWB positioning. Twenty-four target points were set on the reference trajectory. The robot platform was manually controlled to track the reference trajectory so that the center point of the robot platform coincided with each target point on arrival. Every time the robot platform reached a target point, it was halted, the positioning data of the different positioning methods were recorded ten times, and the average was taken; the positioning results are depicted in Figure 8. The positioning accuracy of UWB in the greenhouse environment ranged from 0.04 to 0.23 m, and the positioning data fluctuated and were inconsistent. The IMU/ODOM/LIDAR integrated positioning method accumulates error as the robot platform moves, but its data continuity is strong and its fluctuation is modest. Compared with the other two positioning methods, the UWB/IMU/ODOM/LIDAR integrated positioning strategy presented in this study provided the most accurate positioning results.
The target-point positioning data were analyzed, and the results are listed in Table 5. The overall positioning accuracy improves by 45.5% and 41.5% when using the UWB/IMU/ODOM/LIDAR integrated positioning method in comparison with single UWB positioning and the IMU/ODOM/LIDAR integrated positioning method, respectively. The RMSEs of the x-axis direction, y-axis direction, and overall positioning were 0.092, 0.069, and 0.079 m, respectively, and the maximum positioning error was 0.102 m. These results show an improvement in the positioning precision of the robot platform in the greenhouse.

3.3. Analysis of System Positioning Time

In the experiment, the positioning time of each of 1000 frames of data of the positioning system was recorded, as shown in Figure 9. The results indicate that the average positioning time of the system is 72.1 ms, and the longest positioning time is less than 80 ms.

4. Conclusions

This paper combines UWB positioning technology with SLAM mapping technology based on 2D LIDAR. EKF and AMCL algorithms are used for multi-sensor fusion to develop the integrated indoor positioning system of a greenhouse robot based on UWB/IMU/ODOM/LIDAR. The results demonstrate that this method provides higher-precision positioning for greenhouse robots.
The main conclusions are as follows:
  • A UWB/IMU/ODOM/LIDAR-based integrated positioning method is proposed in this study. First, the estimated pose information is obtained by the EKF integrating the positioning data of UWB/IMU/ODOM; on this basis, the 2D map of the greenhouse is created by scanning crop rows with LIDAR. Second, AMCL integrates the LIDAR and map information to achieve global positioning of the greenhouse robot.
  • The precision comparison experiments of greenhouse mapping and positioning demonstrate that the UWB/IMU/ODOM/LIDAR integrated positioning method of this paper improves the mapping and positioning accuracy compared with the IMU/ODOM/LIDAR integrated positioning method extensively used by conventional indoor mobile robots. The lateral error of the proposed positioning method increases slowly with moving speed; at 0.7 m/s, the maximum error is 0.095 m and the lateral RMSE is 0.04 m. The target-point positioning experiments indicate that the positioning accuracy of the UWB/IMU/ODOM/LIDAR integrated positioning method increased by 45.5% and 41.5% compared with single UWB positioning and the IMU/ODOM/LIDAR integrated positioning method, respectively. The RMSEs of the x-axis direction, y-axis direction, and overall positioning are 0.092, 0.069, and 0.079 m, respectively, the maximum positioning error is 0.102 m, and the average positioning time of the system is 72.1 ms, thus meeting the positioning accuracy and positioning time requirements of robot navigation in greenhouse operation. Compared with the results of existing studies [8,18,36,37], the positioning system proposed in this paper provides a higher level of positioning accuracy.
Since non-standard greenhouses may contain irregular crops and irregular planting gaps, LIDAR has difficulty obtaining a clear navigation boundary, which affects the accuracy of positioning and navigation [38]; the point cloud data scanned by LIDAR should be analyzed in greater depth. In addition, the multi-sensor fusion positioning method proposed in this paper was confirmed to be suitable for relatively open greenhouse environments planted with dwarf crops, where it can rely more heavily on UWB to provide reliable location information. In further research, greenhouse environments with tall crops will be investigated.

Author Contributions

Conceptualization, Z.L. and Y.X.; methodology, Z.L., Y.X. and X.L.; software, Z.L.; validation, Y.X.; formal analysis, Z.L. and Y.X.; investigation, Z.L. and Y.X.; resources, Y.X.; data curation, Z.L., Y.L. and Z.H.; writing—original draft preparation, Z.L., X.L. and Y.X.; writing—review and editing, Y.X. and Y.L.; visualization, Z.H. and X.L.; supervision, X.D.; project administration, Y.L.; funding acquisition, Y.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Natural Science Foundation of Hunan Province of China, grant number 2021JJ30363, and the Scientific Research Fund of the Hunan Provincial Education Department of China, grant number 19A224.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data are presented in this article in the form of figures and tables.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jagelčák, J.; Gnap, J.; Kuba, O.; Frnda, J.; Kostrzewski, M. Determination of Turning Radius and Lateral Acceleration of Vehicle by GNSS/INS Sensor. Sensors 2022, 22, 2298.
  2. Liu, J.Z. Research progress analysis of robotic harvesting technologies in greenhouse. Trans. Chin. Soc. Agric. Mach. 2017, 48, 1–18.
  3. Yasin, J.N.; Mohamed, S.A.S.; Haghbayan, M.H.; Heikkonen, J.; Tenhunen, H.; Plosila, J. Low-cost ultrasonic based object detection and collision avoidance method for autonomous robots. Int. J. Inf. Technol. 2021, 13, 97–107.
  4. Mahmud, M.S.A.; Abidin, M.S.Z.; Mohamed, Z.; Rahman, M.K.I.A.; Iida, M. Multi-objective path planner for an agricultural mobile robot in a virtual greenhouse environment. Comput. Electron. Agric. 2019, 157, 488–499.
  5. Subramanian, V.; Thomas, F.B.; Arroyo, A.A. Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation. Comput. Electron. Agric. 2006, 53, 130–143.
  6. Wang, X.; Wu, P.; Feng, Q.; Wang, G. Design and Test of Tomatoes Harvesting Robot. J. Agric. Mech. Res. 2016, 4, 94–98.
  7. Kootstra, G.; Wang, X.; Blok, P.M.; Hemming, J.; Henten, E.V. Selective Harvesting Robotics: Current Research, Trends, and Future Directions. Curr. Robot. Rep. 2021, 2, 95–104.
  8. Li, T.H.; Wu, Z.H.; Lian, X.K.; Hou, J.L.; Shi, G.Y.; Wang, Q. Navigation line detection for greenhouse carrier vehicle based on fixed direction camera. Trans. Chin. Soc. Agric. Mach. 2018, 49, 8–13.
  9. Mosalanejad, H.; Minaei, S.; Borghei, A.; Farzaneh, B. Evaluation of navigation system of a robot designed for greenhouse spraying. Int. J. Smart Sens. Intell. Syst. 2020, 13, 1–9.
  10. Guevara, J.; Cheein, F.A.A.; Gené-Mola, J.; Rosell-Polo, J.R.; Lopez, E.G. Analyzing and overcoming the effects of GNSS error on LiDAR based orchard parameters estimation. Comput. Electron. Agric. 2020, 170, 105255.
  11. Shamsudin, A.U.; Ohno, K.; Hamada, R.; Kojima, S.; Westfechtel, T.; Suzuki, T.; Okada, Y.; Tadokoro, S.; Fujita, J.; Amano, H. Consistent map building in petrochemical complexes for firefighter robots using SLAM based on GPS and LIDAR. Robomech J. 2018, 5, 1–13.
  12. Hou, J.L.; Pu, W.Y.; Li, T.H.; Ding, X.M. Development of dual-lidar navigation system for greenhouse transportation robot. Trans. Chin. Soc. Agric. Eng. 2020, 36, 80–88.
  13. Chen, M. Dynamic Mapping for Domestic Service Robot. Doctoral Thesis, University of Science and Technology of China, Hefei, China, 2019.
  14. Tee, Y.K.; Han, Y.C. Lidar-Based 2D SLAM for Mobile Robot in an Indoor Environment: A Review. In Proceedings of the 2021 International Conference on Green Energy, Computing and Sustainable Technology (GECOST), Miri, Malaysia, 7–9 July 2021; pp. 1–7.
  15. Yu, S.; Yan, F.; Zhuang, Y.; Gu, D.B. A deep-learning-based strategy for kidnapped robot problem in similar indoor environment. J. Intell. Robot. Syst. 2020, 3, 765–775.
  16. Barral, V.; Suárez-Casal, P.; Escudero, C.J.; García-Naya, J.A. Multi-sensor accurate forklift location and tracking simulation in industrial indoor environments. Electronics 2019, 8, 1152.
  17. Shen, B.Q.; Zhang, Z.M.; Shu, S.L. UWB-VIO integrated indoor location algorithm for mobile robots. J. Comput. Appl. 2022, 42, 1–8.
  18. Lin, X.Z.; Wang, X.; Lin, C.X.; Geng, J.; Xue, J.L.; Zheng, E.L. Location information collection and optimization for agricultural vehicle based on UWB. Trans. Chin. Soc. Agric. Mach. 2018, 49, 23–29.
  19. Reis, W.P.N.; Silva, G.J.; Junior, O.M.; Vivaldini, K.C.T. An extended analysis on tuning the parameters of Adaptive Monte Carlo Localization ROS package in an automated guided vehicle. Int. J. Adv. Manuf. Tech. 2021, 117, 1975–1995.
  20. Liu, R.; He, Y.; Yuen, C.; Lau, B.P.L.; Ali, R.; Fu, W.P.; Cao, Z.Q. Cost-effective mapping of mobile robot based on the fusion of UWB and short-range 2-D LIDAR. IEEE/ASME Trans. Mechatron. 2021, 27, 3087957.
  21. Wang, S.; Kobayashi, Y.; Ravankar, A.A.; Ravankar, A.; Emaru, T. A Novel Approach for Lidar-Based Robot Localization in a Scale-Drifted Map Constructed Using Monocular SLAM. Sensors 2019, 19, 2230.
  22. Yufan, C.; Zijie, N. Simulation and Implementation of Slam Drawing Based on Ros Wheeled Mobile Robot. J. Phys. Conf. Ser. 2021, 1865, 042068.
  23. Kumar, P.S.; Dutt, V.B.S.S.I.; Ganesh, L. Performance evaluation of suitable navigation algorithm using raw measurements taken from stationary GPS receiver. Mater. Today Proc. 2020, 33, 3366–3371.
  24. Liu, Y.X. Mobile Robot Localization Algorithm Based on Multi-sensor Fusion and Point Cloud Matching. Master’s Thesis, University of Electronic Science and Technology of China, Chengdu, China, 2020.
  25. Zhao, L. Mobile Robot Localization Methods Based on Multi-Source Information Fusion. Master’s Thesis, Harbin Institute of Technology, Harbin, China, 2020.
  26. Zhang, S.L. Research on Localization and Navigation of Indoor Mobile Robot Based on Multi-sensor Fusion. Master’s Thesis, University of Chinese Academy of Sciences (Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences), Changchun, China, 2021.
  27. Feng, J.M. Research on Mobile Robot Localization Algorithm Based on Multi-sensor Fusion and Scanning Matching. Master’s Thesis, Northwest Normal University, Lanzhou, China, 2021.
  28. Li, Z.Q.; Chen, L.Q.; Zheng, Q.; Dou, X.Y.; Yang, L. Control of a path following caterpillar robot based on a sliding mode variable structure algorithm. Biosyst. Eng. 2019, 186, 293–306.
  29. Zhang, H.; Chen, N.; Fan, G. Intelligent robot positioning algorithm based on particle filter. Comput. Appl. Softw. 2020, 37, 134–140.
  30. Kayhani, N.; Zhao, W.D.; McCabe, B.; Schoellig, A.P. Tag-based visual-inertial localization of unmanned aerial vehicles in indoor construction environments using an on-manifold extended Kalman filter. Autom. Constr. 2022, 135, 104112.
  31. Qian, J.; Zi, B.; Wang, D.; Ma, Y.G.; Zhang, D. The design and development of an omni-directional mobile robot oriented to an intelligent manufacturing system. Sensors 2017, 17, 2073.
  32. Li, C.Y.; Peng, C.; Zhang, Z.Q.; Miao, Y.L.; Zhang, M.; Li, H. Positioning and map construction for agricultural robots integrating ODOM information. Trans. Chin. Soc. Agric. Eng. 2021, 37, 16–23.
  33. Ponnambalam, V.R.; Bakken, M.; Moore, R.J.D.; Glenn, O.G.J.; Johan, F.P. Autonomous crop row guidance using adaptive multi-roi in strawberry fields. Sensors 2020, 20, 5249.
  34. Jia, H. Design of SLAM and Navigation Robot Based on Cartographer Algorithm. Master’s Thesis, Shandong University, Jinan, China, 2019.
  35. Gai, R. Research on Robot Positioning Technology Based on Multi-sensor Information Fusion. Master’s Thesis, Beijing University of Civil Engineering and Architecture, Beijing, China, 2019.
  36. Huang, Z.C.; Jacky, T.L.; Zhao, X.; Fukuda, H.; Shiigi, S.; Nakanishi, H.; Suzuki, T.; Ogawa, Y.; Kondo, N. Position and orientation measurement system using spread spectrum sound for greenhouse robots. Biosyst. Eng. 2020, 198, 50–62.
  37. Xu, Y.; Shmaliy, Y.S.; Ma, W.; Jiang, X.W.; Shen, T.; Bi, S.H.; Guo, H. Improving tightly LiDAR/compass/encoder-integrated mobile robot localization with uncertain sampling period utilizing EFIR filter. Mob. Netw. Appl. 2021, 26, 440–448.
  38. Bukhori, I.; Ismail, Z.H. Detection of kidnapped robot problem in Monte Carlo localization based on the natural displacement of the robot. Int. J. Adv. Robot. Syst. 2017, 14.
Figure 1. Schematic diagram of the positioning system.
Figure 2. Integrated positioning frame diagram based on UWB/IMU/ODOM/LIDAR.
Figure 3. Schematic diagram of greenhouse experiment site.
Figure 4. Greenhouse mapping with different combinations of sensors. (a) IMU/ODOM/LIDAR. (b) UWB/IMU/ODOM/LIDAR.
Figure 5. Compare the trajectories of different positioning methods. (a) Trajectories. (b) Lateral error.
Figure 6. Comparison of lateral error between two integrated algorithms.
Figure 7. Lateral error at different moving speeds.
Figure 8. Different positioning methods compared for accuracy of target point positioning. (a) Positioning results. (b) Positioning error.
Figure 9. Positioning time.
Table 1. Comparison of mapping error of different combinations of sensors.

Feature Area | Actual Measured Value (m) | IMU/ODOM/LIDAR Map Value (m) | UWB/IMU/ODOM/LIDAR Map Value (m)
1 | 8.40 | 8.46 | 8.43
2 | 1.90 | 1.96 | 1.89
3 | 1.90 | 1.89 | 1.88
4 | 1.90 | 1.79 | 1.90
5 | 1.90 | 1.85 | 1.88
Table 2. Statistics and analysis of lateral error of different positioning methods.

Positioning Method | Average Error (m) | Maximum Error (m) | RMSE (m)
UWB | 0.047 | 0.157 | 0.051
IMU/ODOM/LIDAR | 0.067 | 0.234 | 0.103
UWB/IMU/ODOM/LIDAR | 0.027 | 0.091 | 0.034
Table 3. Analysis of lateral error of two integrated algorithms.

Integrated Algorithm | Average Error (m) | Maximum Error (m) | RMSE (m)
EKF | 0.039 | 0.101 | 0.044
EKF/AMCL | 0.027 | 0.091 | 0.034
Table 4. Statistics and analysis of lateral error at different moving speeds.

Moving Speed | Average Error (m) | Maximum Error (m) | RMSE (m)
0.3 m/s | 0.021 | 0.067 | 0.030
0.5 m/s | 0.027 | 0.091 | 0.034
0.7 m/s | 0.036 | 0.095 | 0.040
Table 5. Error analysis of target points positioning with different positioning methods.

Positioning Method | RMSE x-Axis (m) | RMSE y-Axis (m) | RMSE Overall (m) | Overall Maximum Error (m)
UWB | 0.140 | 0.083 | 0.145 | 0.233
IMU/ODOM/LIDAR | 0.127 | 0.072 | 0.135 | 0.281
UWB/IMU/ODOM/LIDAR | 0.092 | 0.069 | 0.079 | 0.102
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
