Article

Affine Iterative Closest Point Algorithm Based on Color Information and Correntropy for Precise Point Set Registration

Key Laboratory of Autonomous Systems and Networked Control, Ministry of Education, Unmanned Aerial Vehicle Systems Engineering Technology Research Center of Guangdong, South China University of Technology, Guangzhou 510640, China
* Author to whom correspondence should be addressed.
Sensors 2023, 23(14), 6475; https://doi.org/10.3390/s23146475
Submission received: 21 May 2023 / Revised: 26 June 2023 / Accepted: 30 June 2023 / Published: 17 July 2023
(This article belongs to the Special Issue Machine Learning Based 2D/3D Sensors Data Understanding and Analysis)

Abstract

In this paper, we propose a novel affine iterative closest point algorithm based on color information and correntropy, which can effectively handle registration problems involving heavy noise, outliers, and small deformations in RGB-D datasets. Firstly, to alleviate the low registration accuracy on data with weak geometric structures, we introduce color features into the traditional affine algorithm to establish more accurate and reliable correspondences. Secondly, we introduce the correntropy measure to overcome the influence of the abundant noise and outliers in RGB-D datasets, thereby further improving the registration accuracy. Experimental results demonstrate that the proposed registration algorithm achieves higher registration accuracy, reducing the error by almost a factor of ten, and more stable robustness than other advanced algorithms.

1. Introduction

With the rapid development of 3D reconstruction in various fields, such as image processing [1,2,3,4,5,6], machine vision [7,8,9,10,11], and simultaneous localization and mapping (SLAM) [12,13,14], point set registration, a key technique in image registration, has become increasingly important. Its purpose is to match point sets from two or more images obtained at different times, in different environments, or with different sensors, so that after geometric transformations, such as rotation and translation, the transformed point set is spatially consistent. However, in practical applications, several problems remain in point set registration: (1) the correspondence between the two point sets is unknown; and (2) there is insufficient information for point set matching when the local structures are not salient. Besl and McKay [15] proposed the iterative closest point (ICP) algorithm, an efficient algorithm for solving the rigid registration problem between two point sets.
In recent years, with the widespread application of the ICP algorithm, many scholars have carried out extensive research to improve it in terms of speed, convergence, and robustness. Blais et al. [16] adopted random sampling of the original point set to boost the registration speed, but could not guarantee the registration accuracy. Fitzgibbon [17] used the Levenberg–Marquardt algorithm to make the ICP registration algorithm converge faster. Sharp et al. [18] studied an ICP registration algorithm based on invariant features to decrease the probability of the algorithm falling into a local minimum and to improve its robustness. However, that algorithm cannot handle heavy noise interference in the dataset. For the registration of datasets with noise and outliers, Chetverikov et al. [19] proposed the trimmed ICP (TrICP) algorithm to suppress noise and outliers. Phillips et al. [20] improved the trimmed ICP algorithm, further improving the registration accuracy. Myronenko et al. [21] proposed the Coherent Point Drift (CPD) algorithm, which registers the point sets probabilistically: the model point set is represented by a Gaussian Mixture Model (GMM), and the target point set is regarded as an observation of the model point set. When the two point sets are registered, the correspondence maximizes the GMM posterior probability, which makes the method robust to noise and outliers. Yang et al. [22] proposed a branch-and-bound solution that searches the entire space SE(3), which solved the problem of the ICP algorithm being easily trapped in local extrema; although the registration accuracy is improved, the computation becomes slower. Du et al. [23] established a probabilistic model of the ICP algorithm and proposed the PICP algorithm to improve the robustness of the ICP algorithm.
In addition, some learning-based approaches have been proposed to address partial-overlap registration [24,25,26,27]. For example, Zeng et al. [28] proposed a fast rigid registration method based on deep learning for 3D reconstruction, which can also be extended to other registration tasks. RPNet [29] is less sensitive to initialization and more robust for point cloud registration; its network obtains a soft assignment of corresponding points to solve the partial-overlap problem. Deep closest point [30] uses a dynamic graph neural network for feature extraction and an attention module to generate features for establishing correspondences. OMNet [31] estimates overlapping masks to reject non-overlapping regions for partial-to-partial point cloud registration. Similarly, FINet [32] leverages dual-branch feature interaction and achieves better registration results. Yu et al. [24] presented a rotation-invariant transformer to better address low-overlap registration with large rotations. Gu et al. [25] proposed a recurrent framework for 3D motion estimation to avoid the irregular data structure. Hu et al. [26] proposed a novel deep learning framework to generate omnidirectional 3D point clouds of human bodies by registering the front- and back-facing partial scans captured by a single depth camera. Sun et al. [27] designed an end-to-end 3D graph deep learning framework for point cloud registration, which simultaneously learns the detector (graph attention expression) and the descriptor (graph deep feature) in a weakly supervised way, so that the learned detector and descriptor promote each other during model optimization. Qin et al. [33] proposed the Geometric Transformer to learn geometric features for robust superpoint matching; it encodes pair-wise distances and triplet-wise angles, making it robust in low-overlap cases and invariant to rigid transformation.
However, in real application scenes, the images to be registered often exhibit some deformation. The non-rigid registration problem has therefore become another active research topic. For affine registration problems, Feldmar et al. [34] tried to find the best global affine transformation by introducing differential information; linear combinations of neighboring, similar rigid transformations are used to obtain more accurate local affine transformations between point sets. Amberg et al. [35] constrained the objective function by adding rigid regularization terms and landmark terms to solve for the affine transformation. Du et al. [36] put forward an affine registration algorithm based on Independent Component Analysis (ICA) to achieve accurate point set registration.
With the development of RGB-D imaging technology, many researchers pay more attention to the color information of the collected data [37]. Men et al. [38] added color information to the spatial information, which improved the registration accuracy and accelerated the convergence of the algorithm. Korn et al. [39] extended the Generalized-ICP algorithm with Lab color information, which improved the robustness of registration.
Although the above-mentioned algorithms improve the traditional registration algorithm in various respects, they cannot match RGB-D data that contain heavy noise and slight deformation well. In real scenes, RGB-D data often contain many moving objects that deform slightly during data collection, together with noise and outlier interference, which reduces the registration accuracy. To solve these problems, this paper proposes a precise affine registration algorithm with color information and correntropy. First, this paper introduces color information into the algorithm as a feature to establish more accurate and reliable correspondences. Second, to overcome the interference of noise and outliers in the dataset, this paper introduces correntropy as the distance measure, which improves the registration accuracy. Experimental results demonstrate that our registration algorithm has higher registration accuracy and more stable robustness than other registration algorithms.
Overall, the contributions of this paper are summarized as:
  • We introduce color information into affine point cloud registration, which can increase the robustness of the algorithm.
  • We introduce the robust correntropy metric to address noise and outliers in the point clouds for more accurate registration.
The rest of this paper is organized as follows. Section 2 reviews the traditional affine ICP algorithm. Section 3 proposes the precise affine registration with color information and correntropy. Section 4 conducts experiments with simulated and real data and analyzes the results. Finally, the conclusion is summarized in the last section.

2. A Review of the Traditional Affine ICP Algorithm

The traditional affine ICP algorithm is a very widely used registration algorithm [40,41,42]. We define two point sets $M = \{ m_i \}_{i=1}^{N_m}$ and $D = \{ d_j \}_{j=1}^{N_d}$ in a space $\mathbb{R}^n$. The objective function of the affine registration between the two point sets can then be established as the following least squares (LS) problem [43]:
$$\min_{A,\, t,\, j \in \{1,2,\ldots,N_d\}} \sum_{i=1}^{N_m} \left\| (A m_i + t) - d_j \right\|^2 \quad \text{s.t. } \det(A) \neq 0 \tag{1}$$
where $A$ is the affine matrix and $t$ is the translation vector.
The algorithm is solved iteratively in two steps. First, for each point $m_i$ in the point set $M$, the affine ICP algorithm finds the closest point in the point set $D$ as its corresponding point; second, the affine matrix $A$ and translation vector $t$ of the current iteration are computed. The correspondences are then updated and the transformation recomputed until the registration error reaches a minimum or the number of iterations exceeds the maximum, at which point the loop terminates.
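The two-step loop above can be sketched as follows. This is a minimal NumPy/SciPy sketch of Equation (1); the function name, stopping rule, and least-squares solver are illustrative assumptions, not the authors' MATLAB implementation:

```python
import numpy as np
from scipy.spatial import cKDTree

def affine_icp(M, D, max_iter=50, tol=1e-8):
    """Minimal affine ICP sketch: alternate closest-point correspondences
    with a least-squares affine fit (illustrative, not the paper's code)."""
    A, t = np.eye(3), np.zeros(3)
    tree = cKDTree(D)
    prev_err = np.inf
    for _ in range(max_iter):
        # Step 1: find the closest point in D for each transformed m_i.
        P = M @ A.T + t
        dist, idx = tree.query(P)
        C = D[idx]
        # Step 2: least-squares affine transform for these pairs
        # (centered normal equations, then recover the translation).
        Mc, Cc = M - M.mean(0), C - C.mean(0)
        A = np.linalg.solve(Mc.T @ Mc, Mc.T @ Cc).T
        t = C.mean(0) - A @ M.mean(0)
        err = np.mean(dist ** 2)
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return A, t
```

For nearly aligned, noise-free point sets this loop recovers the affine transform; like the traditional algorithm, it can be trapped by wrong correspondences when the initial misalignment is large.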

3. Precise Affine Registration with Correntropy and Color Information

In this section, we propose a precise affine registration algorithm with color information and correntropy to achieve accurate registration.

3.1. Problem Statement

In real-world scenarios, data collected by cameras and other equipment may have slight deformations, and some noisy data may be mixed in at the same time. Although traditional affine registration can solve the deformation registration problem, its accuracy on weak geometric data with noise and outliers is not high, and Figure 1 illustrates this problem. It can be seen from the figure that the affine registration algorithm does not solve such problems accurately. For weak geometric data, the color information in the image is more distinctive, so more reliable correspondences can be found, thereby improving the registration accuracy. We therefore introduce color information as a feature and obtain the following formula:
$$\min_{A,\, t,\, j \in \{1,2,\ldots,N_d\}} \sum_{i=1}^{N_m} \left( \left\| A m_i + t - d_j \right\|^2 + \alpha \left( r_i^m - r_j^d \right)^2 + \alpha \left( g_i^m - g_j^d \right)^2 + \alpha \left( b_i^m - b_j^d \right)^2 \right) \tag{2}$$
where $r_i^m$, $r_j^d$, $g_i^m$, $g_j^d$, $b_i^m$, and $b_j^d$ are the color components of the two points, and $\alpha$ is the weight of the color information, set to 0.1. Color information is easily affected by illumination changes; when such changes are pronounced, $\alpha$ can be reduced to lessen the effect of color errors.
The objective Equation (2) enables a more accurate and reliable correspondence between the two point sets. However, real datasets often contain much noise and many outliers, which reduces the registration accuracy. To solve this problem, this paper introduces the correntropy measure, which replaces the Euclidean distance measure in the traditional registration method. The measure is as follows:
$$s\left( m_i, d_i \right) = \sum_{i=1}^{N_m} \exp\left( -\frac{\left\| m_i - d_i \right\|_2^2}{2\sigma^2} \right) \tag{3}$$
where $\sigma$ is a free parameter set to 0.1. From the above formula, a larger distance yields a smaller correntropy and a worse registration result; conversely, a smaller distance yields a larger correntropy and higher registration accuracy. In other words, the correntropy metric belongs to the family of robust kernel functions, which assign smaller weights to large errors and larger weights to small errors, thereby alleviating the influence of noise and outliers.
According to the above equation, we can build a new objective function:
$$\max_{A,\, t,\, j \in \{1,2,\ldots,N_d\}} \sum_{i=1}^{N_m} \exp\left( -\left( \left\| A m_i + t - d_j \right\|^2 + \alpha \left( r_i^m - r_j^d \right)^2 + \alpha \left( g_i^m - g_j^d \right)^2 + \alpha \left( b_i^m - b_j^d \right)^2 \right) / 2\sigma^2 \right) \quad \text{s.t. } \det(A) \neq 0 \tag{4}$$
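The robust weighting behind this objective can be illustrated numerically: the Gaussian kernel of Equation (3) maps residuals to weights that decay exponentially, so outliers contribute almost nothing. A small sketch (function name and sample values are illustrative):

```python
import numpy as np

def correntropy_weights(residuals, sigma=0.1):
    # Gaussian kernel of the correntropy measure: the weight decays
    # exponentially with the squared residual, suppressing outliers.
    return np.exp(-np.asarray(residuals) ** 2 / (2 * sigma ** 2))

w = correntropy_weights([0.01, 0.05, 0.5])  # small, medium, outlier residuals
```

Here the outlier residual 0.5 receives a weight of roughly $e^{-12.5}$, effectively excluding it from the fit, while the small residuals keep weights near one.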

3.2. Precise Affine Registration with Color Information and Correntropy

To solve the registration problem described by Equation (4), this algorithm uses an iterative solution algorithm similar to the affine ICP algorithm. In each loop, the nearest neighbor search algorithm is used to establish the correspondence between these two point sets, and then the affine transformation is solved according to the correspondence. This algorithm is mainly divided into two steps, which are expressed as follows:
(1) According to the transformation at step $(k-1)$, the correspondence between the two point sets is established:
$$c_k(i) = \arg\min_{j \in \{1,2,\ldots,N_d\}} \left( \left\| A m_i + t - d_j \right\|^2 + \alpha \left( r_i^m - r_j^d \right)^2 + \alpha \left( g_i^m - g_j^d \right)^2 + \alpha \left( b_i^m - b_j^d \right)^2 \right) \tag{5}$$
(2) Calculating ( A k , t k ) by the correntropy method for step k:
$$\left( A_k, t_k \right) = \arg\max_{A,\, t} \sum_{i=1}^{N_m} \exp\left( -\left( \left\| A m_i + t - d_{c_k(i)} \right\|_2^2 + \alpha \left( r_i^m - r_{c_k(i)}^d \right)^2 + \alpha \left( g_i^m - g_{c_k(i)}^d \right)^2 + \alpha \left( b_i^m - b_{c_k(i)}^d \right)^2 \right) / 2\sigma^2 \right) \tag{6}$$
The algorithm iteratively computes the affine matrix $A$ and translation vector $t$ to approach the optimal solution, until the error converges to a preset value or the number of iterations exceeds the maximum $k$.
For the correspondence search problem described in Equation (5), we use the k-d tree-based method [44,45,46] to find the nearest point in three dimensions. For the optimization problem described in Equation (6), we give detailed solution steps and the closed-form solution of the algorithm below.
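Because the weighted squared distance in Equation (5) equals a plain Euclidean distance in a joint space of coordinates and $\sqrt{\alpha}$-scaled colors, the k-d tree query can be carried out in that joint space. A sketch assuming SciPy's `cKDTree` (function and variable names are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def find_correspondences(M, D, colors_M, colors_D, A, t, alpha=0.1):
    """Correspondence search of Equation (5): one nearest-neighbour query
    in the joint (xyz, sqrt(alpha)*rgb) space reproduces the weighted
    squared distance. A sketch, not the paper's implementation."""
    s = np.sqrt(alpha)
    query = np.hstack([M @ A.T + t, s * colors_M])
    target = np.hstack([D, s * colors_D])
    _, idx = cKDTree(target).query(query)
    return idx  # idx[i] = c_k(i)
```

This keeps the per-iteration correspondence cost at a single $O(N_m \ln N_d)$ tree query instead of a brute-force scan.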
First, we calculate the approximate solution of $t$. Equation (6) is rewritten as follows:
$$F\left( A_k, t_k \right) = \sum_{i=1}^{N_m} \exp\left( -\left( \left\| A m_i + t - d_{c_k(i)} \right\|_2^2 + \alpha \left( r_i^m - r_{c_k(i)}^d \right)^2 + \alpha \left( g_i^m - g_{c_k(i)}^d \right)^2 + \alpha \left( b_i^m - b_{c_k(i)}^d \right)^2 \right) / 2\sigma^2 \right) \tag{7}$$
Assuming
$$l_i \triangleq \exp\left( -\left( \left\| A m_i + t - d_{c_k(i)} \right\|^2 + \alpha \left( r_i^m - r_{c_k(i)}^d \right)^2 + \alpha \left( g_i^m - g_{c_k(i)}^d \right)^2 + \alpha \left( b_i^m - b_{c_k(i)}^d \right)^2 \right) / 2\sigma^2 \right) \tag{8}$$
By taking the derivative of Equation (7) with respect to $t$ and setting $\frac{\partial F(A,t)}{\partial t} = 0$, we obtain:
$$t_k = \frac{\sum_{i=1}^{N_m} \left( d_{c_k(i)} - A_k m_i \right) l_i}{\sum_{i=1}^{N_m} l_i} \tag{9}$$
The algorithm is solved iteratively, with the final termination condition based on the registration accuracy. Although each iteration yields an approximate solution, the final registration result is not affected.
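Under the paper's notation, the translation update of Equation (9), with the weights $l_i$ of Equation (8) evaluated at the previous estimate, can be sketched as follows (`color_err2` denotes the $\alpha$-weighted squared color errors of the matched pairs; all names are illustrative):

```python
import numpy as np

def translation_update(M, C, A, t_prev, color_err2, sigma=0.1):
    """Translation update of Equation (9). The weights l_i of Equation (8)
    are evaluated at the previous estimate t_prev, which is why each
    iteration is an approximate solution. A sketch, not the paper's code."""
    resid2 = np.sum((M @ A.T + t_prev - C) ** 2, axis=1) + color_err2
    l = np.exp(-resid2 / (2 * sigma ** 2))              # Eq. (8)
    diff = C - M @ A.T                                  # d_{c_k(i)} - A_k m_i
    return (l[:, None] * diff).sum(axis=0) / l.sum()    # Eq. (9)
```

The update is simply a correntropy-weighted mean of the per-point residuals, so badly matched points barely move the estimate.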
Second, we calculate the approximate solution of $A_k$. Substituting Equation (9) into Equation (7) yields a constrained optimization problem for $A_k$:
$$F\left( A_k \right) = \arg\max_{A} \sum_{i=1}^{N_m} \exp\left( -\left( \left\| A m_i + \frac{\sum_{i=1}^{N_m} \left( d_{c_k(i)} - A_k m_i \right) l_i}{\sum_{i=1}^{N_m} l_i} - d_{c_k(i)} \right\|^2 + \alpha \left( r_i^m - r_{c_k(i)}^d \right)^2 + \alpha \left( g_i^m - g_{c_k(i)}^d \right)^2 + \alpha \left( b_i^m - b_{c_k(i)}^d \right)^2 \right) / 2\sigma^2 \right) \tag{10}$$
Let $x_i = m_i - \frac{\sum_{i=1}^{N_m} m_i l_i}{\sum_{i=1}^{N_m} l_i}$ and $y_i = d_{c_k(i)} - \frac{\sum_{i=1}^{N_m} d_{c_k(i)} l_i}{\sum_{i=1}^{N_m} l_i}$; then we obtain:
$$F\left( A_k \right) = \arg\max_{A} \sum_{i=1}^{N_m} \exp\left( -\left( \left\| A x_i - y_i \right\|_2^2 + \alpha \left( r_i^m - r_{c_k(i)}^d \right)^2 + \alpha \left( g_i^m - g_{c_k(i)}^d \right)^2 + \alpha \left( b_i^m - b_{c_k(i)}^d \right)^2 \right) / 2\sigma^2 \right) \tag{11}$$
The process of solving $A_k$ is similar to that in [47,48], so we solve it in the same way. Let $U$ and $V$ be $N_m \times (m+1)$ matrices whose rows represent the points $m_i$ and $d_{c_k(i)}$, respectively; then we obtain:
$$F\left( A_k \right) = \arg\max_{A} \mathbf{1}_{N_m}^T G_{V - U A_k^T} \tag{12}$$
where $\mathbf{1}_{N_m}^T$ represents a row vector of ones of length $N_m$, and $G_{V - U A_k^T} = b$ represents a column vector with entries $G_i = \exp\left( -\left( \left\| A x_i - y_i \right\|_2^2 + \alpha \left( r_i^m - r_{c_k(i)}^d \right)^2 + \alpha \left( g_i^m - g_{c_k(i)}^d \right)^2 + \alpha \left( b_i^m - b_{c_k(i)}^d \right)^2 \right) / 2\sigma^2 \right)$.
Then, $A_k$ can be calculated as:
$$A_k = V^T D(b) U \left( U^T D(b) U \right)^{-1} \tag{13}$$
where $D(b) = \mathrm{diag}(b)$. The above algorithm is summarized in Table 1. Since the proposed algorithm is similar to the traditional affine ICP algorithm, its time complexity is also similar. The computation time is dominated by the closest-point search in the first step, which uses the k-d tree search strategy; step two hardly affects the computation time. Therefore, the time complexity of the proposed algorithm is $O(N_m \ln N_d)$. Our algorithm is implemented in MATLAB 2018 on a PC with an Intel (R) Core (TM) i5 3.00 GHz CPU and 4 GB RAM.
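The closed-form update of Equation (13) is a correntropy-weighted least-squares fit. A sketch, assuming $U$ and $V$ stack the source and corresponding target points row-wise in homogeneous coordinates and $b$ holds the per-point weights (names are illustrative):

```python
import numpy as np

def affine_update(U, V, b):
    """Closed-form affine update of Equation (13):
    A_k = V^T D(b) U (U^T D(b) U)^{-1}, a weighted least-squares fit.
    A sketch, not the authors' MATLAB code."""
    Db = np.diag(b)  # D(b) = diag(b)
    return V.T @ Db @ U @ np.linalg.inv(U.T @ Db @ U)
```

Because the fit is exact for noise-free correspondences, any positive weight vector recovers the true transform; the correntropy weights matter only when residuals differ across points.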

4. Experimental Results

This section further verifies the accuracy and robustness of the proposed algorithm through experiments. First, the proposed algorithm is verified by simulation experiments, in which we compare it with the ICP algorithm [15], the ICP algorithm with color information (CICP) [36], the scaling ICP algorithm (SICP) [49], the affine ICP algorithm (AICP) [50], the affine algorithm with correntropy (ACICP) [51,52], and GeoTransformer (Geo) [33]. The simulation experiments include single objects and indoor scenes. In addition, we further test with real data.

4.1. Simulation Experiment

In this section, we use simulation experiments to verify the accuracy and robustness of the proposed algorithm. First, we use RGB-D data collected by Kinect [53,54,55], termed AffineMatch. Second, we add affine deformation and some noise to the point cloud data, and the data before and after the transformation are used for registration. Finally, since the ground truth is known, the relative errors are computed. In the experiments, $\{A^*, t^*\}$ is the ground truth, and the relative errors are defined as $\varepsilon_A = \|A - A^*\|_2 / \|A^*\|_2$ and $\varepsilon_t = \|t - t^*\|_2 / \|t^*\|_2$. The experimental results are shown in Table 2 and Figure 2, Figure 3 and Figure 4.
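The relative error metrics above can be computed directly; the following sketch assumes the spectral norm for the matrix norm, which is one reading of $\| \cdot \|_2$ (names are illustrative):

```python
import numpy as np

def relative_errors(A, t, A_star, t_star):
    """Relative errors of the simulation study:
    eps_A = ||A - A*||_2 / ||A*||_2 and eps_t = ||t - t*||_2 / ||t*||_2.
    Spectral norm assumed for matrices; a sketch, not the authors' code."""
    eps_A = np.linalg.norm(A - A_star, 2) / np.linalg.norm(A_star, 2)
    eps_t = np.linalg.norm(t - t_star) / np.linalg.norm(t_star)
    return eps_A, eps_t
```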
As shown in Table 2, comparing the proposed algorithm with the other six algorithms verifies that it outperforms them. For the ACICP algorithm, although the correntropy criterion ensures robustness to noise and outliers, it still cannot effectively align the two point sets without the correct correspondences. Although Geo [33] is currently an excellent deep learning registration method, its registration accuracy under this affine deformation is low. The proposed algorithm performs well on simulated datasets with weak geometric structures, yielding small errors. Figure 2, Figure 3, Figure 4 and Figure 5 show the results of simulation experiments on single objects, including books, bags, pillows, and airplanes. Clearly, our algorithm outperforms the ICP algorithm, the ICP algorithm with correntropy, and the affine algorithm with color information in terms of registration accuracy.

4.2. Indoor Scenes Experiment

In this section, we use indoor scene experiments to verify the accuracy and robustness of the proposed algorithm. The data are captured by a Microsoft Kinect sensor, which can simultaneously obtain RGB and depth images. We capture real data in indoor scenarios, and four point cloud pairs are used to perform registration experiments. The experimental results are shown in Table 3. As shown in Table 3, comparing the proposed algorithm with the other six algorithms verifies that it outperforms them. Figure 6 shows the results on indoor data, where the red points represent the source point set and the green points represent the target point set. The experiments show that our algorithm outperforms the other algorithms and can match more points, thus achieving more accurate registration. Moreover, to further verify that the proposed algorithm converges faster and obtains more accurate and robust registration results, we use indoor data to compare its convergence with the other five algorithms, as shown in Figure 7. We refer to our algorithm as Affine HCICP. It can be seen from the figure that our algorithm has a faster convergence speed, stronger accuracy, and robustness, which further proves its better performance.

4.3. Experiments with Real Data

In this section, we test on real data and further demonstrate that the proposed algorithm outperforms the other five algorithms. Since the real data have no accurate ground truth and contain noise and outliers, we use the root mean square (RMS) error to analyze the accuracy of the proposed algorithm and the other algorithms on real data. After registration, most points can find their corresponding points; based on extensive experiments, we select the closest 60% of the point pairs after registration to measure the registration accuracy. We then compute the root mean square error $\varepsilon$ over these corresponding point pairs. As shown in Table 4, we compare the different registration algorithms on two sets of real data. Our algorithm achieves higher matching accuracy and robustness than the other five algorithms.
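The trimmed RMS criterion described above (RMS over the closest 60% of pairs after registration) can be sketched as follows, assuming SciPy for the nearest-neighbour search (function and parameter names are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def trimmed_rms(P, D, keep=0.6):
    """Trimmed RMS for real data without ground truth: RMS over the
    closest `keep` fraction of point pairs. P is the transformed source
    set, D the target set; a sketch, not the authors' code."""
    dist, _ = cKDTree(D).query(P)       # distance to the nearest target point
    k = int(keep * len(dist))
    best = np.sort(dist)[:k]            # keep only the closest 60% of pairs
    return float(np.sqrt(np.mean(best ** 2)))
```

Trimming makes the metric insensitive to the worst 40% of pairs, which absorbs outliers and non-overlapping regions in the real scans.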
In addition, as shown in Figure 8, when tested on a real dataset, the proposed algorithm matches the two point sets well, while the registration accuracy of the other algorithms is lower. This verifies the accuracy and robustness of the proposed algorithm. Moreover, to further verify that the proposed algorithm converges faster on real datasets and obtains more accurate and robust registration results, we use real data to compare its convergence with the other five algorithms. As shown in Figure 9, the test on real data shows that our algorithm has faster convergence speed, stronger accuracy, and robustness, which further proves its better performance.

5. Conclusions

This paper proposes a precise affine registration algorithm with correntropy and color information. This algorithm can effectively handle the large noise and outliers and small deformation in the dataset in the registration algorithm. First, the algorithm introduces color features to establish more accurate and reliable correspondence. Second, this paper introduces the measurement of correntropy, which overcomes the influence of noise and outliers in the dataset, and further improves the registration accuracy. Experimental results demonstrate that our registration algorithm has higher registration accuracy and more stable robustness than other advanced algorithms. In the future, the robustness against illumination changes of the algorithm can be further studied.

Author Contributions

Conceptualization, L.L. and H.P.; methodology, L.L.; software, L.L.; validation, L.L.; formal analysis, L.L.; investigation, L.L.; resources, L.L.; data curation, L.L.; writing—original draft preparation, L.L.; writing—review and editing, L.L.; visualization, L.L.; supervision, H.P.; project administration, H.P.; funding acquisition, H.P. All authors have read and agreed to the published version of the manuscript.

Funding

Lianyungang International Automobile Green Intelligent Logistics Center Project

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Trained models and the algorithm can be made available upon reasonable request, following the instructions for the ICP algorithm [15], the ICP algorithm with color information (CICP) [36], the scaling ICP algorithm (SICP) [49], the affine ICP algorithm (AICP) [50], the affine algorithm with correntropy (ACICP) [51,52], and GeoTransformer (Geo) [33].

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Liang, L.X. Precise iterative closest point algorithm for RGB-D data registration with noise and outliers. Neurocomputing 2020, 399, 361–368. [Google Scholar] [CrossRef]
  2. Yang, Y.; Fan, D.; Du, S.; Wang, M.; Chen, B.; Gao, Y. Point set registration with similarity and affine transformations based on bidirectional KMPE loss. IEEE Trans. Cybern. 2021, 51, 1678–1689. [Google Scholar] [CrossRef] [PubMed]
  3. Du, S.; Guo, Y.; Sanroma, G.; Ni, D.; Wu, G.; Shen, D. Building dynamic population graph for accurate correspondence detection. Med. Image Anal. 2015, 26, 256–267. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Zhang, H.; Zeng, D.; Zhang, H.; Wang, J.; Liang, Z.; Ma, J. Applications of nonlocal means algorithm in low-dose X-ray CT image processing and reconstruction: A review. Med. Phys. 2017, 44, 1168–1185. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Li, Y.; Zhang, H.; Bermudez, C.; Chen, Y.; Landman, B.; Vorobeychik, Y. Anatomical context protects deep learning from adversarial perturbations in medical imaging. Neurocomputing 2020, 379, 370–378. [Google Scholar] [CrossRef] [PubMed]
  6. Buenaventura, J.R.S.; Kobayashi, J.T.; Valles, L.M.P.; Goma, J.C.D.; Balan, A.K.D. Classification of varietal type of philippine rice grains using image processing through multi-View 3D reconstruction. In Proceedings of the 2nd International Conference on Computing and Big Data, Taichung, Taiwan, 18–20 October 2019; pp. 140–144. [Google Scholar]
  7. Rebecq, H.; Gallego, G.; Mueggler, E.; Scaramuzza, D. EMVS: Event-based multi-view stereo—3D reconstruction with an event camera in real-time. Int. J. Comput. Vis. 2018, 126, 1394–1414. [Google Scholar] [CrossRef] [Green Version]
  8. Khot, T.; Agrawal, S.; Tulsiani, S.; Mertz, C.; Lucey, S.; Hebert, M. Learning unsupervised multi-view stereopsis via robust photometric consistency. arXiv 2019, arXiv:1905.02706. [Google Scholar]
  9. Chu, Y.; Lin, H.; Yang, L.; Diao, Y.; Zhang, D.; Zhang, S.; Fan, X.; Shen, C.; Xu, B.; Yan, D. Discriminative globality-locality preserving extreme learning machine for image classification. Neurocomputing 2020, 387, 13–21. [Google Scholar] [CrossRef]
  10. Mescheder, L.; Oechsle, M.; Niemeyer, M.; Nowozin, S.; Geiger, A. Occupancy networks: Learning 3D reconstruction in function space. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 16–20 June 2019; pp. 4460–4470. [Google Scholar]
  11. Lin, C.; Kumar, A. Contactless and partial 3D fingerprint recognition using multi-view deep representation. Pattern Recognit. 2018, 83, 314–327. [Google Scholar] [CrossRef]
  12. Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age. IEEE Trans. Robot. 2016, 32, 1309–1332. [Google Scholar] [CrossRef] [Green Version]
  13. Wan, T.; Du, S.; Cui, W.; Yao, R.; Ge, Y.; Li, C.; Gao, Y.; Zheng, N. RGB-D point cloud registration based on salient object detection. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 3547–3559. [Google Scholar] [CrossRef]
  14. Yousif, K.; Taguchi, Y.; Ramalingam, S. MonoRGBD-SLAM: Simultaneous localization and mapping using both monocular and RGBD cameras. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017; pp. 4495–4502. [Google Scholar]
  15. Besl, P.J.; McKay, N.D. Method for registration of 3-D shapes. Proceedings of Sensor Fusion IV: Control Paradigms and Data Structures, Boston, MA, USA, 14–15 November 1992; Volume 1611, pp. 586–606. [Google Scholar]
  16. Blais, G.; Levine, M.D. Registering multiview range data to create 3D computer objects. IEEE Trans. Pattern Anal. Mach. Intell. 1995, 17, 820–824. [Google Scholar] [CrossRef] [Green Version]
  17. Fitzgibbon, A.W. Robust registration of 2D and 3D point sets. Image Vis. Comput. 2003, 21, 1145–1153. [Google Scholar] [CrossRef]
  18. Sharp, G.C.; Lee, S.W.; Wehe, D.K. ICP registration using invariant features. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 90–102. [Google Scholar] [CrossRef] [Green Version]
  19. Chetverikov, D.; Stepanov, D.; Krsek, P. Robust Euclidean alignment of 3-D point sets: The trimmed iterative closest point algorithm. Image Vis. Comput. 2005, 23, 299–309. [Google Scholar] [CrossRef]
  20. Phillips, J.M.; Liu, R.; Tomasi, C. Outlier robust ICP for minimizing fractional RMSD. In Proceedings of the IEEE International Conference on 3-D Digital Imaging and Modeling (3DIM), Montreal, QC, Canada, 21–23 August 2007; pp. 427–434. [Google Scholar]
  21. Myronenko, A.; Song, X. Point set registration: Coherent point drift. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 2262–2275. [Google Scholar] [CrossRef] [Green Version]
  22. Yang, X.; Sheng, Y.; Sevigny, L.; Valin, P. Robust multisensor image registration with partial distance merits. In Proceedings of the Third International Conference on Information Fusion, Paris, France, 10–13 July 2000; Volume 11, pp. MOD3/23–MOD3/29. [Google Scholar]
  23. Du, S.; Liu, J.; Zhang, C.; Zhu, J.; Li, K. Probability iterative closest point algorithm for m-D point set registration with noise. Neurocomputing 2015, 157, 187–198. [Google Scholar] [CrossRef]
  24. Yu, H.; Qin, Z.; Hou, J.; Saleh, M.; Li, D.; Busam, B.; Ilic, S. Rotation-invariant transformer for point cloud matching. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 18–22 June 2023; pp. 5384–5393. [Google Scholar]
  25. Gu, X.; Tang, C.; Yuan, W.; Dai, Z.; Zhu, S.; Tan, P. RCP: Recurrent closest point for point cloud. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 19–24 June 2022; pp. 8216–8226. [Google Scholar]
  26. Hu, P.; Ho, E.S.; Munteanu, A. AlignBodyNet: Deep learning-based alignment of non-overlapping partial body point clouds from a single depth camera. IEEE Trans. Instrum. Meas. 2020, 72, 2502609. [Google Scholar] [CrossRef]
  27. Sun, L.; Zhang, Z.; Zhong, R.; Chen, D.; Zhang, L.; Zhu, L.; Wang, Q.; Wang, G.; Zou, J.; Wang, Y. A weakly supervised graph deep learning framework for point cloud registration. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5702012. [Google Scholar] [CrossRef]
  28. Zeng, A.; Song, S.; Nießner, M.; Fisher, M.; Xiao, J.; Funkhouser, T. 3dmatch: Learning local geometric descriptors from rgb-d reconstructions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1802–1811. [Google Scholar]
  29. Yew, Z.J.; Lee, G.H. RPM-Net: Robust point matching using learned features. In Proceedings of the IEEE/CVF conference on computer vision and pattern recognition, Virtual, 14–19 June 2020; pp. 11824–11833. [Google Scholar]
  30. Wang, Y.; Solomon, J.M. Deep closest point: Learning representations for point cloud registration. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Republic of Korea, 27 October–2 November 2019; pp. 3523–3532. [Google Scholar]
  31. Bari, A.S.M.H.; Gavrilova, M.L. OMNet: Learning overlapping mask for partial-to-partial point cloud registration. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Vitural, 11–17 October 2021; pp. 3133–3141. [Google Scholar]
  32. Xu, H.; Ye, N.; Liu, S.; Zeng, B.; Liu, S. FINet: Dual branches feature interaction for partial-to-partial point cloud registration. In Proceedings of the Thirty-Sixth AAAI Conference on Artificial Intelligence, Virtual Event, 22 February–1 March 2022; pp. 1–14. [Google Scholar]
  33. Qin, Z.; Yu, H.; Wang, C.; Peng, Y.; Xu, K. Geometric transformer for fast and robust point cloud registration. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LO, USA, 19–24 June 2022; pp. 11143–11152. [Google Scholar]
  34. Feldmar, J.; Rigid, N.A. Affine and locally affine registration of free-form surfaces. Int. J. Comput. Vis. 1996, 18, 99–119. [Google Scholar] [CrossRef] [Green Version]
  35. Amberg, B.; Romdhani, S.; Vetter, T. Optimal step nonrigid ICP algorithms for surface registration. Proceedings of IEEE International Conference on Computer Vision and Pattern Recognition, Minneapolis, MN, USA, 17–22 June 2007; pp. 1–8. [Google Scholar]
  36. Du, S.Y.; Zheng, N.N.; Meng, G.F.; Yuan, Z.J. Affine registration of point sets using ICP and ICA. IEEE Signal Process. Lett. 2008, 15, 689–692. [Google Scholar]
  37. Wan, T.; Du, S.; Xu, Y.; Xu, G.; Li, Z.; Chen, B.; Gao, Y. RGB-D point cloud registration via infrared and color camera. Multimed. Tools Appl. 2019, 78, 33223–33246. [Google Scholar] [CrossRef]
  38. Men, H.; Gebre, B.; Pochiraju, K. Color point cloud registration with 4D ICP algorithm. Proceedings of IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 1511–1516. [Google Scholar]
  39. Korn, M.; Holzkothen, M.; Pauli, J. Color supported generalized-ICP. In Proceedings of the International Conference on Computer Vision Theory and Applications, Lisbon, Portugal, 5–8 January 2014; Volume 3, pp. 592–599. [Google Scholar]
  40. Du, S.; Zheng, N.; Ying, S.; Liu, J. Affine iterative closest point algorithm for point set registration. Pattern Recognit. Lett. 2010, 31, 791–799. [Google Scholar] [CrossRef]
  41. Cui, W.; Liu, J.; Du, S.; Liu, Y.; Wan, T.; Han, M.; Mou, Q.; Yang, J.; Gou, Y. Individual retrieval based on oral cavity point cloud data and correntropy-based registration algorithm. IET Image Process. 2020, 14, 2675–2681. [Google Scholar] [CrossRef]
  42. Cui, W.; Du, S.; Wan, T.; Yao, R.; Liu, Y.; Han, M.; Mou, Q.; Gou, Y.; Zheng, N. Robust and precise isotropic scaling registration algorithm using bi-directional distance and correntropy. Pattern Recognit. Lett. 2020, 138, 298–304. [Google Scholar] [CrossRef]
  43. Wang, H.; Xie, L. Convergence analysis of a least squared algorithm of linear switched identification. J. Control. Decis. 2020, 7, 379–396. [Google Scholar] [CrossRef]
  44. Ram, P.; Sinha, K. Revisiting kd-tree for Nearest Neighbor Search. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Anchorage, AK, USA, 4–8 August 2019; pp. 1378–1388. [Google Scholar]
  45. Hou, W.; Li, D.; Xu, C.; Zhang, H.; Li, T. An advanced k nearest neighbor classification algorithm based on KD-tree. In Proceedings of the International Conference of Safety Produce Informatization, Chongqing, China, 10–12 December 2018; pp. 902–905. [Google Scholar]
  46. Yin, L.; Andrews, J.; Heaton, T. Reducing process delays for real-time earthquake parameter estimation—An application of KD tree to large databases for Earthquake Early Warning. Comput. Geosci. 2018, 114, 22–29. [Google Scholar] [CrossRef]
  47. Du, S.; Zhang, C.; Wu, Z.; Liu, J.; Xue, J. Robust isotropic scaling ICP algorithm with bidirectional distance and bounded rotation angle. Neurocomputing 2016, 215, 160–168. [Google Scholar] [CrossRef]
  48. Du, S.; Bi, B.; Xu, G.; Zhu, J.; Zhang, X. Robust non-rigid point set registration via building tree dynamically. Multimed. Tools Appl. 2017, 76, 12065–12081. [Google Scholar] [CrossRef]
  49. Wu, Z.; Chen, H.; Du, S.; Fu, M.; Zhou, N.; Zheng, N. Correntropy based scale ICP algorithm for robust point set registration. Pattern Recognit. 2019, 93, 14–24. [Google Scholar] [CrossRef]
  50. Dong, J.; Cai, Z.; Du, S. Improvement of affine iterative closest point algorithm for partial registration. IET Comput. Vis. 2016, 11, 135–144. [Google Scholar] [CrossRef]
  51. Du, S.; Xu, G.; Zhang, S.; Zhang, X.; Gao, Y.; Chen, B. Robust rigid registration algorithm based on pointwise correspondence and correntropy. Pattern Recognit. Lett. 2020, 132, 91–98. [Google Scholar] [CrossRef]
  52. Chen, H.; Zhang, X.; Du, S.; Wu, Z.; Zheng, N. A correntropy-based affine iterative closest point algorithm for robust point set registration. IEEE/CAA J. Autom. Sin. 2019, 6, 981–991. [Google Scholar] [CrossRef]
  53. Wang, L.; Huynh, D.Q.; Koniusz, P. A comparative review of recent Kinect-based action recognition algorithms. IEEE Trans. Image Process. 2019, 29, 15–28. [Google Scholar] [CrossRef] [Green Version]
  54. Su, Y.; Gao, W.; Liu, Z.; Sun, S.; Fu, Y. Hybrid marker-based object tracking using Kinect v2. IEEE Trans. Instrum. Meas. 2020, 69, 6436–6445. [Google Scholar] [CrossRef]
  55. Bari, A.S.M.H.; Gavrilova, M.L. Artificial neural network based gait recognition using Kinect sensor. IEEE Access 2019, 7, 162708–162722. [Google Scholar] [CrossRef]
Figure 1. The registration result. (a) The original datasets. (b) The affine ICP registration result.
Figure 2. The registration results of the simulation experiment by different methods. (a) The original datasets. (b) ICP. (c) SICP. (d) CICP. (e) The affine ICP algorithm. (f) ACICP. (g) Ours.
Figure 3. The registration results of the simulation experiment by different methods. (a) The original datasets. (b) ICP. (c) SICP. (d) CICP. (e) The affine ICP algorithm. (f) ACICP. (g) Ours.
Figure 4. The registration results of the simulation experiment by different methods. (a) The original datasets. (b) ICP. (c) SICP. (d) CICP. (e) The affine ICP algorithm. (f) ACICP. (g) Ours.
Figure 5. The registration results of the simulation experiment by different methods. (a) The original datasets. (b) ICP. (c) SICP. (d) CICP. (e) The affine ICP algorithm. (f) ACICP. (g) Ours.
Figure 6. The registration results of the indoor scenes experiment by different methods. (a) The original datasets. (b) ICP. (c) SICP. (d) CICP. (e) The affine ICP algorithm. (f) ACICP. (g) Ours.
Figure 7. Comparison of the RMS convergence results of indoor scene data with different algorithms.
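The convergence curves in Figures 7 and 9 plot an RMS alignment error against iteration count. As a minimal illustrative sketch (the function name and the assumption that source and target points are already matched pairwise are ours, not from the paper), the plotted quantity can be computed as:

```python
import numpy as np

def rms_error(src, dst, A, t):
    """RMS distance between affinely transformed source points and their
    already-matched target points -- the quantity typically plotted
    against iteration count in ICP convergence curves."""
    residuals = src @ A.T + t - dst   # per-point alignment residuals
    return np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))
```

Evaluating this after each iteration of the registration loop produces one point of a convergence curve such as those in Figure 7.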
Figure 8. The registration results of the real data by different methods. (a) The original datasets. (b) ICP. (c) SICP. (d) CICP. (e) The affine ICP algorithm. (f) ACICP. (g) Ours.
Figure 9. Comparison of the RMS convergence results of real data with different algorithms.
Table 1. The precise affine registration algorithm with color information and correntropy.

Input: two point sets M and D.
Initialization: the initial affine transformation (A0, t0).
Repeat the following two steps:
Step 1: using the transformation from step (k − 1), establish the correspondence between the two point sets via Equation (5).
Step 2: calculate (Ak, tk) via Equation (6).
End
Output: (Ak, tk)
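The two-step loop of Table 1 can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the 6-D color-augmented nearest-neighbour search, the `color_weight` and `sigma` parameters, and the closed-form correntropy-weighted least-squares fit are all choices made for the sketch.

```python
import numpy as np

def affine_icp_color_correntropy(M, D, M_rgb, D_rgb,
                                 color_weight=1.0, sigma=1.0, iters=30):
    """Sketch of the Table 1 loop (hypothetical parameters).
    M, D: (n, 3) / (m, 3) point sets; M_rgb, D_rgb: their RGB colors."""
    A, t = np.eye(3), np.zeros(3)                 # initialization (A0, t0)
    Fd = np.hstack([D, color_weight * D_rgb])     # 6-D target features
    Mh = np.hstack([M, np.ones((len(M), 1))])     # homogeneous source points
    for _ in range(iters):
        Mk = M @ A.T + t                          # currently transformed model
        # Step 1: correspondences via joint geometry + color distance
        Fm = np.hstack([Mk, color_weight * M_rgb])
        idx = ((Fm[:, None, :] - Fd[None, :, :]) ** 2).sum(-1).argmin(1)
        Dk = D[idx]
        # Step 2: Gaussian-kernel (correntropy) weights suppress noisy
        # and outlying pairs, then a weighted least-squares affine fit
        w = np.exp(-((Mk - Dk) ** 2).sum(1) / (2 * sigma ** 2))
        sw = np.sqrt(w)[:, None]
        X, *_ = np.linalg.lstsq(Mh * sw, Dk * sw, rcond=None)
        A, t = X[:3].T, X[3]
    return A, t
```

The color term only influences the correspondence search (Step 1); the transformation estimate (Step 2) stays purely geometric, with the correntropy weights doing the outlier suppression.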
Table 2. Comparison of errors with simulation data by different methods.

| Data | Error | ICP | SICP | CICP | AICP | ACICP | Geo | Ours |
|---|---|---|---|---|---|---|---|---|
| 1 | ε_A | 4.2404 | 2.4567 | 4.2396 | 0.8909 | 0.4171 | 4.1203 | 0.0027 |
| 1 | ε_t | 0.0047 | 0.0063 | 0.0046 | 0.0076 | 0.0024 | 0.0049 | 0.0001 |
| 2 | ε_A | 2.3110 | 2.7573 | 2.3110 | 9.1136 | 185.5356 | 2.2430 | 0.0017 |
| 2 | ε_t | 0.0046 | 0.2191 | 0.0046 | 3.7153 | 73.5416 | 0.0045 | 0.0017 |
| 3 | ε_A | 2.4192 | 2.3792 | 2.4047 | 0.5648 | 1.3316 | 2.3182 | 0.0002 |
| 3 | ε_t | 0.0405 | 0.1401 | 0.0338 | 0.0533 | 0.7607 | 0.0389 | 9.9 × 10⁻⁶ |
| 4 | ε_A | 2.3729 | 2.3284 | 2.3424 | 2.7484 | 1.4589 | 2.2894 | 0.0002 |
| 4 | ε_t | 0.0187 | 0.1071 | 0.0070 | 2.8215 | 1.1970 | 0.0172 | 0.0001 |
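Tables 2 and 3 report two transformation errors, ε_A and ε_t. A plausible reading, assumed here rather than stated in this excerpt, is the Frobenius norm of the difference between estimated and ground-truth affine matrices and the Euclidean norm of the translation difference:

```python
import numpy as np

def transform_errors(A_est, t_est, A_true, t_true):
    """Assumed error metrics for Tables 2 and 3: eps_A as the Frobenius
    norm of the affine-matrix difference, eps_t as the Euclidean norm of
    the translation difference (a common convention, not confirmed here)."""
    eps_A = np.linalg.norm(A_est - A_true, ord='fro')
    eps_t = np.linalg.norm(t_est - t_true)
    return eps_A, eps_t
```

Under this reading, the "Ours" column's values of 10⁻⁴ to 10⁻⁶ against competitors' 10⁻¹ to 10² correspond to the roughly tenfold (or better) error reduction claimed in the abstract.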
Table 3. Comparison of errors with indoor datasets by different methods.

| Data | Error | ICP | SICP | CICP | AICP | ACICP | Geo | Ours |
|---|---|---|---|---|---|---|---|---|
| 1 | ε_A | 2.4466 | 2.0499 | 2.4452 | 0.0188 | 0.4177 | 2.4424 | 0.0131 |
| 1 | ε_t | 0.0055 | 0.0048 | 0.0005 | 0.0076 | 0.0002 | 0.0047 | 0.0003 |
| 2 | ε_A | 4.2451 | 1.5872 | 4.1448 | 0.0257 | 9.2455 | 4.3012 | 0.0248 |
| 2 | ε_t | 0.0089 | 0.0252 | 0.0086 | 0.0006 | 0.2515 | 0.0078 | 0.0004 |
| 3 | ε_A | 2.4497 | 1.8683 | 2.4526 | 0.0767 | 0.0728 | 2.3342 | 0.0542 |
| 3 | ε_t | 0.0010 | 0.0021 | 0.0012 | 0.0003 | 0.0002 | 0.0009 | 0.0002 |
| 4 | ε_A | 4.0085 | 1.8849 | 4.0209 | 0.0893 | 0.0979 | 4.0143 | 0.0570 |
| 4 | ε_t | 0.0027 | 0.0245 | 0.0032 | 0.0007 | 0.0010 | 0.0031 | 0.0007 |
Table 4. Comparison of errors by different methods.

| Method | Error ε of Data 1 | Error ε of Data 2 |
|---|---|---|
| ICP | 1.7 × 10⁻⁵ | 2.0 × 10⁻⁴ |
| SICP | 1.7 × 10⁻⁵ | 2.0 × 10⁻⁴ |
| CICP | 8.9 × 10⁻⁵ | 3.4 × 10⁻⁴ |
| AICP | 3.1 × 10⁻³ | 1.5 × 10⁻³ |
| ACICP | 8.8 × 10⁻⁵ | 1.1 × 10⁻⁴ |
| Ours | 8.2 × 10⁻⁶ | 6.7 × 10⁻⁶ |
Liang, L.; Pei, H. Affine Iterative Closest Point Algorithm Based on Color Information and Correntropy for Precise Point Set Registration. Sensors 2023, 23, 6475. https://doi.org/10.3390/s23146475
