Abstract

In recent years, 3D laser scanning technology has been applied to tunnel engineering. Although it is more intelligent than traditional measurement technology, estimating the real-time deformation of NATM tunnel excavation from laser detection and ranging point clouds remains challenging. To further improve the measurement accuracy of 3D laser scanning technology in the tunnel construction process, this paper proposes an improved Kriging filtering algorithm. Considering the spatial correlation of the described object, the optimization of point cloud grid filtering is studied. By analyzing the full-space deformation field of the tunnel lining, the deformation information of the measuring points on the tunnel lining surface is extracted. Based on an actual project, an on-site monitoring comparison test is carried out in which the three-dimensional laser point cloud data are gridded and analyzed, and the deformation data obtained from the test are compared with the data measured by traditional methods. The experimental results show that the Kriging filtering algorithm can not only efficiently identify and extract the tunnel profile visualization data but also obtain the tunnel deformation efficiently and accurately. The measurement results obtained with the proposed technology agree well with those obtained by traditional monitoring methods. Therefore, tunnel deformation monitoring based on 3D laser scanning technology can better reflect the evolution of the tunnel full-space deformation field under certain environmental conditions and can provide an effective safety warning for tunnel construction.

1. Introduction

Real-time and accurate monitoring of tunnel surrounding rock deformation and scientific prediction of the stability of the surrounding rock in front of the working face are important means of guiding on-site construction and feeding back into the design, as well as necessary conditions for ensuring tunnel construction safety. How to innovate deformation monitoring methods, improve the level of informatization and intelligence, and improve the accuracy of surrounding rock deformation monitoring is the key to ensuring the safety of tunnel construction. At present, tunnel deformation monitoring and measurement still mainly rely on total stations, convergence meters, and other instruments, for which monitoring points (reflectors, etc.) need to be arranged on the tunnel wall in time. This traditional monitoring and measurement method is cumbersome in practice, its efficiency is extremely low, and its errors are not easy to control, so incorrect judgments of surrounding rock stability are easily made during monitoring, making tunnel construction unsafe. 3D laser scanning compensates for the shortcomings of traditional measurement methods [1–3]. It can scan the tunnel lining surface with high precision, high density, and high speed to obtain the 3D information of the whole tunnel surface, and through data extraction and conversion, the tunnel deformation can be accurately measured.

In recent years, the application of 3D laser scanning technology to tunnel monitoring and measurement has been an important research direction, and many scholars have made achievements in the exploration and research of this technology [4–9]. Based on mobile laser scanning technology, many scholars at home and abroad have designed building information models (BIMs) to develop deformation monitoring systems for tunnels and other retaining structures [10, 11]. At the same time, 3D laser scanning technology can not only be used to design structural positioning and orientation systems [12–17] but also to create 3D building information models (BIMs) [18, 19] based on such orientation systems. Moreover, by optimizing the deformation data extraction and improving the registration algorithm [20–23], the accuracy of tunnel 3D modeling and deformation monitoring can be effectively improved [14, 24–26]. In addition, research on intelligent identification and long-term stability evaluation of geotechnical structures based on 3D laser point clouds has also made great progress [27–29]. The above research shows that the measurement accuracy of the highest-accuracy 3D laser scanners is sufficient to replace the total station in many applications and can meet the accuracy requirements of tunnel deformation monitoring.

At present, a large number of studies show that the point cloud filtering algorithm is key to whether the accuracy of 3D laser scanning meets the application requirements [30–33]; therefore, it is necessary to study point cloud filtering. According to their different theoretical backgrounds, the filtering algorithms of recent years can be divided into six categories: slope-based, surface-fitting-based, segmentation-based, irregular-triangulation-based, morphology-based, and machine-learning-based [34]. The slope-based filtering algorithm [33, 35] is simple in principle and easy to implement, but it relies too heavily on the threshold setting, is not suitable for real-time processing of massive data in areas with large terrain fluctuations, and its filtering performance is limited. The filtering algorithm based on surface fitting [32, 36, 37] depends on the choice of interpolation method, and the multilevel iterative procedure it adopts is affected by the filtering results of each level, which is prone to error transmission and accumulation. The filtering effect of the segmentation-based filtering algorithm [38, 39] depends too strongly on the results of clustering segmentation, and the choice of point cloud segmentation method also has a significant impact on the filtering results. The irregular triangulation network algorithm [40] requires considerable memory and is sensitive to low-level noise, making it easy to misjudge low-lying objects. The filtering algorithm based on morphology [41, 42] has a simple principle and high implementation efficiency, but its robustness in areas with sizeable topographic relief needs to be improved, and how to improve its overall accuracy will be the focus of follow-up research. The filtering algorithm based on machine learning [43] needs a large number of training samples, and the samples must cover all possible terrain features, so its computational requirements are high and it is challenging to obtain an excellent filtering effect.

In summary, all the above point cloud filtering algorithms have limitations, and how to improve the filtering algorithm and its accuracy is the key to the success of the monitoring experiment in this paper. The Kriging interpolation algorithm offers high accuracy in fast, automatic DEM generation, and its overall accuracy can be evaluated in an operable way. Therefore, this paper combines the Kriging interpolation algorithm with field test section monitoring to extract and analyze point cloud data, compares the deformation data obtained from the test with the measurement data of traditional methods to verify the accuracy of the algorithm, and provides a theoretical basis for promoting the application of three-dimensional laser scanning technology in tunnel monitoring and measurement.

2. Kriging Filtering Algorithm

The Kriging interpolation method is a prediction method that uses structural analysis and the variogram to calculate the optimal, unbiased estimate of a regionalized variable over a limited space. It is a linear, unbiased, optimal estimation algorithm for studying spatial variation and spatial interpolation. The Kriging interpolation method considers not only the relationship between the position of the estimated point and the locations of the known data but also the spatial correlation of the variables. It is assumed that the regionalized variables are not independent of each other, exhibit both randomness and structural characteristics, and satisfy the second-order stationarity and intrinsic stationarity conditions. At the same time, the Kriging method [44, 45] takes the spatial correlation of the described object into account in the data gridding process, which makes the interpolation result more scientific and closer to the actual situation, and it provides the interpolation error (the Kriging variance), so that the reliability of the interpolation is clear at a glance. In this paper, classical Kriging spatial interpolation is extended and applied to point cloud grid filtering analysis to improve the accuracy of 3D laser scanning data acquisition.

The basic mathematical model of the Kriging interpolation method is as follows:

$$\hat{Z}(x_0) = \sum_{i=1}^{n} \lambda_i Z(x_i), \qquad (1)$$

where $\hat{Z}(x_0)$ denotes the estimated value at the predicted point $x_0$; $Z(x_i)$ denotes the values of the $n$ reference points in the region adjacent to the predicted point that are involved in the prediction; and $\lambda_i$ denotes the Kriging weight coefficients, which are determined from the variogram under the conditions of unbiasedness and minimum estimation variance.

Equation (1) is a linear combination of the $n$ known values. The principle of the Kriging method is to ensure that the estimator $\hat{Z}(x_0)$ is unbiased, i.e., $E[\hat{Z}(x_0)] = E[Z(x_0)]$, and to calculate the $n$ weight coefficients $\lambda_i$ under the premise that the estimation variance $\sigma_E^2 = E\left[\hat{Z}(x_0) - Z(x_0)\right]^2$ is minimized.

Under the unbiasedness condition $\sum_{i=1}^{n}\lambda_i = 1$, obtaining the minimum estimation variance becomes a conditional-extremum problem, which is solved by the Lagrangian multiplier method.

The Lagrangian function

$$F = \sigma_E^2 - 2\mu\left(\sum_{i=1}^{n}\lambda_i - 1\right)$$

is a function of the $n$ weight coefficients $\lambda_i$ and the Lagrange multiplier $\mu$, i.e., of $(n+1)$ unknowns. Setting the partial derivatives of $F$ with respect to $\lambda_i$ and $\mu$ to zero yields the Kriging equation set expressed in terms of the semi-variogram:

$$\begin{cases}\displaystyle\sum_{j=1}^{n}\lambda_j\,\gamma(x_i, x_j) + \mu = \gamma(x_i, x_0), & i = 1, 2, \ldots, n,\\[2mm] \displaystyle\sum_{j=1}^{n}\lambda_j = 1,\end{cases}$$

where $\gamma(x_i, x_j)$ is the semi-variogram value between sampling points $x_i$ and $x_j$, and $\gamma(x_i, x_0)$ is that between sampling point $x_i$ and the point to be estimated $x_0$.

The semi-variogram (also called the semi-variance function) is derived from the variance concept in spatial statistics. For a regionalized variable $Z(x)$, one half of the variance of the difference between the values $Z(x)$ and $Z(x+h)$ at points $x$ and $x + h$ is defined as the semi-variogram of $Z(x)$ along the $x$ direction, denoted as $\gamma(h)$:

$$\gamma(h) = \frac{1}{2}\,\mathrm{Var}\!\left[Z(x) - Z(x+h)\right] = \frac{1}{2}\,E\!\left[Z(x) - Z(x+h)\right]^2.$$

The computational formula for the experimental (trial) semi-variogram is

$$\gamma^{*}(h) = \frac{1}{2N(h)}\sum_{i=1}^{N(h)}\left[Z(x_i) - Z(x_i + h)\right]^2, \qquad (6)$$

where $N(h)$ is the number of point pairs separated by the lag distance $h$.
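Purely as an illustration of how equation (6) can be computed (the paper's own implementation is a MATLAB program, so the function name and binning strategy below are our assumptions), the experimental semi-variogram can be estimated with a short NumPy routine:

```python
import numpy as np

def experimental_semivariogram(points, values, n_bins=15):
    """Estimate the experimental semi-variogram gamma*(h) by distance binning.

    points : (n, d) array of sample coordinates
    values : (n,) array of the regionalized variable Z at those points
    Returns one mean lag distance and one semi-variogram estimate per bin.
    """
    n = len(values)
    # All point pairs (i < j): pairwise distances and squared value differences
    i, j = np.triu_indices(n, k=1)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    sq_diff = (values[i] - values[j]) ** 2

    # Sort the pairs by distance and split them into groups of (roughly) equal size
    order = np.argsort(d)
    d, sq_diff = d[order], sq_diff[order]
    groups = np.array_split(np.arange(len(d)), n_bins)

    lags = np.array([d[g].mean() for g in groups])
    # gamma*(h) = 1 / (2 N(h)) * sum of squared differences within the group
    gammas = np.array([sq_diff[g].mean() / 2.0 for g in groups])
    return lags, gammas
```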

To estimate the unknown values of the regionalized variable, it is necessary to fit the experimental semi-variogram to a corresponding theoretical semi-variogram model. The semi-variogram models provided by typical Kriging algorithms include the Gaussian, linear, spherical, damped sine, and exponential models. Near the origin, the spherical and exponential models behave linearly, while the Gaussian model behaves parabolically. Considering that this paper deals with the deformation monitoring of the tunnel, the spherical model is adopted. In this way, both the randomness and the spatial correlation of the modeled parameters are taken into account, and the best linear unbiased interpolation and its interpolation variance are obtained under the condition of minimum interpolation variance. The specific model expression is as follows:

$$\gamma(h)=\begin{cases}0, & h = 0,\\[1mm] C_0 + C\left(\dfrac{3h}{2a}-\dfrac{h^{3}}{2a^{3}}\right), & 0 < h \le a,\\[2mm] C_0 + C, & h > a,\end{cases} \qquad (7)$$

where $C_0$ denotes the nugget, $C_0 + C$ denotes the sill, $C$ denotes the partial sill, $a$ denotes the range, and $h$ denotes the lag distance. The filtering algorithm flow is shown in Figure 1, and the specific steps are as follows (a code sketch of these steps is given after the list):

(1) Calculate the distances between all pairs of sample points.
(2) Sort all distances from smallest to largest and divide them into n groups, each containing a certain number of distance values.
(3) Calculate the average distance of each group of distance values and use equation (6) to calculate the estimated value of the experimental semi-variogram for that group.
(4) Select the semi-variogram model of equation (7), namely, the spherical model, and fit it to the experimental values to obtain the model parameters and the expression for $\gamma(h)$.
(5) Substitute the distances between the sampling points, and between the sampling points and the point to be interpolated, into the fitted variogram and calculate the coefficient matrices on the left- and right-hand sides of the Kriging equation set.
(6) For each point to be interpolated, solve the Kriging equation set to obtain the corresponding weight coefficients.
(7) Substitute the weights obtained in the previous step into equation (1) to obtain the estimated value at the point to be interpolated.
(8) Repeat steps (6) and (7) to calculate the estimated values of all interpolation points.
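A minimal Python/NumPy sketch of steps (4)–(8) is given below, assuming the spherical model of equation (7) and using scipy.optimize.curve_fit for the model fitting; the paper's own implementation is a MATLAB program, so the function names here are illustrative assumptions rather than the authors' code:

```python
import numpy as np
from scipy.optimize import curve_fit

def spherical(h, nugget, psill, a):
    """Spherical semi-variogram model of equation (7)."""
    h = np.asarray(h, dtype=float)
    inside = nugget + psill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    gamma = np.where(h <= a, inside, nugget + psill)
    return np.where(h == 0.0, 0.0, gamma)          # gamma(0) = 0 by definition

def fit_spherical(lags, gammas):
    """Step (4): fit nugget C0, partial sill C, and range a to the experimental values."""
    p0 = [0.0, float(gammas.max()), float(lags.max())]         # rough initial guess
    bounds = ([0.0, 0.0, 1e-6], [np.inf, np.inf, np.inf])
    params, _ = curve_fit(spherical, lags, gammas, p0=p0, bounds=bounds)
    return params                                               # (C0, C, a)

def ordinary_kriging(points, values, query, model_params):
    """Steps (5)-(8): ordinary Kriging estimate at each query point."""
    n = len(values)
    # Left-hand coefficient matrix: gamma between all sample pairs plus the
    # unbiasedness row/column associated with the Lagrange multiplier
    d_ss = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = spherical(d_ss, *model_params)
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    A[n, n] = 0.0

    estimates = np.empty(len(query))
    for k, q in enumerate(query):
        # Right-hand side: gamma between the samples and the point to be estimated
        b = np.empty(n + 1)
        b[:n] = spherical(np.linalg.norm(points - q, axis=1), *model_params)
        b[n] = 1.0
        w = np.linalg.solve(A, b)        # weights lambda_i and the multiplier mu
        estimates[k] = w[:n] @ values    # equation (1)
    return estimates
```

Chained with the experimental semi-variogram routine above, fit_spherical supplies the model parameters and ordinary_kriging returns the gridded estimates used in the filtering analysis.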

3. On-Site Monitoring Test Plan

In this paper, a Zoller+Fröhlich IMAGER 5010X 3D laser scanner is applied to the monitoring measurement of tunnel engineering in the management and living area of the Changsha municipal solid waste treatment site, and the measured data are compared with those of the traditional monitoring measurement method. The German Z+F IMAGER 5010X 3D laser scanner has a data acquisition rate of more than 1 million points per second, a highest accuracy of 0.8 mm at 50 meters, and a range of 0.3–187.3 meters. It is a high-performance, high-precision flagship scanner that is well suited to high-precision industrial surveys and tunnel engineering surveys. The precision information of the 3D laser scanner is shown in Table 1.

The higher the scanning quality of a 3D laser scanner, the lower the data acquisition rate and the lower the ranging noise; the higher the scanning resolution, the higher the data acquisition rate and the higher the ranging noise. For setting the resolution and quality, the following simple rules can be followed (an illustrative encoding of these rules is sketched after the list):

(1) In the vast majority of cases, choose "High" resolution, which keeps the point spacing and point count within a reasonable range.
(2) When scanning over a long distance (>50 m) or when fine detail is required, choose "Superhigh."
(3) The ultrahigh and extremely high settings should only be used when scanning local details at long distances.
(4) In the vast majority of cases, choose "Normal" quality, which keeps the scanning time and data quality within a suitable range.
(5) When scanning time is the first priority, select "Low" quality.
(6) The high and premium settings should only be used for long-distance scanning (>100 m), low reflectivity, and high data quality requirements (such as when generating smooth mesh models from point clouds).
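Purely as an illustration, these rules can be encoded in a small helper function; the option strings and parameter names below simply mirror the wording of the list and are not the actual setting names of the Z+F scanning software:

```python
def choose_scan_settings(distance_m, fine_detail=False,
                         time_critical=False, low_reflectivity=False):
    """Illustrative encoding of the resolution/quality rules listed above."""
    # Resolution: "High" in most cases; "Superhigh" for long range or fine detail;
    # the ultra settings are reserved for local details at long distances.
    if distance_m > 50 or fine_detail:
        resolution = "Superhigh"
    else:
        resolution = "High"

    # Quality: "Normal" in most cases; "Low" when scanning time is the priority;
    # high/premium only for long range, low reflectivity, or demanding outputs.
    if time_critical:
        quality = "Low"
    elif distance_m > 100 or low_reflectivity:
        quality = "High"
    else:
        quality = "Normal"
    return resolution, quality
```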

Therefore, within the range of 25 meters, with good target quality and reflection conditions, the measurement accuracy of the scanner system can reach approximately 0.5 mm, fully meeting the measurement requirements.

3.1. Project Overview

The tunnel site is located in the open space on the west side of the Heimifeng solid waste treatment plant in Qiaoyi Town, Wangcheng District, Changsha City, in the southern part of the Wangxiang rock base. The area is mainly composed of eroded structural landforms and belongs to the low-mountain hilly landform, characterized chiefly by low hills and gullies. The mountain veins are clear, the mountain tops are generally smooth (some are sharp), most of the ridges are open and gentle, and some parts are narrow and form steep ridges. The tunnel runs from pile number K0 + 195 to K0 + 285, with a length of 90 m. It is a single-lane highway tunnel with a single-center curved sidewall structure, a designed inner contour radius of 4.24 meters, and a reinforced concrete composite lining structure. According to the survey results, the site's terrain fluctuates widely, and the hillside is relatively steep. The strata distributed in the site are mainly Quaternary eluvium, and the underlying bedrock is late Yanshanian granite.

3.2. Test Scheme

The implementation plan of ground-based 3D laser scanning needs to be determined according to the measurement tasks, requirements, and site conditions, including the selection of the coordinate system, the scanner and its registration targets, and the scanning stations. According to the implementation plan, the targets are laid out in advance, the equipment is then connected and the scanning parameters (such as scanning range, scanning distance, and scanning interval) are set, and scanners are set up at the different stations for scanning.

3.2.1. Layout and Measurement of Control Points

In this tunnel engineering monitoring test, due to the site conditions, the control points can only be set on one side (as shown in Figure 2). Therefore, three homemade planar targets (as shown in Figure 3) are arranged at the tunnel portal as control coordinate conversion points; their positions are relatively fixed and not easily damaged by construction. After the control conversion points are arranged, the target center coordinates in the construction control coordinate system are measured by the total station for subsequent conversion. For the common points used in the conversion between stations, standard spherical targets (as shown in Figure 3) are selected because they are not affected by the incident angle and have high reflectivity; three to four spherical targets are arranged between two adjacent stations and are precisely scanned from both stations.

3.2.2. Data Acquisition

Field measurement is the process of actual data acquisition. The quality of point cloud data is directly affected by scanning distance, surface material, control network, target measurement accuracy, spot size, scanning point spacing, point cloud splicing accuracy, total reflection material, and external environment. Similar to other optical and electronic instruments, the external environment temperature, air pressure, air quality, and other factors significantly impact the laser echo signal. Therefore, in view of the overly complex construction environment in the tunnel, to avoid the impact on the quality of data acquisition, the period with high air visibility in the tunnel is selected for the test.

Before the on-site scanning test, the instrument positions and target positions in the tunnel were determined to ensure that the scanning would not be interrupted or blocked. The target balls (as shown in Figure 3) are set up, the scanner is set up, the scanning parameters are set, and the scanning operation is then carried out. During the scanning operation, the target balls must be kept fixed; touching or misoperation by workers would change a ball's position and affect the subsequent point cloud registration accuracy. Because the object of the test scanning is the target points on the surface of the tunnel lining over a wide range, a coarse panoramic pre-scan is unnecessary: a panoramic scan of the scene is performed directly first, and the target balls are then precisely scanned (high-precision, high-quality scanning) on this basis. The on-site layout plan is shown in Figures 4–6.

3.3. Data Processing

This section mainly introduces the basic data processing process, data extraction algorithms, and filtering processing methods. The specific details are as follows.

3.3.1. Basic Process of Data Processing

For the point cloud data obtained by the 3D laser scanner, not every data point can be used. Too many data points reduce the efficiency of computer operation and increase the storage space. To avoid these problems, some data points need to be deleted to simplify the point cloud. The main steps of point cloud data processing are as follows:

(1) Point Cloud Editing. Remove gross errors and point clouds that are not related to the target.
(2) Point Cloud Registration and Mosaic. The common point coordinates are used to transform the point cloud data measured at different stations into the same coordinate system to realize point cloud splicing and form a whole. At present, there are three methods of point cloud registration: artificial target registration, point cloud self-registration, and control coordinate system registration.
(3) Establishment of a Topological Relationship. Point clouds are usually isolated points, and each point is only related to its surrounding points within a specific range. The main methods for establishing the spatial topology of a point cloud are the octree method, the grid method, and the k-d tree method.
(4) Point Cloud Data Reduction. Point cloud data reduction algorithms can be divided into four categories: the bounding box method, the random sampling method, the curvature sampling method, and the uniform grid method (a minimal sketch of the uniform grid method is given after this list).
(5) Denoising and Smoothing of Point Cloud Data. Due to surface defects such as the roughness and ripple of the measured object and the influence of the measurement system itself, the real data contain noise points. An appropriate filtering algorithm can be selected flexibly according to the point cloud quality and the subsequent modeling requirements.
(6) Repair of Holes. In the process of laser scanning, point cloud holes are formed for various reasons (such as local occlusion). There are two kinds of repair algorithms: repairing the triangular mesh surface during its reconstruction, and repairing the holes of the scattered point cloud data first and then reconstructing the triangular mesh surface.
(7) Segmentation of Point Cloud Data. The subregions of different surface types formed by data segmentation have the characteristics of a single feature and uniform convexity and concavity. Reconstructing each subregion helps reduce the error and maintain the properties of the point cloud. At present, point cloud segmentation algorithms mainly include edge-based algorithms, face-based algorithms, and clustering algorithms.
(8) 3D Modeling of Point Cloud Data. In the process of 3D model construction, surface reconstruction is the most important and complex step. At present, there are two kinds of surface reconstruction schemes: one based on a triangular mesh surface and the other based on a spline surface.
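As a sketch of the uniform grid method mentioned in step (4) (the function name and default cell size are illustrative assumptions, not part of the authors' workflow), each occupied grid cell can be replaced by the centroid of its points:

```python
import numpy as np

def uniform_grid_reduce(points, cell_size=0.02):
    """Reduce a point cloud by keeping one centroid per occupied grid cell.

    points    : (n, 3) array of XYZ coordinates
    cell_size : grid cell edge length in the same unit as the coordinates
    """
    # Integer cell index of every point
    idx = np.floor((points - points.min(axis=0)) / cell_size).astype(np.int64)
    # One label per occupied cell
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    inverse = inverse.ravel()

    # Average all points sharing a cell (centroid per cell)
    counts = np.bincount(inverse).astype(float)
    reduced = np.empty((counts.size, 3))
    for dim in range(3):
        reduced[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return reduced
```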

3.3.2. Point Cloud Extraction Algorithm

Based on the characteristics of 3D laser scanning, a planar target point cloud automatic acquisition method based on planar cutting is designed. First, the plane target point cloud is automatically obtained through the plane cutting method, and then the plane target coordinates are automatically recognized through the k-means clustering method. Finally, the self-recognition of the monitoring point plane target is completed. The black and white targets are arranged on the tunnel surface (as shown in Figure 3), and there are many tunnel surface point clouds on the upper and lower sides of the target. To ensure the quantity and quality of the target point cloud, the area precision scanning mode is used when scanning the planar target, and the selected scanning area is slightly larger than the area where the target is located.

After obtaining the planar target point cloud, it is graded according to the reflection intensity values of the point cloud data so that the number of points in each region is equal, achieving regional segmentation of the point cloud. Finally, the target center coordinates are obtained automatically through the k-means clustering method. After the scanned data are divided into n regions, k initial clustering centers are selected and k-means clustering is performed on the point cloud of each region to obtain k cluster center points for that region; from these, the center point of the region is obtained, and the target center point coordinates are then obtained from the region centers.
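A minimal sketch of this intensity-graded k-means extraction is given below; it assumes scikit-learn's KMeans and illustrative values for the number of regions and clusters, and it is not the authors' code:

```python
import numpy as np
from sklearn.cluster import KMeans

def target_center(xyz, intensity, n_regions=4, k=3):
    """Estimate a planar-target center from its cropped point cloud.

    xyz       : (m, 3) coordinates of the planar-target point cloud
    intensity : (m,) reflection intensity of each point
    """
    # Grade the points by intensity into n regions with equal numbers of points
    order = np.argsort(intensity)
    regions = np.array_split(order, n_regions)

    region_centers = []
    for region in regions:
        # k-means on each region; the mean of its k cluster centers is the region center
        km = KMeans(n_clusters=k, n_init=10).fit(xyz[region])
        region_centers.append(km.cluster_centers_.mean(axis=0))

    # The target center is taken as the mean of the region centers
    return np.mean(region_centers, axis=0)
```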

3.3.3. Point Cloud Data Processing

Point cloud filtering is a critical step in point cloud data processing, and accurate filtering results help to improve the accuracy of the point cloud postprocessing results. In this paper, according to the principle of ordinary Kriging interpolation, the Kriging filtering algorithm is implemented as a MATLAB program, and this program processes the point cloud data. The tunnel point cloud data processing workflow, according to the actual work needs, is as follows:

(1) First, the point cloud data obtained by field scanning (as shown in Figure 7) are imported into the data processing program, and point cloud data preprocessing (filtering and denoising) is carried out to prepare for the subsequent operations.
(2) Then, after the point cloud data are filtered and denoised in preprocessing, the planar target fitting tool and the spherical target fitting tool are used in the registration process to fit the targets, and the three-dimensional coordinates of the target centers in the scanner coordinate system are extracted.
(3) After the target center coordinates are extracted, point cloud registration is carried out (as shown in Figure 8). The coordinates of the control targets measured by the total station are saved in a txt file, and the point cloud data are converted to the control coordinate system (a sketch of this coordinate conversion is given after this list). If there are many common points, the points with better quality should be selected for point cloud registration. Finally, the coordinate information of the registered point cloud data is exported to a txt file to complete the data processing.
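For the coordinate conversion in step (3), one standard way to compute the transformation from matched target centers is a least-squares rigid transform (Kabsch/SVD). The sketch below is an illustrative NumPy version under that assumption and is not the registration tool of the scanning software used in the paper:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src points onto dst.

    src, dst : (k, 3) matched target-center coordinates in the scanner and
               control coordinate systems (k >= 3, not collinear).
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against a reflection
        Vt[-1, :] *= -1.0
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def to_control_system(cloud, R, t):
    """Apply the transform to a whole (n, 3) point cloud."""
    return cloud @ R.T + t
```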

4. Data Analysis

4.1. Monitoring Information Extraction

According to the point cloud data coordinates of the tunnel surface, the tunnel section data at any pile number can be extracted, cross-section fitting can be carried out to extract the center coordinates, and the central line can then be fitted. These data can be used to analyze over-excavation and under-excavation, vault settlement, the convergence state, and axis deviation to guide construction, so that unstable and dangerous situations are discovered and reported in a timely manner to reduce the occurrence of disasters and their adverse consequences.

4.1.1. Section Extraction

In tunnel monitoring measurement, all the measurement work is completed on the basis of the cross section; therefore, the first step in applying 3D laser scanning data to the tunnel is section extraction, without which the other monitoring measurement items cannot be completed. The section can be extracted according to the design data of the tunnel. By definition, the section at a point on the tunnel axis is perpendicular to the tangent at that point; that is, the tangent vector at the point is the normal vector of the section plane. If the coordinates of the point on the central axis are $(x_0, y_0, z_0)$ and the tangent vector at the point is $(a, b, c)$, then the section equation at the point is

$$a(x - x_0) + b(y - y_0) + c(z - z_0) = 0.$$

The three-dimensional coordinates of a point on the central axis can be determined from the alignment design parameters of the tunnel. The tangent vector at the point follows from the same design data: its horizontal components are determined by the tangent slope of the plane alignment at the point, and its vertical component by the longitudinal slope of the tunnel.

Regardless of how high the scanner's resolution is, there will always be a certain interval between the measured points. Therefore, in the actual extraction of a cross section, the section is not a strict plane but a slab with a certain thickness. In practical applications, the data within a thickness of 1–2 cm are generally extracted as section data. Figures 9–11 show the tunnel section at the same station extracted by this method: Figure 9 shows the section extracted in the construction coordinate system, Figure 10 shows the section after rotating it to the front view in the construction coordinate system, and Figure 11 shows the section in an independent coordinate system with its origin at the center of the axis.
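A minimal NumPy sketch of this slab-based section extraction, using the plane equation given above and an assumed half-thickness of 1 cm (function and parameter names are illustrative), is as follows:

```python
import numpy as np

def extract_section(cloud, axis_point, tangent, half_thickness=0.01):
    """Return the points lying within +/- half_thickness of the section plane.

    cloud      : (n, 3) registered tunnel point cloud
    axis_point : (3,) point on the tunnel central axis
    tangent    : (3,) tangent vector of the axis at that point (plane normal)
    """
    normal = np.asarray(tangent, dtype=float)
    normal /= np.linalg.norm(normal)
    # Signed distance of every point to the plane a(x-x0)+b(y-y0)+c(z-z0)=0
    dist = (cloud - np.asarray(axis_point, dtype=float)) @ normal
    return cloud[np.abs(dist) <= half_thickness]
```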

4.1.2. Deformation Information Extraction

According to the industry standard, the peripheral displacement and vault settlement are compulsory items in tunnel monitoring, and one section needs to be measured every 5–50 m. The data scanned by a 3D laser scanner are continuous and comprehensive, so vault settlement and convergence can be monitored on any section. In traditional monitoring measurement, however, the convergence of the tunnel periphery is generally measured with a tunnel clearance change meter (referred to as the convergence meter). Both vault settlement and convergence are monitored continuously at fixed points on the cross section, and the convergence measurement points are arranged in the same section as the vault settlement points. With 3D laser scanning, there is no need to embed measuring parts as in the traditional method. According to the tunnel conditions, 1–3 points are generally selected for vault settlement monitoring, and 2–3 pairs of points are generally selected for convergence monitoring.

When extracting monitoring point data from the section, the usual approach is to select the area data centered on the monitoring point and then take the distance-weighted average of the points in that area as the monitoring data. If the design coordinates of the selected monitoring point are $(x_0, y_0, z_0)$ and the coordinates of the points in the selected area are $(x_i, y_i, z_i)$, $i = 1, 2, \ldots, m$, then the measured coordinates of the monitoring point are

$$(x, y, z) = \frac{\sum_{i=1}^{m} w_i\,(x_i, y_i, z_i)}{\sum_{i=1}^{m} w_i},$$

where the weight $w_i$ decreases with the distance $d_i = \sqrt{(x_i - x_0)^2 + (y_i - y_0)^2 + (z_i - z_0)^2}$ between the area point and the design position (e.g., $w_i = 1/d_i$).
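Assuming the inverse-distance form of the weighted average given above, the monitoring-point extraction could be sketched as follows (the search radius, the small eps guard, and the function name are illustrative assumptions):

```python
import numpy as np

def monitoring_point(section_points, design_xyz, radius=0.05, eps=1e-6):
    """Inverse-distance weighted estimate of a monitoring point's coordinates.

    section_points : (m, 3) points of the extracted cross section
    design_xyz     : (3,) design coordinates of the monitoring point
    radius         : search radius around the design position (metres)
    """
    d = np.linalg.norm(section_points - np.asarray(design_xyz, dtype=float), axis=1)
    sel = d <= radius
    w = 1.0 / (d[sel] + eps)                              # inverse-distance weights
    return (w[:, None] * section_points[sel]).sum(axis=0) / w.sum()
```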

After obtaining the monitoring data of each measuring point, the time variation curve can be drawn based on the first phase data to reflect each section’s convergence and settlement, and the spatial variation curve can also be drawn to reflect the overall change in the tunnel.
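For illustration, once the monitoring-point coordinates of the initial epoch and a later epoch are available, the vault settlement and the peripheral convergence can be computed as below; the sign conventions and function names are our own assumptions:

```python
import numpy as np

def vault_settlement(crown_initial, crown_current):
    """Settlement = decrease of the crown point's Z coordinate (positive downwards)."""
    return crown_initial[2] - crown_current[2]

def convergence(pair_initial, pair_current):
    """Convergence = shortening of the chord between a pair of measuring points."""
    l0 = np.linalg.norm(np.asarray(pair_initial[0]) - np.asarray(pair_initial[1]))
    lt = np.linalg.norm(np.asarray(pair_current[0]) - np.asarray(pair_current[1]))
    return l0 - lt
```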

4.2. Data Analysis

On the basis of the tunnel coordinate file, data mining and analysis are carried out. As described above, the tunnel section data at any pile number are extracted from the point cloud coordinates of the tunnel surface, cross-section fitting is carried out to extract the center coordinates, and the central line is then fitted. This paper selects three representative sections of the tunnel project for comparative analysis. The final cumulative deformation comparison of the three sections over the monitoring period is shown in Table 2, and the displacement convergence diagrams of K0 + 202, K0 + 207, and K0 + 212 are shown in Figures 12–14.

From the comparison results in Table 3, it can be seen that due to the massive and relatively uniform monitoring data obtained by 3D laser scanning, it is very advantageous for the IDW and Kriging methods, and the results obtained are relatively similar. The filtering result error of the TIN method is relatively large. Overall, the improved Kriging algorithm is closer to the true value, has higher accuracy, and is more suitable for situations with more data compared to the IDW method. This also indicates that the method is effective.

Table 2 shows that the cumulative deformation measured by three-dimensional laser scanning is relatively small, the settlement data of K0 + 207 and K0 + 212 are close to the traditional measurement results, and the peripheral convergence data are quite different. Combined with the field situation and the analysis of equipment accuracy, the tunnel’s environmental factors will significantly affect the measurement accuracy. For vault settlement, the scanner only needs to extract the Z-axis coordinate point data to obtain the settlement result, which is more stable than the horizontal coordinate conversion; for horizontal convergence, the scanner must extract the coordinate point data of the X-, Y-, and Z-axes to effectively obtain the convergence deformation result, which causes a large error. In addition to data extraction, the tunnel’s environmental factors also cause great uncertainty in the monitoring results and then cause error accumulation. The experimental data also show that the anti-interference ability and accuracy of 3D laser scanning technology need to be further optimized and improved in tunnel deformation monitoring. In a specific monitoring environment, the cumulative deformation measured by 3D laser scanning technology can better reflect the real situation of tunnel deformation to a certain extent.

At the same time, it also shows that the improved Kriging algorithm has a greater impact on the accuracy of tunnel settlement data and is more suitable for situations with more data, which can provide theoretical support for 3D laser scanning technology to solve the problem of on-site deformation monitoring. In conclusion, the tunnel 3D laser scanning automatic monitoring measurement method using tunnel point cloud plane cutting and k-means target self-identification has further improved the automation and intelligence level of tunnel deformation monitoring on the basis of ensuring the same level of monitoring accuracy as the existing conventional monitoring measurement methods.

5. Conclusions

In this paper, the Kriging-based filtering algorithm is used to process 3D laser scanning point cloud data. Compared with traditional point cloud filtering methods, it can effectively identify and extract the visual data of the tunnel profile section, which provides a solution for the wide application of 3D laser scanning technology in the field of tunnel monitoring. The experimental results show the following:

(1) Traditional Kriging spatial interpolation is extended and applied to point cloud grid filtering analysis. The calculation results show that the algorithm has high accuracy in the fast and automatic generation of DEMs and improves the accuracy of 3D laser scanning data acquisition. However, improving the filtering efficiency for massive point cloud data, enhancing the degree of automation of the filtering algorithm, and controlling errors will be the focus of future research on point cloud filtering.

(2) Through the field monitoring test, the point cloud data are extracted and analyzed, and the deformation data obtained from the test are compared with the traditional measurement data. The results show that 3D laser scanning technology based on the Kriging filtering algorithm can obtain tunnel deformation more efficiently and accurately.

(3) The anti-interference ability and accuracy of 3D laser scanning technology in tunnel deformation monitoring still need to be optimized and improved. Under specific monitoring environmental conditions, the vault settlement, peripheral convergence, and axis deviation can be analyzed, and abnormal stability and dangerous situations of the surrounding rock can be warned of in a timely manner to prevent geological disasters and ensure the safety of tunnel construction.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors would like to acknowledge the financial support provided by the National Natural Science Foundation of China (grant no. 51678226), the Natural Science Foundation of Hunan Province (grant nos. 2023JJ30110, 2021JJ50147, and 2021JJ30078), the Science and Technology Innovation Project of Yiyang City (grant nos. 2019YR02 and 2020YR02), and the Open Research Foundation of Hunan Provincial Key Laboratory of Key Technology on Hydropower Development (grant no. PKLHD202005).

Supplementary Materials

The supplementary materials mainly contain part of the Kriging algorithm code. The main calculation process first initializes the semi-variogram and its parameters, then adjusts the input data for the subsequent optimization, then initializes and solves for the default parameters, and finally assembles the Kriging matrix and performs the interpolation solution. (Supplementary Materials)