3D imaging technology: improvements and applications in architectural monitoring

Abstract: Laser scanners are increasingly being employed as surveying instruments for numerous applications. In this paper we present a system for monitoring dangerous parts of archaeological sites or buildings. So that the fragile parts of an archaeological site or building can be protected without human intervention, the system performs a 3D scan of the building in real time and collects the data. The system stores the collected data and monitors the building by comparing and analyzing data captured at different times. To reduce the error generated in the coordinate-system transformation process, we established a "flattened" model to optimize 3D imaging and ensure that the final image does not exhibit distortion. Simulations with different data show that our "flattened" model produces good results in 3D imaging.


Introduction
With the continuous evolution of information science and technology, theories such as three-dimensional (3D) simulation, physical reconstruction, and virtual reality have been proposed, and people's understanding has shifted from flat two-dimensional (2D) spaces to 3D spaces. 3D laser scanners that employ 3D laser scanning technology, also known as "real copy technology", can solve many problems owing to advantages such as noncontact operation, high scanning speed, large information capacity, high precision, real-time use, and fully automated measurement in complex environments. This technology helps to overcome the limitations of traditional measurement instruments and has become a crucial means of direct access to precise 3D data of a target. 3D visualization is the next technological revolution in the mapping field after global positioning system (GPS) technology.
3D laser scanning technology has unique advantages over traditional single-point measurement methods, such as high efficiency and precision. This technology can also provide 3D point cloud data of a scanning surface and can be used to obtain a high-accuracy, high-resolution digital terrain model [1,3,6,9,11-13,15]. 3D laser scanning technology integrates optical, mechanical, electronic, and other technologies, combining traditional mapping and measurement techniques with sophisticated sensors and a variety of modern high-tech means into a single, unified surveying approach.
Archaeological and architectural structures often undergo partial or total collapse due to a lack of maintenance and control. In particular, subsidence is caused by the degradation of mortar due to weather conditions or seismic activity, and often occurs through progressive movements that become excessive, thereby causing collapse [6,7,10,17]. Although topographic or electronic monitoring systems could prevent such events, their implementation is virtually impossible for many reasons, the most crucial of which are the invasiveness and aesthetic disturbance that usually accompany the installation of instrumentation, along with the high cost of installation and maintenance.
We propose low-cost laser instrumentation that, once positioned, can ensure that the visible parts of an archaeological site or building are protected without human intervention. Thanks to software built on concepts derived from advanced tools of computational mechanics, the system executes at every scan a complete engineering analysis with an interpretation of the acquired data, sending standard alarm messages across the network based on certain parameters. Such a system should be tested during the course of the research project at sites of special interest. The installation of targets or other equipment on the property is not required [2,18,21].
The remainder of this paper is organized as follows. Section 2 presents the design objectives. Section 3 explains that the core of the system is to obtain descriptions of the 3D information space. Section 4 presents perspectives regarding further research.

Design principles
3D laser scanning technology has a wide range of applications in surveying and mapping. Combined with inertial navigation systems, GPS, charge-coupled devices, and other technologies, it provides real-time access to high-precision digital elevation models and geographic information for 3D reconstructions of cities and local areas, and it is a key component of photogrammetry and remote sensing. Examples of successful applications in engineering, environmental testing, and other aspects of urban development include 3D mapping sections, large-scale topographic maps, hazard assessment, 3D city models, complex building construction, deformation monitoring, and the construction of other large buildings. We propose low-cost laser instrumentation that, once positioned, can ensure the visibility of the parts of archaeological sites or buildings to be protected without human intervention. According to the design goals, the 3D model has its own special requirements, which are described as follows.
1. Human intervention is minimized.
2. A simple and cost-effective modeling method is required. Because virtual reality has widespread application prospects, low-cost modeling systems should be developed to facilitate broad adoption. The modeling system should be simple, fast, and effective.
3. To guarantee the real-time speed of the system, the number of patches composing the object model should be kept small. The focus is on monitoring the building as a system; we ignore detailed scanning measurements of the building itself. Accuracy can be increased within a specified area of the geometric model.
According to the aforementioned requirements and analysis, our prototype system has the following characteristics.
1. The data acquisition principle is phase comparison; thus, the accuracy of the data collected by the system is guaranteed.
2. Quick and easy operation: setting up the whole system is simple and can be completed in minutes. After setup is complete, the system automatically obtains and stores information regarding the target area.
3. Relatively low-cost hardware is selected.
We list a set of basic assumptions that a 3D scanning scheme must satisfy. To develop a 3D laser-ranging module, a suitable measurement environment must be designed. To convert a 2D scanning laser rangefinder into a 3D scanning rangefinder, the simplest method is to combine the 2D scanning laser rangefinder with a rotating platform. We independently developed the software to satisfy the requirements of the entire scanning system. Single-target or multi-target 3D scans can be timed, and the scan results can be compared.
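As an illustration of the 2D-to-3D conversion described above, the following sketch combines each 2D rangefinder sweep with the platform's rotation angle to produce 3D points. The angle conventions and function name are assumptions for this sketch, not the paper's actual firmware:

```python
import math

def scan_to_3d(scan_lines):
    """Combine 2D rangefinder sweeps with a rotating platform into 3D points.

    scan_lines: iterable of (alpha, sweep) pairs, where alpha is the
    platform's horizontal rotation (radians) and sweep is a list of
    (gamma, rho) pairs: the beam's vertical angle and measured range.
    The conventions (alpha measured from the Y axis toward X, gamma from
    Y toward Z) are assumptions for this sketch.
    """
    points = []
    for alpha, sweep in scan_lines:
        for gamma, rho in sweep:
            # Spherical-to-Cartesian conversion with Y as the zero direction.
            x = rho * math.cos(gamma) * math.sin(alpha)
            y = rho * math.cos(gamma) * math.cos(alpha)
            z = rho * math.sin(gamma)
            points.append((x, y, z))
    return points
```

With both angles zero, a range of 5 m lands on the Y axis at (0, 5, 0), which matches the laser-line convention used later in the coordinate-system derivation.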

Coordinate system
This section presents a block-diagram expression of the whole model of the 3D scanning coordinate transformation. Evaluating the 3D coordinates of the object is the focus of the monitoring system as well as of the present study. The key point is that the modeling system involves many coordinate systems [4,5,8].
Currently, the core objective of 3D scanning is to convert and optimize coordinate systems. This subject has been extensively explored and is still under investigation through methodological aspects for concrete applications. It involves the reverse process of 3D graphics display across multiple coordinate systems, including the image coordinate system, the camera coordinate system, and the world coordinate system. Transformations between these coordinate systems affect the 3D image [14-16,19,20]. Each step of the coordinate transformation process is shown in Figure 1.
Through the processes in Figure 1, we transform the screen coordinate system into the world coordinate system. Owing to the selected scanning method and the specific characteristics of each coordinate system, the conversion process requires no complex adaptive equipment and offers high conversion speed and flexible operation. The positioning process is the transformation between the coordinate systems; thus, we need a clear definition of each coordinate system used. Figure 2 shows a schematic of the coordinate system. In the camera coordinate system, we define the horizontal angle and vertical angle as α and −γ, respectively. Therefore, when α = γ = 0, the coordinate system is O−XYZ. A new coordinate system is obtained from O−XYZ by shifting; we call it the target object coordinate system and refer to it as o−xyz. Regardless of the coordinate system, we always define the Y axis as the laser line; thus, point P always lies on the Y axis of o−xyz.
When scanning an object, we define P as a point on the object. The point P has coordinates (0, ρ, 0), where ρ is the distance of P from the laser. The relationship between the world coordinate system and the camera coordinate system is thereby simplified [17]. The conversion relationship is shown in Figure 2. In Figure 3, the coordinates of a point in the target object coordinate system are expressed as Equation (1).
We need to transform the coordinates from the target object coordinates o − xyz to the camera coordinates O − XYZ. To guarantee the accuracy of the conversion, we set the vector ξ.
In the target object coordinate system o−xyz and the camera coordinate system O−XYZ, we set (e1, e2, e3) as the Cartesian basis and (ε1, ε2, ε3) as the basis after rotation. First, we consider rotations about the X and Z axes, as shown in the schematic in Figure 4, and express the rotated coordinates accordingly. By substituting into the vector ξ, we can obtain (e1, e2, e3) and thereby determine point P in the camera coordinate system O−XYZ.
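One consistent reading of this derivation, assuming the horizontal angle α rotates about the Z axis and the vertical angle γ about the X axis (an assumption on our part, since the original typeset formulas are not reproduced here), is:

```latex
P_{O\text{-}XYZ}
= R_z(-\alpha)\, R_x(\gamma)
\begin{pmatrix} 0 \\ \rho \\ 0 \end{pmatrix}
= \begin{pmatrix}
\rho \cos\gamma \sin\alpha \\
\rho \cos\gamma \cos\alpha \\
\rho \sin\gamma
\end{pmatrix}
```

Setting α = γ = 0 recovers P = (0, ρ, 0) on the Y axis, matching the definition of the laser line as the Y axis of o−xyz.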

Construction of scanning model
The transformation between the two coordinate systems builds on the derivation above and on the operation of the instrument; the formulas and the instrument operation must be bound together to obtain the coordinate-system conversion formula. Figure 5 shows a schematic of the instrumentation and camera coordinate system [15]. α is the horizontal angle: the horizontal rotation of the laser from the Y axis toward the X axis. We obtain the value of α from the program; it is the known variable TA.
β is the rotation about the Y axis. By the nature of the instrument, no rotation about the Y axis is produced, and thus β = 0. γ is the vertical angle: the vertical rotation of the laser from the Y axis toward the Z axis. We obtain the value of γ from the program; it is the known variable −PA.
ρ is the distance between point P and the origin of the camera coordinate system. It is calculated by the internal program and is a known variable.
In our case, we can obtain the distance to P from the machine, and thus P has coordinates (x, y, z). Subsequently, we improve the transformation between the camera coordinate system and the world coordinate system. When the instrument is in its initial state, the angles α, β, and γ are all 0; in this state, the coordinate system is the world coordinate system. These derivations assume that the origin of the camera coordinate system and the origin of the world coordinate system coincide. In reality, however, the origin of the camera coordinate system moves while the instrument scans, which generates an error that affects the final results. Therefore, we need to calibrate the system (the calibration project) and then run the program, in which we set the motion vector (u, v, w) as the correction of the system. The next section introduces the specific methods of the calibration project. We also summarize the scanning method used to validate the calibration: we first obtain multiple groups of data, then correct them using the correction system, and finally compare the data to confirm the effectiveness of the correction.
The preceding coordinate transformation theory can be applied to an existing model; the transformation formula for point P is abbreviated as equation (4). In equation (4), (X*, Y*, Z*) are the data obtained after conversion of the actual measurement results, (X, Y, Z) are the required theoretical conversion values, and (u, v, w) is the correction value of the system.
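Since the typeset formula for equation (4) is not reproduced here, one componentwise reading consistent with the surrounding prose (our reconstruction, not the paper's typesetting) is:

```latex
(X, Y, Z) \;=\; (X^{*}, Y^{*}, Z^{*}) + (u, v, w),
\qquad\text{i.e.}\qquad
X = X^{*} + u,\quad Y = Y^{*} + v,\quad Z = Z^{*} + w.
```

That is, the theoretical world coordinates are the measured, converted coordinates shifted by the system correction vector.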
(u, v, w) consists of two parts: one is the self-generated system error, and the other is the error generated when the camera, namely the laser emission point, moves. Different angles are used so that the receiving point remains uninterrupted.

Calibration procedure
This section presents experimental data demonstrating that the correction system is a crucial aspect of the 3D scanning monitoring system [19]. Before comparing the experimental data, we first introduce the calibration process and formula deduction. The principle of the system calibration procedure is as follows. First, we set up a plane perpendicular to the horizontal plane and scan sample points on that plane. Because these sampling points are located on the same plane, their world coordinates (X, Y, Z) should, in theory, share the same value of Y. We solve for the correction of each point to ensure that all points lie on the same plane.
We now analyze the calibration procedure. N > 7 points on a plane are scanned, and each point must satisfy the (unknown) plane equation
aX + bY + cZ − d = 0. (5)

With (X*, Y*, Z*) given by equation (4), each scanned point must satisfy a(X* + h) + b(Y* + i) + c(Z* + j) − d = 0. The N equations formed in this way constitute a nonlinear system in the unknowns a, b, c, d, h, i, j, which can be solved using the least squares method with iteration, where each step solves a linear system for the solution increment in the least-squares sense. Reorganizing equation (5) and dividing both sides by a, we reset the variables as µ0, µ1, µ2, and φ to obtain equation (6). According to equation (6), the values of µ0, µ1, µ2, and φ are calculated using the least squares method, from which the values of a, b, c, and d are obtained. We can then use equation (7) and apply the least squares method again to calculate h, i, and j. Details of the relevant procedures are provided in the appendix. Once the solution has been determined, the constants h, i, and j are the instrument calibration constants, and the coordinates of a scanned point are given by the equation of the world coordinate system, in which the angles are provided by the machine settings.
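The inner linear step of the scheme above, fitting a plane to the scanned points by least squares, can be sketched in pure Python. This is an illustrative sketch, not the paper's appendix code: we assume the scanned wall can be modeled as Y = c0 + c1·X + c2·Z, so the normal equations form a 3×3 linear system.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_plane(points):
    """Fit Y = c0 + c1*X + c2*Z to (X, Y, Z) samples by least squares."""
    # Build the normal equations A^T A c = A^T y with rows (1, X, Z).
    ata = [[0.0] * 3 for _ in range(3)]
    aty = [0.0] * 3
    for X, Y, Z in points:
        row = (1.0, X, Z)
        for i in range(3):
            aty[i] += row[i] * Y
            for j in range(3):
                ata[i][j] += row[i] * row[j]
    c = solve3(ata, aty)
    # Residuals measure how far each point lies from the fitted plane.
    residuals = [Y - (c[0] + c[1] * X + c[2] * Z) for X, Y, Z in points]
    return c, residuals
```

For points lying exactly on a vertical plate at Y = 7 m, the fit returns c ≈ (7, 0, 0) with zero residuals; nonzero residuals indicate the per-point deviations that the calibration constants must absorb.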
We used different data to verify the practicality of the correction system. We ran six sets of experiments (at 1 m, 3 m, 6 m, 9 m, and 12 m), obtaining two sets of data for each experiment: one using the correction system and one without it. Our target was to determine the error between the calculated and theoretical data, so each experiment was repeated 50 times. The experimental results are shown in Figure 6, in which the red curve shows the data obtained without the correction system and the blue curve the data obtained with it. With the correction system, the data are closer to our target.

We then consider one set of experimental results for comparison: a first data set without correction and a second that has been corrected. Because the data obtained by the scanning instrument are expressed in meters, we ensure that the data are accurate to three decimal places. Based on these data, the instrument scanned a flat plate at 7 m; imaging and analysis were then conducted. The results are shown in Figure 7. Even if the preferred interpretation is inaccurate, the data strongly suggest that the calibration procedure is crucial. We further analyzed the results shown in Figure 7; the analysis results are shown in Table 1.

Table 1. Analysis of the results in Figure 7.
                                       Uncorrected   Corrected
Minimum distance (m)                   6.852         6.872
Average distance (m)                   6.965         6.957
Variance from the standard distance    0.0057        0.0023

The experimental results show that the corrected data led to better results than the uncorrected data. Comparing the two sets, the difference between the maximum and minimum values was 0.222 m for the first (uncorrected) set and 0.129 m for the second (corrected) set. We determined the optimal number of calculations and used suitable correction data.
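The summary statistics in Table 1 (minimum, mean, and variance, plus the max–min spread quoted in the text) can be reproduced from raw distance samples with a short helper. This is a sketch; the sample list in the usage note is hypothetical, not the paper's measurements.

```python
def distance_stats(samples):
    """Summary statistics used to compare corrected vs. uncorrected scans."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    return {
        "min": min(samples),                              # closest reading
        "max": max(samples),                              # farthest reading
        "mean": round(mean, 3),                           # average distance
        "variance": round(variance, 4),                   # spread about the mean
        "spread": round(max(samples) - min(samples), 3),  # max-min difference
    }
```

For example, `distance_stats([6.9, 7.0, 7.1])` reports a mean of 7.0 and a spread of 0.2, matching the kind of comparison performed on the 7 m plate data.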
In addition, the error in the correction process arises mainly from two aspects: first, the instrument returns scan data with a certain degree of error; second, the accuracy of the data conversion formula also affects the data. Only by minimizing the correction error can we ensure the optimum effectiveness of 3D imaging.

Conclusion
We conclude by summarizing this work and discussing future research. Based on the theme of 3D scanning and monitoring, we explored the system requirements and the design issues of the coordinate system, and we determined and implemented a feasible plan. We developed a low-cost laser instrumentation system that, once positioned, can ensure the visibility of the parts of archaeological sites or buildings to be protected. The system requires no human intervention. Thanks to software built on concepts derived from advanced tools of computational mechanics, the system executes every scan within a given period based on a complete engineering analysis with an interpretation of the acquired data.
The experimental results may not be completely satisfactory, but the system admits a wide range of applications. Better imaging and data analysis methods could contribute to higher instrument accuracy. The following aspects could be improved. 1) Choice of coordinate system perspective: based on the experimental results, a suitable viewing angle helps to obtain the desired results easily. Thus, if we can control the viewing angle artificially, the performance of the instrument will be further enhanced.
2) Coordinate scale improvements: the unit length of the coordinate system is 1 m. Although the instrument can detect changes of 1 cm, such changes cannot easily be located in the observation coordinate system. Selecting the most appropriate coordinate ratio affects the monitoring results.

Conflict of interest
The authors declare no conflict of interest.