A HIGH-ACCURACY METHOD TO MAKE MEASUREMENTS ALONG A LINE USING COMPUTER VISION AND LASER DISTANCE SENSORS

This paper presents an efficient method to measure the real-world distance between interest points of a 3-D object using a camera, two laser distance sensors and Image Processing techniques. The method consists of calculating the distance between interest points of an object in the image plane. Two laser distance sensors placed next to the camera accurately measure the distance between the camera and the object. Moreover, the laser dots generated by the distance sensors are visible in the image, acting as a reference along with the distance measurements in the calculations. As a result, the interest points' coordinates on the camera 3-D space are obtained and converted into real-world distances between interest points. Through a set of experiments, we show that the proposed system measures distances up to 70 mm with sub-millimeter accuracy. The results suggest a potential use of our method in many other Computer Vision applications, including industrial applications.


INTRODUCTION
Three-dimensional (3-D) reconstruction is one of the most interesting fields of Computer Vision. Applications of 3-D reconstruction can be seen in many areas, such as autonomous navigation (Básaca-Preciado, 2013), reverse engineering (Tao, 2007), object recognition (Gomes, 2013) and object digitalization (Zhen-Yu, 2012). Developments in this field (Su, 2010) have allowed the construction of sophisticated measurement systems (Li, 2012). Frank et al. (2000) emphasize that 3-D shape measurement using optical methods has evolved significantly during the last decades.
General 3-D shape measurement systems can be classified into active and passive systems (Avilagh, 2013) (Muquit, 2006). Active systems employ structured illumination (structured-light projection, phase shift, moiré topography, etc.) or laser scanning and tend to be very expensive. On the other hand, passive 3-D measurement systems based on stereo vision have the advantages of simplicity and applicability, since such techniques require simple instrumentation. The main problem of passive systems is poor reconstruction quality.
In industry, measurement systems can be highly specialized, since there are constraints regarding the measured shape, the measurement range, etc. Many of the off-the-shelf non-contact industrial measurement systems are based on Image Processing techniques and rely on the fact that the object is at a known position, for example, on a conveyor belt (Song, 2015). However, when this condition is not met, general 3-D shape measurement systems are more suitable.
The objective of this paper is to present a high-accuracy method to measure the distance between interest points along a line on the surface of a 3-D object. The interest points are obtained using Image Processing techniques. Depth information is obtained with the use of laser distance sensors, and subsequent 3-D reconstruction of the interest points allows measurements in real-world units, such as millimeters.

Method overview
Our method comprises a camera, two laser distance sensors and a computer; its objective is to measure the length L along a line on the object surface, as shown in Figure 1. The method proceeds in the following steps:

I. An image of the object is captured, and the two laser distance sensors measure the distance between the camera and the object.
II. The laser dot coordinates on the camera 3-D space are calculated using the measured distances and the laser beam mapping on the camera 3-D space (Figure 3).
III. The laser dot coordinates on the image plane (x, y) are calculated using Perspective Transformation (Figure 4).
IV. Interest points are obtained along the line formed by the two laser dots on the image plane (Figure 5).
V. The interest point's Z coordinate on the camera 3-D space is obtained by interpolation (Figure 6).
VI. The interest point's coordinates on the camera 3-D space are calculated using Inverse Perspective Transformation (Figure 7).
VII. The length L between interest points is calculated in real-world units, such as millimeters.

Steps II to VII are described in detail in sections 2.4 to 2.8.

Camera calibration
Camera calibration is a fundamental step for many 3-D reconstruction methods. The camera calibration determines the camera matrix and the distortion coefficients. These allow removing distortion from an image, as well as performing Perspective Transformation calculations.
In this work, pictures of a chessboard pattern in different poses were used in order to perform the camera calibration. The source code with the calibration algorithm, based on (Zhang, 2000), is available on the OpenCV website. Prior to any of the calculations mentioned in the next sections, the image's distortion was removed.
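The calibration step can be sketched with OpenCV's standard chessboard routines. The board size, square size and file names below are illustrative assumptions, not values taken from this work.

```python
import glob

import numpy as np


def board_object_points(cols, rows, square_mm):
    """3-D corner coordinates in the chessboard's own frame (Z' = 0)."""
    objp = np.zeros((cols * rows, 3), np.float32)
    objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_mm
    return objp


def calibrate(image_glob, cols=9, rows=6, square_mm=25.0):
    """Estimate the camera matrix and distortion coefficients from chessboard images."""
    import cv2  # OpenCV is only needed for the actual calibration

    objp = board_object_points(cols, rows, square_mm)
    obj_pts, img_pts, size = [], [], None
    for name in sorted(glob.glob(image_glob)):
        gray = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, (cols, rows))
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
            size = gray.shape[::-1]
    _, M, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return M, dist

# Usage (file names are illustrative):
#   M, dist = calibrate("calib_*.png")
#   undistorted = cv2.undistort(cv2.imread("scene.png"), M, dist)
```

All subsequent calculations assume the undistorted image, as stated above.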

Laser beam mapping on the camera 3-D space
The method proposed in this work uses information about the laser beam mapping on the camera 3-D space. The laser beam mapping includes the laser distance sensor's coordinates, as well as the laser beam equations on the camera 3-D space, and must be obtained for each of the laser distance sensors.
In order to obtain the laser beam mapping of a laser distance sensor, it is necessary to take a set of pictures containing both a chessboard pattern and the laser dot from the laser distance sensor, and to annotate the distance measured by the sensor in each picture. Each picture is taken with the chessboard pattern at a different distance from the camera.
The presence of the chessboard pattern allows the calculation of the rotation and translation matrices, R and T, which relate the chessboard 3-D space to the camera 3-D space (see Fig. 9). As the laser dot lies on the chessboard plane, the Z′ coordinate of the laser dot on the chessboard 3-D space is zero. The 3-D coordinate of the laser dot on the chessboard 3-D space is therefore (X′, Y′, 0), and its coordinate (X, Y, Z) on the camera 3-D space is given by

[X, Y, Z]^T = R [X′, Y′, 0]^T + T

Fitting a line through the laser dot coordinates obtained from the set of pictures yields the laser beam line equations on the camera 3-D space:

X = x1 + a·t (7)
Y = y1 + b·t (8)
Z = z1 + c·t (9)

where x1, y1, z1, a, b and c are constants.
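Assuming the beam line is fitted by least squares through the laser dots collected from all pictures (the fitting procedure is not spelled out here), the mapping step can be sketched as follows; the function names are illustrative.

```python
import numpy as np


def chessboard_to_camera(R, T, xp, yp):
    """Map a laser dot (X', Y', 0) on the chessboard plane into the camera frame."""
    return R @ np.array([xp, yp, 0.0]) + T


def fit_beam_line(points):
    """Least-squares line fit through the laser dots from all pictures.

    Returns a point (x1, y1, z1) on the line and a direction (a, b, c),
    i.e. the constants of equations (7)-(9): X = x1 + a*t, etc.
    """
    pts = np.asarray(points, float)
    p0 = pts.mean(axis=0)
    # Principal direction of the centered points (first right-singular vector)
    _, _, vt = np.linalg.svd(pts - p0)
    return p0, vt[0]


# Synthetic check: dots sampled along a known beam recover its direction
true_dir = np.array([0.1, -0.05, 1.0]) / np.linalg.norm([0.1, -0.05, 1.0])
dots = [np.array([30.0, 20.0, 400.0]) + t * true_dir for t in (0, 50, 100, 150)]
p0, d = fit_beam_line(dots)
assert np.allclose(abs(d @ true_dir), 1.0)
```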
Based on the laser beam line equations (7), (8) and (9) and the measurement d from the laser distance sensor, it is possible to estimate the "origin" of each laser beam, which is the sensor's coordinate (Xs, Ys, Zs) on the camera 3-D space. Given a laser dot coordinate (X, Y, Z) on the camera 3-D space and the measurement d, t is calculated by

t = d / √(a² + b² + c²) (10)

Substituting t into equations (11), (12) and (13):

Xs = X − a·t (11)
Ys = Y − b·t (12)
Zs = Z − c·t (13)

Notice that the constants a, b and c in equations (11), (12) and (13) have the same values as those in equations (7), (8) and (9). In order to determine (Xs, Ys, Zs) with better accuracy, the calculation can be repeated for each picture and the results averaged.
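A minimal sketch of equations (10)-(13), with the direction (a, b, c) taken from the fitted beam line; the function names and the averaging helper are illustrative.

```python
import numpy as np


def beam_origin(dot, direction, distance):
    """Sensor coordinate (Xs, Ys, Zs) from one laser dot, equations (10)-(13).

    t = d / |(a, b, c)|, then Xs = X - a*t, Ys = Y - b*t, Zs = Z - c*t.
    """
    t = distance / np.linalg.norm(direction)
    return np.asarray(dot, float) - t * np.asarray(direction, float)


def beam_origin_averaged(dots, direction, distances):
    """Repeat the estimate for every picture and average, for better accuracy."""
    return np.mean([beam_origin(p, direction, d)
                    for p, d in zip(dots, distances)], axis=0)
```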

Calculation of the distance sensor's laser dot coordinates on camera 3-D space
Given the laser beam mapping on the camera 3-D space, it is possible to calculate the laser dot 3-D coordinate (X, Y, Z) on the camera 3-D space from the reading of the distance sensor. We assume that the right solution has positive Z, that is, the laser dots are ahead of the camera. Given the origin of a laser beam on the camera 3-D space (Xs, Ys, Zs), its line equations (7), (8) and (9) can be re-written in the following way:

X = Xs + a·t (14)
Y = Ys + b·t (15)
Z = Zs + c·t (16)

where Xs, Ys, Zs, a, b and c are constants.
Using the reading from the distance sensor, we can find the positive t, which gives a positive Z, using equation (10).
In order to obtain the coordinate (X, Y, Z) of the laser dot on the camera 3-D space, it is necessary to substitute the t value into the laser beam line equations (14), (15) and (16). Fig. 3 illustrates the laser dot coordinates of the right sensor and the left sensor on the camera 3-D space as (X1, Y1, Z1) and (X2, Y2, Z2).
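The substitution above, with the sign of t chosen so that the dot lies ahead of the camera, can be sketched as (the function name is illustrative):

```python
import numpy as np


def laser_dot_camera(origin, direction, distance):
    """Laser dot (X, Y, Z) on the camera 3-D space from the sensor reading.

    Substitutes t = d / |(a, b, c)| into X = Xs + a*t, Y = Ys + b*t,
    Z = Zs + c*t, choosing the sign of t that gives a positive Z
    (the dot lies ahead of the camera).
    """
    origin = np.asarray(origin, float)
    direction = np.asarray(direction, float)
    t = distance / np.linalg.norm(direction)
    dot = origin + t * direction
    if dot[2] < 0:
        dot = origin - t * direction
    return dot
```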

Calculation of the sensor's laser dot coordinate on the image plane
The laser dot coordinate on the image plane (x, y) can be calculated using the Perspective Transformation equation

s [x, y, 1]^T = M [X, Y, Z]^T

Given that s = Z, then

x = fx·X/Z + cx
y = fy·Y/Z + cy

where M is the Camera Matrix, fx, fy, cx and cy are its focal lengths and principal point coordinates, and (X, Y, Z) is the laser dot coordinate on the camera 3-D space, calculated in the previous sub-section. Fig. 4 illustrates the laser dot coordinates of the right sensor and the left sensor on the image plane, namely (x1, y1) and (x2, y2).
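The projection can be sketched as below; the camera matrix values (fx, fy, cx, cy) are assumed for illustration, not the calibration results of this work.

```python
import numpy as np


def project(M, point):
    """Perspective Transformation: s*(x, y, 1)^T = M*(X, Y, Z)^T with s = Z."""
    p = M @ np.asarray(point, float)
    return p[:2] / p[2]


# Illustrative camera matrix (fx = fy = 1000, cx = 1024, cy = 542)
M = np.array([[1000.0,    0.0, 1024.0],
              [   0.0, 1000.0,  542.0],
              [   0.0,    0.0,    1.0]])
x, y = project(M, (50.0, -20.0, 500.0))  # laser dot in camera 3-D space (mm)
```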

Detection of interest points along the line formed by the laser dots' image points
An interest point is a coordinate on the image plane, along the line that links the two laser dots, that is interesting for any reason (see Fig. 5). It could be, for example, a point on the border of an object, which is useful to measure the object. The interest points can be detected along this line manually or automatically. In an industrial application, such as the one presented in section 4, the interest points are detected using Image Processing techniques.

Calculation of the depth of image points on the line formed by the laser dots
The coordinates of the laser dots on the camera 3-D space, (X1, Y1, Z1) and (X2, Y2, Z2), can be calculated as shown in sub-section 2.4. The depth of any interest point (x, y) along the line formed by the laser dots can then be obtained by linear interpolation, since the depth (Z coordinate on the camera 3-D space) of both laser dots, Z1 and Z2, as well as their image points (x1, y1) and (x2, y2), are known (see Fig. 6). The Z coordinate of an image point on the line formed by the laser dots can be calculated by

Z = Z1 + (Z2 − Z1)·(x − x1)/(x2 − x1)

or

Z = Z1 + (Z2 − Z1)·(y − y1)/(y2 − y1)
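The interpolation can be sketched as follows; using whichever image axis has the larger span (an assumption here, to avoid division by zero) matches the two equivalent formulas, in x or in y.

```python
def interpolate_depth(p, p1, z1, p2, z2):
    """Depth Z of an image point p on the segment between image points p1 and p2.

    Linear interpolation between the laser dots' depths z1 and z2,
    along whichever image axis has the larger span.
    """
    (x, y), (x1, y1), (x2, y2) = p, p1, p2
    if abs(x2 - x1) >= abs(y2 - y1):
        return z1 + (z2 - z1) * (x - x1) / (x2 - x1)
    return z1 + (z2 - z1) * (y - y1) / (y2 - y1)
```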

Calculation of interest points 3-D-coordinate on the camera 3-D space
Given the depth Z of the interest point and its 2-D coordinate (x, y) on the image plane, it is possible to calculate its 3-D coordinate (X, Y, Z) on the camera 3-D space (see Fig. 7) using the Inverse Perspective Transformation: X = (x − cx)·Z/fx and Y = (y − cy)·Z/fy.
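The back-projection and the final length calculation can be sketched together; the camera matrix values and the two synthetic interest points are illustrative assumptions.

```python
import numpy as np


def back_project(M, x, y, z):
    """Inverse Perspective Transformation: image point (x, y) plus depth Z
    gives (X, Y, Z) on the camera 3-D space."""
    fx, fy, cx, cy = M[0, 0], M[1, 1], M[0, 2], M[1, 2]
    return np.array([(x - cx) * z / fx, (y - cy) * z / fy, z])


# Length L between two interest points, in real-world units (mm)
M = np.array([[1000.0,    0.0, 1024.0],
              [   0.0, 1000.0,  542.0],
              [   0.0,    0.0,    1.0]])
pA = back_project(M, 1124.0, 542.0, 500.0)
pB = back_project(M, 1024.0, 542.0, 500.0)
L = np.linalg.norm(pA - pB)  # two points 50 mm apart in this synthetic example
```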

Validation experiment setup
The validation experiment was designed to verify that the developed method works and to determine its accuracy. Prior to the experiment, the camera parameters were obtained with camera calibration, as described in section 2.2. After that, the camera and the laser distance sensors were mounted on an acrylic support. After mounting, the laser beam mapping was obtained for both the right and the left sensor, as described in section 2.3.
The experiment consisted of placing the system in front of a white wall and making measurements along a line on the wall (see Fig. 11). The camera used was a Basler Ace acA2000-165um (see Table 1), the lens was a Fujinon HF9HA-1B (see Table 2) and the laser distance sensors were Keyence IL-600 units (see Table 3). On the wall, a line was drawn connecting the laser dots from the laser distance sensors. Along this line, two vertical marks were made. The interest points were the points where the vertical marks crossed the line (see Fig. 12). The objective of the experiment was to measure the distance between the marks with the proposed system and compare it to the measurement obtained with a caliper. The experiment consisted of three rounds. In each of them, two marks were drawn and the width between them was measured with a caliper. After that, the same measurements were made with the proposed system, varying the distance between the system and the wall from 400 mm to 600 mm, in 50 mm increments. This distance range was defined according to the requirements of an industrial environment. The coordinates of the interest points on the image plane were obtained manually, using the Pylon Viewer software.

Results
The widths obtained using the caliper and the proposed system in each of the three rounds can be seen in Table 4. The statistical analysis of the data shown in Table 4 is presented in Table 5.

CONCLUSION
This work presented a Computer Vision-based method that measures the width along a line on the surface of a 3-D object. The experiments performed in the laboratory showed that the proposed method achieves good accuracy and could be used in a real-world application.
Our proposed system could be adapted to carry out measurements of a plane, after the introduction of a third distance sensor.
Finally, the effect of ambient lighting on the measurements was not tested, and more experiments are necessary in order to verify the need for dedicated lighting.

Fig. 1 - Overview of the proposed method

Fig. 5 The interest points are represented in this figure generically as (x,y)

Fig. 6 Z1 and Z2 are the Z coordinates of the laser dots on the camera 3-D space

Fig. 7 The interest points' coordinates on the camera 3-D space

Fig. 8 The laser dots' coordinates on the camera 3-D space are calculated

Fig. 10 illustrates some of the laser beam mapping steps. The laser dots' coordinates (X, Y, Z) are represented by the circles and the sensors' coordinates (Xs, Ys, Zs) are represented by the triangles. Fig. 10 also shows the laser beam lines, plotted with the help of the laser beam equations.
Fig. 11 Proposed system in front of a white wall

Fig. 12 Interest points along the line formed by the laser dots

Table 1
Camera specifications

Table 4
Data collected from experiment