Chinese Optics Letters, Vol. 14, Issue 1, 010010 (2016)

Novel 3D shape measurement method with lower occlusion for the bottom of cavity

Qingyang Wu*, Baichun Zhang, Haixin Lin, Xiangjun Zeng, and Zeng Zeng

Shenzhen Key Laboratory of Micro-Nano Photonic Information Technology, College of Electronic Science and Technology, Shenzhen University, Shenzhen 518060, China

DOI: 10.3788/COL201614.010010

    Abstract

Because the bottom of a cavity suffers from shadow and occlusion, the angle between the projection system and the imaging system is limited, and the traditional fringe projection technique based on the principle of optical triangulation is inapplicable. This Letter presents a 3D shape measurement method for cavities that uses a light tube. The method measures an object from two opposite views at the same time, so a single measurement yields two different groups of 3D data for the same object. The experimental results show the feasibility and validity of the 3D shape measurement method.

Optical 3D shape measurement based on the digital fringe projection technique is widely applied in various areas, including reverse engineering, industrial inspection, machine vision, and medical imaging[1–3]. The technique is based on the principle of optical triangulation, which requires a certain angle between the axes of the projection and imaging systems[4–8]. This angle is significantly associated with the measurement accuracy. However, when measuring the inner confined space of an object, such as a small-mouthed container, an engine cylinder, or the inside carving of a handicraft, it is difficult to maintain a large angle: there is not enough working space inside the object, and a larger angle is blocked by obstacles.

At present, the methods of optical 3D measurement for an inner confined space (such as a cavity) can be divided into two categories: using optical fiber image bundles to transmit images[9], and using micro devices that reduce the size of the instruments[10]. Because an image can propagate freely along optical fiber image bundles, and the bundles can be bent in any direction, this approach can realize 3D measurement of the inside of an object: the fringe images are projected onto the internal surface from the external space, and the deformed image is transferred from the interior to the exterior and collected by a CCD. However, the performance of optical fiber image bundles is not ideal because of their material characteristics and image transmission principles: the bundles are expensive, the fabrication technique is complicated, and broken fibers, which cause blind pixels, occur easily due to manufacturing limitations. For the micro device approach, the cost of the device is higher and the resolution of the miniature CCD and projection device is low, which leads to low measurement accuracy.

This Letter addresses the problems of image projection and collection in a confined space: we design a light tube based on plane mirrors to transmit images and develop a 3D measurement system for the bottom of a cavity. Two subimages form on the imaging plane of the camera, from which two groups of 3D data for the internal surface of the cavity are collected. Compared with the traditional structured light projection technology, this system reduces the influence of occlusion. In addition, the two groups of data are calibrated in the same world coordinate system; therefore we realize automatic splicing of the point cloud data without registration. The following experimental results show the feasibility and validity of the 3D shape measurement method.

Phase-shifting methods are widely used in optical 3D measurement based on digital fringe projection because of their speed and accuracy[11–15]. We use a four-step phase-shifting algorithm to solve for the phase value[16–18]. The four phase-shifted images recorded by a CCD camera can be written as

$$I_m(i,j) = A(i,j) + B(i,j)\cos\left(\varphi(i,j) - \frac{m\pi}{2}\right), \quad m = 0,1,2,3, \tag{1}$$

where $A(i,j)$ is the intensity of the background light, $B(i,j)$ is the fringe contrast, and $\varphi(i,j)$ is the phase to be solved for. The phase can be obtained by

$$\varphi(i,j) = \tan^{-1}\frac{I_1(i,j) - I_3(i,j)}{I_0(i,j) - I_2(i,j)}. \tag{2}$$

This formula provides the wrapped phase ranging from −π to +π with 2π discontinuities. The unwrapped phase can be obtained by a phase unwrapping algorithm[19].
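
As a minimal sketch of Eqs. (1) and (2), assuming the four fringe images are available as registered floating-point arrays (the function name and array layout are illustrative, not from the Letter):

```python
import numpy as np

def wrapped_phase(I0, I1, I2, I3):
    """Wrapped phase from four pi/2-shifted fringe images, Eq. (2).

    Each I_m is a 2D float array of intensities
    I_m = A + B*cos(phi - m*pi/2). np.arctan2 keeps the correct
    quadrant and returns values in (-pi, pi], i.e., the wrapped
    phase with 2*pi discontinuities described in the text.
    """
    return np.arctan2(I1 - I3, I0 - I2)
```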

As shown in Fig. 1(a), in order to measure the bottom point P, the traditional measurement system needs to reduce the angle between the axes of the projection and imaging systems. The deeper the point P is, the smaller the angle θ becomes. From the principle of optical triangulation we know that the measuring accuracy is related to the angle θ; if θ is too small, the accuracy cannot be guaranteed. In order to preserve the imaging angle, a light tube consisting of two planar mirrors is employed to transmit images, as shown in Fig. 1(b). The fringe patterns are projected onto the internal surface of the object along the coaxial direction of the light tube. The two planar mirrors create a multireflection beam path that relays the field of view, which makes it possible for the camera to collect the deformed fringe images.


Figure 1. Measurement system for the bottom of a cavity. (a) Traditional system and (b) new system.

Figure 2 shows the principle of the 3D measurement system for a cavity. It mainly consists of three parts: a miniature digital image projection system, the light tube, and the imaging system. As Fig. 2 shows, the deformed fringe image is reflected by mirror 1 and mirror 2 and, after multiple reflections in the light tube, is finally received by the imaging system. The whole CCD chip is divided into two parts along the image axis of the camera lens. With an appropriate view angle, the two parts collect their own images at the same time. The imaging results are shown in the top right corner of Fig. 2; the left and right parts of the image are formed by the red and green beam paths, respectively.


Figure 2. Principle of the system structure.

The reconstruction of 3D data points is based on the algorithm of direct linear transformation[20–22]. The image coordinates, unwrapped phase, and corresponding world coordinates can be expressed by the mapping formula

$$(X_w, Y_w, Z_w) = f(u, v, \varphi), \tag{3}$$

where $(u,v)$ is the image coordinate of a feature point, $\varphi$ is the unwrapped phase value of the feature point, and $(X_w, Y_w, Z_w)$ is the world coordinate of the feature point. In this Letter we use a third-order polynomial fitting method, so Eq. (3) can be written as

$$B = A \times M, \tag{4}$$

where B is the vector of world coordinates $(X, Y, Z)$, A is a 1×20 row vector containing the polynomial terms

$$A = \begin{bmatrix} u^3 & v^3 & \varphi^3 & u^2 v & u^2\varphi & v^2 u & v^2\varphi & \varphi^2 u & \varphi^2 v & u^2 & v^2 & \varphi^2 & uv & u\varphi & v\varphi & uv\varphi & u & v & \varphi & 1 \end{bmatrix}, \tag{5}$$

and M is a 20×3 matrix that includes all of the polynomial coefficients:

$$M = \begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ \vdots & \vdots & \vdots \\ a_{20} & b_{20} & c_{20} \end{bmatrix}. \tag{6}$$
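
A sketch of the mapping in Eqs. (4)–(6): the helper below builds the 20-term row vector of Eq. (5) for one feature point, so a world coordinate is recovered as its product with M (names and layout are illustrative assumptions):

```python
import numpy as np

def design_row(u, v, p):
    """20-term cubic design vector of Eq. (5) for one feature point.

    u, v are image coordinates; p is the unwrapped phase phi.
    The term order follows the row vector A in the text.
    """
    return np.array([u**3, v**3, p**3,
                     u**2*v, u**2*p, v**2*u, v**2*p, p**2*u, p**2*v,
                     u**2, v**2, p**2,
                     u*v, u*p, v*p, u*v*p,
                     u, v, p, 1.0])

# With a calibrated 20x3 coefficient matrix M, a point reconstructs as
# X, Y, Z = design_row(u, v, phi) @ M
```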

In order to realize 3D reconstruction, we need to obtain the polynomial coefficients M. Figure 3 shows the principle of the system calibration. First, a 2D plane calibration target is fixed on a translation stage and moved along the direction perpendicular to the axis of the projection system through positions $(P_1, P_2, \ldots, P_n)$; the fringe patterns are projected onto the calibration target and its images are collected at each position. The world coordinate system OXYZ is established from the plane of the calibration target at $P_1$ and the motion direction of the target. The positions of the feature points on the calibration target and the moving distance of the target are known in advance, so the 3D coordinates $(X, Y, Z)$ of the feature points in the OXYZ world coordinate system can be calculated directly. Then the image coordinates $(u, v)$ and the unwrapped phase $\varphi(u, v)$ of the feature points are extracted from the collected images. Finally, we substitute the 3D coordinates $(X, Y, Z)$, image coordinates $(u, v)$, and unwrapped phase $\varphi(u, v)$ of all of the feature points into Eq. (4); the polynomial coefficients M are obtained with the least squares method:

$$M = (A^T A)^{-1} A^T B. \tag{7}$$


Figure 3. Principle of the system calibration.

Denoting the total number of feature points by N, B is an N×3 matrix and A is an N×20 matrix.
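
Under the same assumptions, the calibration of Eq. (7) is an ordinary least-squares problem. The sketch below stacks one design row per feature point (reusing design_row from the previous sketch) and solves for M; np.linalg.lstsq is used in place of the explicit normal-equation form for numerical stability:

```python
import numpy as np

def calibrate(uv, phi, xyz):
    """Least-squares estimate of the 20x3 coefficient matrix M, Eq. (7).

    uv : (N, 2) image coordinates of the calibration feature points
    phi: (N,)   unwrapped phase values at those points
    xyz: (N, 3) known world coordinates from the translation stage
    """
    # One row of Eq. (5) per feature point -> N x 20 design matrix
    A = np.stack([design_row(u, v, p) for (u, v), p in zip(uv, phi)])
    M, *_ = np.linalg.lstsq(A, xyz, rcond=None)  # solves A @ M ~= B
    return M
```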

The 3D data points of the right part can be reconstructed by using Eq. (3); in order to realize automatic matching, we also need the 3D data for the left part. We know the 3D coordinates $(X, Y, Z)$ of the feature points in the OXYZ world coordinate system and their image coordinates $(u, v)$ in the right part; the key problem is how to find the matching points in the left part. In this Letter, we use a phase-matching technique to obtain the matching points: fringe patterns with horizontal and vertical directions are projected in turn, so each pixel carries a unique phase pair and has a unique matching point. The corresponding feature points in the left part are obtained by this method. The two groups of data are reconstructed in the same world coordinate system, so the point cloud data match automatically without registration.
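
The Letter does not spell out the search strategy for phase matching; a brute-force nearest-phase sketch, assuming unwrapped phase maps of the left subimage for both fringe orientations, could look like this:

```python
import numpy as np

def match_in_left(phase_h_left, phase_v_left, ph, pv):
    """Find the left-subimage pixel whose (horizontal, vertical)
    unwrapped phase pair is closest to that of a right-subimage point.

    phase_h_left, phase_v_left: 2D unwrapped phase maps of the left
    subimage for the two fringe orientations; (ph, pv) is the phase
    pair measured at the feature point in the right subimage.
    """
    d2 = (phase_h_left - ph)**2 + (phase_v_left - pv)**2
    row, col = np.unravel_index(np.argmin(d2), d2.shape)
    return col, row  # (u, v); sub-pixel refinement omitted
```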

Figure 4(a) shows the calibration steps; the spacing between the center points of two adjacent crosshairs on the calibration target is 2 mm. Because of the imaging properties of the light tube, there are two subimages on the imaging plane of the camera [see Fig. 4(b)]; the crosshair at the top right corner of the left subimage and the crosshair at the top left corner of the right subimage are actually images of the same crosshair on the calibration target. We set that point as the origin O, the calibration target plane as the XOY plane, and the motion direction of the calibration target as the Z direction. The world coordinate system OXYZ is thus established, and the measurable size in the XOY plane is about 10 mm × 12 mm. After the corresponding feature points are obtained in the two subimages, we can collect two groups of 3D point cloud data with the respective mapping coefficients.


Figure 4. Calibration steps.

First, we move the translation stage five times along the Z direction of the world coordinate system and collect two groups of images (the horizontal and vertical fringe patterns) at each position; Fig. 5 shows one group of the collected images. The traveling distance of each step is 1 mm, giving six positions in total from which the world coordinates of the feature points in the OXYZ coordinate system are obtained.


Figure 5. Image of the fringe pattern.

Second, we calculate the image coordinates $(u, v)$ of these feature points and the corresponding unwrapped phase values $\varphi(u, v)$ from the right part of the images.

Third, the world coordinates $(X, Y, Z)$, the image coordinates $(u, v)$, and the unwrapped phase values $\varphi(u, v)$ of the feature points are used to solve for the polynomial coefficients M by Eq. (7).

    Finally, we use the phase-matching technique to obtain the matching points in the left part, and then we can obtain another group of polynomial coefficients.

    Figure 6 shows the 3D measurement system for the cavity. The resolution of the camera is 1280×960, the pixel size is 3.75 μm, and the focal length is 50 mm. The resolution of the projector is 640×480, and the length of the light tube is 150 mm.


Figure 6. System structure.

We measured the bottom shape of a jar [see Fig. 7(a)]; the height of the jar is 110 mm. Figure 7(b) is one of the fringe pattern images, and the 3D point cloud data in Figs. 7(c) and 7(d) are reconstructed from the left and right parts of Fig. 7(b), respectively. The phase information is lost in regions where the stripes are obscured or severely compressed [see the red box in Fig. 7(b)]. Combining the data in Figs. 7(c) and 7(d) mutually makes up for the missing data, yielding the complete 3D data [see Fig. 7(e)]. The experimental results show that this method increases the measuring area and reduces the effect of shadow and occlusion caused by the object. In addition, in order to reduce the error of the system calibration, we use the phase-matching technique to obtain the matching points. The experimental results show that the measurement volume of the system is about 10 mm × 12 mm × 5 mm, and the two groups of data register with good accuracy.


Figure 7. Experimental result. (a) The bottom shape of the jar, (b) one of the fringe pattern images, (c) reconstruction result of the left part of Fig. 7(b), (d) reconstruction result of the right part of Fig. 7(b), and (e) stitching result of Figs. 7(c) and 7(d).

In order to verify the measurement accuracy of the system, we measure a surface of a standard block, as shown in Fig. 8. We fit a plane to the point cloud data of the standard block and obtain the color map of the plane-fitting deviation (see Fig. 9), in which the geometric maximum is 0.0582 mm, the average value is 0.0081 mm, and the standard deviation is 0.0062 mm.
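
As an illustration of how such a deviation map can be computed (the Letter does not state its fitting procedure), a total least-squares plane fit via SVD yields the kind of statistics reported above:

```python
import numpy as np

def plane_fit_deviation(pts):
    """Fit a plane to an (N, 3) point cloud and return deviation stats.

    The plane normal is the right singular vector with the smallest
    singular value of the centered cloud (total least squares).
    """
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    dist = centered @ vt[-1]          # signed point-to-plane distances
    return np.abs(dist).max(), np.abs(dist).mean(), dist.std()
```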


Figure 8. Standard cube.


Figure 9. Measurement of the standard plane.

We also measure a standard ball whose radius is 1.7500 mm (see Fig. 10). Its point cloud data are used to fit a sphere, yielding the color map of the sphere-fitting deviation (see Fig. 11), in which the fitted radius is 1.7584 mm, the geometric maximum is 0.0571 mm, the average value is 0.0129 mm, and the standard deviation is 0.0093 mm.
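
Similarly, a standard linear least-squares sphere fit (again an illustrative sketch, not necessarily the procedure used in the Letter) recovers the fitted radius and the radial deviations:

```python
import numpy as np

def sphere_fit_deviation(pts):
    """Linear least-squares sphere fit to an (N, 3) point cloud.

    Rearranging |p - c|^2 = r^2 gives the linear system
    2*p.c + (r^2 - |c|^2) = |p|^2 in the unknowns (c, k).
    """
    A = np.c_[2.0 * pts, np.ones(len(pts))]
    b = (pts**2).sum(axis=1)
    (cx, cy, cz, k), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(k + cx**2 + cy**2 + cz**2)
    dev = np.linalg.norm(pts - np.array([cx, cy, cz]), axis=1) - r
    return r, dev  # fitted radius and signed radial deviations
```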


Figure 10. Standard ball.


Figure 11. Measurement of the standard ball.

In conclusion, this Letter presents a novel method of 3D shape measurement with lower occlusion for the bottom of a cavity, in which a light tube provides the possibility of a larger imaging angle. Compared with measurement systems based on optical fiber image bundles or micro devices, this method is low cost and still collects high-accuracy 3D data. With the two planar mirrors, the system measures the object from two perspectives at the same time and reduces the influence of occlusion. Experimental results show the feasibility and validity of this method. However, the light tube is an extra component whose robustness influences the stability of the system and limits the measurement flexibility. To enhance the measurement range and performance of the system, we will improve the structure of the light tube in future work.
