High-resolution low-noise 360-degree digital solid reconstruction using phase-stepping profilometry

In this paper we describe a high-resolution, low-noise phase-shifting algorithm applied to 360-degree digitizing of solids with diffuse light-scattering surfaces. A 360-degree profilometer needs to rotate the object a full revolution to digitize a three-dimensional (3D) solid. Although 360-degree profilometry is not new, we propose a new experimental set-up which permits full phase-bandwidth phase-measuring algorithms. The first advantage of our solid profilometer is that it uses baseband, phase-stepping algorithms providing the full data phase-bandwidth. This contrasts with band-pass, spatial-carrier Fourier profilometry, which typically uses 1/3 of the fringe data bandwidth. In addition, phase measuring is generally more accurate than single-line-projection, noncoherent, intensity-based line-detection algorithms. The second advantage is a new fringe-projection set-up which avoids self-occluding fringe shadows for convex solids. Previous 360-degree fringe-projection profilometers generate self-occluding shadows because of their elevation illumination angles. The third advantage is trivial line-by-line fringe-data assembly based on a single cylindrical coordinate system shared by all 360-degree perspectives. This contrasts with multi-view overlapping fringe-projection systems, which use iterative closest point (ICP) algorithms to fuse the 3D data cloud into a single coordinate system (e.g. Geomagic). Finally, we used a 400 steps/rotation turntable and a 640x480 pixel CCD camera. Higher 3D digitized surface resolutions and less noisy phase measurements are straightforward to obtain by increasing the angular-spatial resolution and the number of phase steps, without any substantial change to our 360-degree profilometer. ©2014 Optical Society of America

OCIS codes: (120.0120) Instrumentation, measurement, and metrology; (120.5050) Phase measurement; (120.4630) Optical inspection

References and links
1. M. Takeda, H. Ina, and S. Kobayashi, "Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry," J. Opt. Soc. Am. 72(1), 156–160 (1982).
2. M. Halioua, R. S. Krishnamurthy, H. C. Liu, and F. P. Chiang, "Automated 360° profilometry of 3-D diffuse objects," Appl. Opt. 24(14), 2193–2196 (1985).
3. X. X. Cheng, X. Y. Su, and L. R. Guo, "Automated measurement method for 360° profilometry of 3-D diffuse objects," Appl. Opt. 30(10), 1274–1278 (1991).
4. A. K. Asundi, "360-deg profilometry: new techniques for display and acquisition," Opt. Eng. 33(8), 2760–2769 (1994).
5. M. Chang and W. C. Tai, "360-deg profile noncontact measurement using a neural network," Opt. Eng. 34(12), 3572–3576 (1995).
6. A. S. Gomes, L. A. Serra, A. S. Lage, and A. Gomes, "Automated 360° profilometry of human trunk for spinal deformity analysis," in Proceedings of Three Dimensional Analysis of Spinal Deformities, M. Damico et al., eds. (IOS, Burke, 1995), pp. 423–429.
7. Y. Song, H. Zhao, W. Chen, and Y. Tan, "360 degree 3D profilometry," Proc. SPIE 3204, 204–208 (1997).
8. A. Asundi and W. Zhou, "Mapping algorithm for 360-deg profilometry with time delayed integration imaging," Opt. Eng. 38(2), 339–344 (1999).
9. X. Su and W. Chen, "Fourier transform profilometry," Opt. Lasers Eng. 35(5), 263–284 (2001).
10. X. Zhang, P. Sun, and H. Wang, "A new 360 rotation profilometry and its application in engine design," Proc. SPIE 4537, 265–268 (2002).
11. J. A. Munoz-Rodriguez, A. Asundi, and R. Rodriguez-Vera, "Recognition of a light line pattern by Hu moments for 3-D reconstruction of a rotated object," Opt. Laser Technol. 37(2), 131–138 (2005).
12. G. Trujillo-Schiaffino, N. Portillo-Amavisca, D. P. Salas-Peimbert, L. Molina-de la Rosa, S. Almazan-Cuellar, and L. F. Corral-Martinez, "Three-dimensional profilometry of solid objects in rotation," AIP Conf. Proc. 992, 924–928 (2008).
13. B. Shi, B. Zhang, F. Liu, J. Luo, and J. Bai, "360° Fourier transform profilometry in surface reconstruction for fluorescence molecular tomography," IEEE J. Biomed. Health Inform. 17(3), 681–689 (2013).
14. Y. Zhang and G. Bu, "Automatic 360-deg profilometry of a 3D object using a shearing interferometer and virtual grating," Proc. SPIE 2899, 162–169 (1996).
15. Z. Zhang, H. Ma, S. Zhang, T. Guo, C. E. Towers, and D. P. Towers, "Simple calibration of a phase-based 3D imaging system based on uneven fringe projection," Opt. Lett. 36(5), 627–629 (2011).
16. M. Servin and J. C. Estrada, "Analysis and synthesis of phase shifting algorithms based on linear systems theory," Opt. Lasers Eng. 50(8), 1009–1014 (2012).

#206909 - $15.00 USD Received 26 Feb 2014; revised 10 Apr 2014; accepted 15 Apr 2014; published 29 Apr 2014
(C) 2014 OSA | 5 May 2014 | Vol. 22, No. 9 | DOI:10.1364/OE.22.010914 | OPTICS EXPRESS 10914


Introduction
Fringe-projection profilometry has been a well-known technique since the classical paper by Takeda et al. in 1982 [1]. Although this technique effectively demonstrated that 3D digitization was possible using a single carrier fringe pattern, it cannot digitize a full 360-degree 3D object. The primary reason is that one needs to position the 3D object on a turntable to gain access to every perspective of the solid from all (360-degree) directions. As far as we know, the first researchers to implement an automated 360-degree profilometer were Halioua et al. in 1985 [2]. Halioua used a grating projector along with a 3-step phase-shifting set-up and a turntable to obtain the 360-degree profilometry of a human mannequin head [2]. Later, in 1991, Cheng et al. also positioned the 3D object on a turntable in order to rotate it a full 360 degrees and obtain the complete object information, projecting a carrier-frequency fringe pattern over the object [3]. Asundi published an interesting technique based on striped-light projection for 360-degree profilometry [4]. Using a light stripe from a laser diode, Chang et al. [5] automatically reconstructed a solid over 360 degrees using a neural network. Gomes et al. used a projected linear grating, analyzed by Fourier profilometry, to study the human trunk in search of spinal deformities [6]. Later, Song et al. used a fringe-grating projector and phase-shifting interferometry of a rotating object for 360-degree profilometry [7]. Asundi et al. used time-delay integration imaging for 360-degree acquisition of a rotating object [8]. The state of the art in 3D profilometry was reviewed in 2001 by Su and Chen, but they included only a single paper on 360-degree profilometry [9]. Afterwards, Zhang et al. used 360-degree profilometry for flow analysis in mechanical engines [10]. In 2005 Munoz-Rodriguez et al. used light-stripe projection, triangulation, and Hu moments for 3D reconstruction of a rotated object [11]. In 2008 Trujillo-Schiaffino et al. used 3D profilometry based on single-line projection and triangulation of a smooth rotationally symmetric object [12]. More recently, Shi et al. applied 360-degree fringe-pattern projection profilometry to fluorescence molecular tomography [13]. Some researchers have used shearing interferometry to project high-quality linear fringes for 360-degree profilometry [14]. In this paper we do not discuss calibration issues, because we have not used any new or non-standard calibration strategy apart from those well known in 3D profilometry [15]. Likewise, we have not used any new or non-standard phase-unwrapping algorithm [14,15]. Given the low noise of the fringes and the noise-rejection capability of the 4-step least-squares phase-shifting demodulation, we unwrapped our phase using simple line integration of wrapped phase differences.
Assume that the 3D surfaces enclosing the solids are expressed in cylindrical coordinates (ρ, φ, z), with ρ = √(x² + y²), as single-valued radial functions (see Fig. 1),

ρ = ρ(z, φ),  z ∈ [−L, L],  φ ∈ [0, 2π).

The 3D object under analysis lies within [−L, L] in the z direction and within [0, 2π) in the azimuthal direction φ (see Fig. 1). The projected linear fringes are aimed at the 3D surface ρ(z, φ) with a phase-sensitivity angle θ0 (see Fig. 1(a)). The object is then rotated N times by an incremental azimuthal angle Δφ = 2π/N (see Fig. 1(a)). For each increment Δφ = 2π/N one collects the pixels at the center column of the CCD (x = 0, z). In this way one generates an image composed of N lines in the φ direction, each line having the CCD's pixel resolution along z. The composed spatial-carrier fringe pattern is therefore

I(z, φ) = a(z, φ) + b(z, φ) cos[2π ν0 z + 2π ν0 tan(θ0) ρ(z, φ)],

where ν0 is the spatial carrier of the projected fringes; assuming telecentric illumination and imaging, the spatial carrier at the CCD plane equals the projected carrier ν0. On the other hand, a mathematical model for light-stripe profilometry (Fig. 1(b)) is

I(x, z) = δ[x − tan(θ0) ρ(z, φ)].

This Dirac delta is phase-modulated (laterally displaced) by the 3D object surface ρ(z, φ). For each discrete rotation step the stripe displacement is estimated by image-intensity (not phase-measuring) algorithms and is proportional to the 3D object's topography. The main advantage of using a light stripe is that many solids are fully illuminated (without self-generated shadows) from their highest point z = +L to their lowest one z = −L. Such full illumination of the solid is frequently impossible with the configuration in Fig. 1(a), because the elevation angle of the grating projector easily casts occluding shadows on the lower part of the 3D solid under analysis, precluding its digitization.
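The line-by-line assembly of the composed fringe image can be sketched numerically. The following Python fragment is our own minimal illustration, not the authors' code: the quasi-cylindrical test surface, the carrier frequency, the sensitivity angle, and all numerical values (N = 400 steps, 480 pixels per column) are assumptions chosen only to mimic the geometry described above.

```python
import numpy as np

# Assumed parameters (illustrative only): turntable steps, CCD column height,
# projected carrier frequency nu0 and phase-sensitivity angle theta0.
N, Nz = 400, 480
L, nu0, theta0 = 1.0, 4.0, np.radians(30)
z = np.linspace(-L, L, Nz)

def rho(z, phi):
    # Hypothetical quasi-cylindrical test surface rho(z, phi).
    return 0.8 + 0.1 * np.cos(3 * phi) * np.cos(np.pi * z / (2 * L))

columns = []
for n in range(N):                      # one turntable increment per CCD frame
    phi_n = 2 * np.pi * n / N           # delta-phi = 2*pi / N
    # Keep only the center CCD column (x = 0): carrier along z,
    # phase-modulated by the radial surface rho(z, phi_n).
    col = 0.5 + 0.5 * np.cos(2 * np.pi * nu0 * z
                             + 2 * np.pi * nu0 * np.tan(theta0) * rho(z, phi_n))
    columns.append(col)

I = np.stack(columns)   # composed fringe image: axis 0 = rotation step, axis 1 = z
print(I.shape)          # (400, 480): N lines, one per rotation increment
```

Stacking one column per rotation step is what makes the later line-by-line data assembly trivial: every column already lives in the same cylindrical coordinate system.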

Proposed improved profilometer for 360 degree solid digitizing
The improved 360-degree solid profilometry set-up is shown in Fig. 2. With this experimental set-up one keeps the high phase resolution of interferometric phase-measuring methods while eliminating most of the self-generated shadows cast by the 3D solid (see Fig. 1(a)). The 3D surfaces that can be digitized with our 360-degree profilometer belong to the space of continuous, single-valued functions C1 in cylindrical coordinates (ρ, φ, z),

ρ(z, φ) ∈ C1.

Any solid that can be enclosed by a 3D surface within the function space ρ(z, φ) ∈ C1 may be digitized with our 360-degree profilometer; this function space includes as subsets quasi-cylindrical surfaces as well as any topologically convex solid. In Fig. 2 the CCD sensor is assumed to be parallel to the (x, z) plane, so the 3D surface imaged over the CCD plane produces an intensity I(x, z). But we are interested only in the centered (x = 0) CCD column pixels I(0, z). In this way, for a full azimuthal rotation φ ∈ [0, 2π), one collects N CCD columns I(0, z, φ) and assembles the fringe image

I(z, φ) = a(z, φ) + b(z, φ) cos[gρ(z, φ)],  gρ(z, φ) ∝ tan(φ0) ρ(z, φ).   (7)

The projected linear grating is in this case oriented along the z axis, orthogonal to the gratings of previous 360-degree fringe-projection profilometers (Fig. 1(a)). As for the light stripe (Fig. 1(b)), the phase sensitivity of this 360-degree test is proportional to tan(φ0). This angle φ0 must be kept large enough to increase the sensitivity of the test while avoiding lateral self-occluding shadows. Equation (7) shows that the resulting fringe pattern has no spatial carrier (ω0 = 0, since only the x = 0 column is kept). We therefore need several phase-shifted fringe patterns for the phase demodulation of gρ(z, φ) ∈ C1.
To give an intuitive idea of the phase-shifted fringe patterns generated by our profilometer, assume that a sphere of radius L is being digitized. The sphere in cylindrical coordinates (ρ, φ, z) is

ρ(z, φ) = √(L² − z²),  z ∈ [−L, L].

Assuming telecentric projection and imaging optical systems, the fringe pattern at the CCD camera parallel to the (x, z) plane is shown in Fig. 3(a). The phase-shifted fringe patterns in Figs. 3(b), 3(c), and 3(d) correspond to

I(z, φ, n) = a + b cos[gρ(z, φ) + 2πn/3],  n ∈ {0, 1, 2}.   (9)

Phase-shifting demodulation of these 3 fringe patterns gives the digitized sphere's phase in cylindrical coordinates, an element of the function space C1 (see Eq. (9)). The phase shift among these 3 fringe patterns (Eq. (9)) is 2π/3 radians, and they are phase-demodulated using a 3-step least-squares phase-shifting algorithm to obtain gρ(z, φ).
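The sphere example can be reproduced in a few lines of Python. This is a hedged sketch, not the authors' implementation: the modulation factor 8.0, the background and contrast values, and the grid sizes are arbitrary assumptions; the demodulation step is the standard 3-step least-squares (synchronous) estimator.

```python
import numpy as np

Nphi, Nz, L = 400, 480, 1.0
z = np.linspace(-L, L, Nz)
rho = np.sqrt(np.clip(L**2 - z**2, 0.0, None))   # sphere: rho(z) = sqrt(L^2 - z^2)
g = 8.0 * np.tile(rho, (Nphi, 1))                # assumed sensitivity factor 8.0

deltas = 2 * np.pi * np.arange(3) / 3            # phase steps 0, 2*pi/3, 4*pi/3
frames = [0.5 + 0.4 * np.cos(g + d) for d in deltas]   # three shifted patterns

# 3-step least-squares demodulation:
# S = sum_n I_n * exp(-i delta_n) = (3b/2) * exp(i g), since the background and
# the conjugate term both cancel when summed over the three roots of unity.
S = sum(f * np.exp(-1j * d) for f, d in zip(frames, deltas))
g_wrapped = np.angle(S)                          # wrapped estimate of g(z, phi)
```

In the noiseless case the wrapped estimate matches the true phase exactly (modulo 2π), which is easy to verify by wrapping the difference g_wrapped − g back into (−π, π].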

Experimental results
Here we show the experimental results for our proposed 360-degree profilometry technique. As the test 3D surface ρ(z, φ) we chose a (white-painted) Rubik's cube mounted on its triangular base, for its sharp angles and high-frequency surface structure, which clearly reveal the accuracy, high resolution, and low noise of the proposed 360-degree digitizing technique. We start by showing a white-light photograph of our Rubik's cube in Fig. 4(a). In Fig. 4(b) we show the same cube illuminated with the linear grating oriented along the z coordinate. The projection and imaging optical systems are almost telecentric, and the camera's CCD is parallel to the (x, z) plane. Figures 5(a) and 5(b) show two (out of 4) fringe patterns phase-modulated by the Rubik's cube's cylindrical-coordinate representation ρ(z, φ). Figure 5(a) has a phase shift of 0 radians, while Fig. 5(b) has a phase shift of π radians. The fact that the camera and the fringe projector both have their optical axes aimed at the middle of the Rubik's cube permits a complete, shadow-free digitization of the cube's 3D surface ρ(z, φ) (see Fig. 2). In general, if the sensitivity angle φ0 is not too high and the 3D surface belongs to ρ(z, φ) ∈ C1, no self-generated shadows from the object under analysis are cast over the camera view, as happens in previous grating-projection configurations (see Fig. 1(a)). Figure 6 shows three different perspectives of the 360-degree digitized Rubik's cube ρ(z, φ). As Fig. 6 shows, the Rubik's cube is obtained with low phase noise and high spatial resolution, which translates into more visible surface details. Of course, higher spatial-frequency surface reconstruction is possible by increasing the azimuthal (Δφ = 2π/N) and spatial CCD pixel resolutions. Lower phase-noise demodulation is also possible by increasing the number M of phase-stepped images. Figure 6 clearly shows interesting high-frequency surface details of the Rubik's cube.
Because the fringe patterns obtained have no spatial carrier (see Eq. (7)), we may use the full spatial Fourier spectrum to reconstruct fine details of the 3D cube's surface. In spatial-carrier interferometry one has at most half, but in practice about one third, of the available Fourier frequency space to house the 3D surface spectrum. In contrast, having closed-fringes (baseband) fringe patterns (Figs. 5(a) and 5(b)), one has the full spectral bandwidth to recover fine surface details. This high-frequency surface reconstruction (Fig. 6) cannot be seen in previously published 360-degree approaches, which render very smooth digital solid surfaces [2–15]. At the risk of becoming repetitive, we emphasize that using baseband (closed-fringes) fringe patterns allows one to reconstruct the digitized 3D surfaces with the full theoretical spatial bandwidth that the raw digitized fringe data can hold.
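The bandwidth argument can be visualized with a 1-D toy comparison. This is our own illustration, with an arbitrary phase excursion and carrier frequency: the carrier pattern concentrates its spectrum around the carrier bin, while the baseband pattern keeps its spectrum near DC, leaving the rest of the Nyquist band free for surface detail.

```python
import numpy as np

N = 512
z = np.linspace(-1.0, 1.0, N)
rho = np.sqrt(np.clip(1.0 - z**2, 0.0, None))   # sphere profile as test phase
phase = 6.0 * rho                               # assumed phase excursion (6 rad)

baseband = np.cos(phase)                                      # closed fringes, no carrier
carrier = np.cos(2 * np.pi * 40 * np.arange(N) / N + phase)   # carrier at bin 40

# Locate the dominant spectral bin of each pattern.
peak_base = int(np.argmax(np.abs(np.fft.rfft(baseband))))
peak_carr = int(np.argmax(np.abs(np.fft.rfft(carrier))))
print(peak_base, peak_carr)   # baseband peak near DC, carrier peak near bin 40
```

The surface signal in the carrier case must fit between the DC lobe and the mirrored lobe around the carrier, which is the 1/3-bandwidth limitation mentioned above; the baseband pattern has no such restriction.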

Discussion of the advantages and limitations of the proposed 360-degree profilometer
In the peer-review process we received many interesting questions from our reviewers, and we feel it is worth answering them within the paper because they elucidate and clarify many interesting points not covered, or probably not fully explained, so far. a) Why is this a low-noise 360-degree profilometry technique? This 360-degree profilometer has low phase noise because it uses baseband, phase-measuring, M-step demodulation algorithms. If additive white Gaussian noise corrupts the fringes and least-squares M-step phase-shifting algorithms are used (as in this paper), we obtain an analytic signal with a noise-power reduction of 1/M with respect to the noise power of the digitized fringes [16]. In our case the analytic signal in Eq. (10) has a noise-power reduction of 1/4 with respect to the additive noise power of each phase-stepped fringe image I(z, φ, n), n ∈ {0, 1, 2, 3}.
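The 1/M noise-power reduction can be checked with a small Monte Carlo experiment (our own sketch, with an arbitrary noise level and sample count): demodulating with M = 4 steps should give roughly four times the phase-error variance of demodulating with M = 16 steps.

```python
import numpy as np

rng = np.random.default_rng(1)

def phase_error_var(M, sigma, samples=200_000):
    """Empirical phase-error variance of M-step least-squares demodulation
    under additive white Gaussian noise of standard deviation sigma."""
    true = rng.uniform(-np.pi, np.pi, samples)
    deltas = 2 * np.pi * np.arange(M) / M
    # S = sum_n I_n * exp(-i delta_n); the DC background cancels over the steps.
    S = sum((1.0 + np.cos(true + d) + rng.normal(0.0, sigma, samples))
            * np.exp(-1j * d) for d in deltas)
    err = np.angle(np.exp(1j * (np.angle(S) - true)))   # wrapped phase error
    return float(err.var())

v4 = phase_error_var(4, 0.1)
v16 = phase_error_var(16, 0.1)
print(v4 / v16)   # close to 16/4 = 4, as predicted by the 1/M law
```

For small noise the phase-error variance of the M-step least-squares estimator is 2σ²/(M b²), so quadrupling M quarters the variance, which the ratio above confirms empirically.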
b) How many digital CCD images are needed to obtain each phase-stepped cylindrical fringe image I(z, φ, n) of the solid? We need N CCD images to assemble N central CCD lines (x = 0) into a single fringe image I(z, φ, n). For our Rubik's cube, each rotation provided 400 CCD lines per phase-stepped fringe image I(z, φ, n); the azimuthal resolution increment was Δφ = 2π/400. c) Which unwrapping algorithm was used? Given the low noise of the projected fringes and the 4-step least-squares phase-shifting demodulation used, we employed the most basic phase-unwrapping algorithm: line integration of wrapped phase differences.
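The unwrapping step mentioned in (c) amounts to one line of arithmetic per pixel. The sketch below is ours, not the authors' code: it integrates wrapped phase differences along a line and recovers the continuous phase exactly whenever neighboring samples differ by less than π.

```python
import numpy as np

def unwrap_line(wrapped):
    """Unwrap a 1-D phase line by integrating wrapped phase differences."""
    d = np.angle(np.exp(1j * np.diff(wrapped)))   # differences wrapped to (-pi, pi]
    return wrapped[0] + np.concatenate(([0.0], np.cumsum(d)))

# Usage: a smooth phase ramp covering several multiples of 2*pi.
true = np.linspace(0.0, 30.0, 1000)
wrapped = np.angle(np.exp(1j * true))             # wrapped into (-pi, pi]
recovered = unwrap_line(wrapped)
print(np.allclose(recovered, true))               # True: all steps are well below pi
```

This is the same operation `numpy.unwrap` performs; writing it out makes explicit why low-noise fringes matter: a single noisy sample whose wrapped difference exceeds π propagates a 2π error down the whole integration line.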