Analysis of the depth of field of integral imaging displays based on wave optics

In this paper, we analyze the depth of field (DOF) of integral imaging displays based on wave optics. Taking the diffraction effect into consideration, we analyze the intensity distribution of light through multiple micro-lenses and derive a DOF calculation formula for the integral imaging display system. We study the variation of the DOF with different system parameters. Experimental results are provided to verify the accuracy of the theoretical analysis. The analyses and experimental results presented in this paper could be beneficial for better understanding and designing integral imaging displays.

©2013 Optical Society of America

OCIS codes: (100.6890) Three-dimensional image processing; (110.2990) Image formation theory; (110.3055) Information theoretical analysis; (350.5500) Propagation.

Received 10 Sep 2013; revised 4 Dec 2013; accepted 5 Dec 2013; published 11 Dec 2013. 16 December 2013 | Vol. 21, No. 25 | DOI:10.1364/OE.21.031263 | OPTICS EXPRESS 31263

References and links
1. G. Lippmann, "La photographie integrale," C. R. Acad. Sci. 146, 446–451 (1908).
2. M. Cho, M. Daneshpanah, I. Moon, and B. Javidi, "Three-dimensional optical sensing and visualization using integral imaging," Proc. IEEE 99(4), 556–575 (2011).
3. C. C. Ji, C. G. Luo, H. Deng, D. H. Li, and Q. H. Wang, "Tilted elemental image array generation method for moiré-reduced computer generated integral imaging display," Opt. Express 21(17), 19816–19824 (2013).
4. X. Xiao, B. Javidi, M. Martínez-Corral, and A. Stern, "Advances in three-dimensional integral imaging: sensing, display, and applications," Appl. Opt. 52(4), 546–560 (2013).
5. Y. Takaki, K. Tanaka, and J. Nakamura, "Super multi-view display with a lower resolution flat-panel display," Opt. Express 19(5), 4129–4139 (2011).
6. J. Nakamura, T. Takahashi, C. W. Chen, Y. P. Huang, and Y. Takaki, "Analysis of longitudinal viewing freedom of reduced-view super multi-view display and increased longitudinal viewing freedom using eye-tracking technique," J. Soc. Inf. Disp. 20(4), 228–234 (2012).
7. S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, "An updatable holographic three-dimensional display," Nature 451(7179), 694–698 (2008).
8. H. E. Ives, "Optical properties of a Lippmann lenticulated sheet," J. Opt. Soc. Am. 21(3), 171–176 (1931).
9. C. B. Burckhardt, "Optimum parameters and resolution limitation of integral photography," J. Opt. Soc. Am. 58(1), 71–74 (1968).
10. T. Okoshi, Three-Dimensional Imaging Techniques (Academic, 1976).
11. L. Yang, M. McCormick, and N. Davies, "Discussion of the optics of a new 3-D imaging system," Appl. Opt. 27(21), 4529–4534 (1988).
12. F. Okano, J. Arai, K. Mitani, and M. Okui, "Real-time integral imaging based on extremely high resolution video system," Proc. IEEE 94(3), 490–501 (2006).
13. J. Arai, F. Okano, H. Hoshino, and I. Yuyama, "Gradient-index lens-array method based on real-time integral photography for three-dimensional images," Appl. Opt. 37(11), 2034–2045 (1998).
14. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, "Analysis of resolution limitation of integral photography," J. Opt. Soc. Am. A 15(8), 2059–2065 (1998).
15. T. Mishina, "3D television system based on integral photography," in Picture Coding Symposium (PCS) (Nagoya, Japan, 2010), p. 20.
16. J. Arai, F. Okano, M. Kawakita, M. Okui, Y. Haino, M. Yoshimura, M. Furuya, and M. Sato, "Integral three-dimensional television using a 33-megapixel imaging system," J. Disp. Technol. 6(10), 422–430 (2010).
17. D. Aloni, A. Stern, and B. Javidi, "Three-dimensional photon counting integral imaging reconstruction using penalized maximum likelihood expectation maximization," Opt. Express 19(20), 19681–19687 (2011).
18. M. Cho and B. Javidi, "Three-dimensional visualization of objects in turbid water using integral imaging," J. Disp. Technol. 6(10), 544–547 (2010).
19. B. Javidi, I. Moon, and S. Yeom, "Three-dimensional identification of biological microorganism using integral imaging," Opt. Express 14(25), 12096–12108 (2006).
20. J. Arai, H. Hoshino, M. Okui, and F. Okano, "Effects of focusing on the resolution characteristics of integral photography," J. Opt. Soc. Am. A 20(6), 996–1004 (2003).
21. A. Ö. Yöntem and L. Onural, "Integral imaging using phase-only LCoS spatial light modulators as Fresnel lenslet arrays," J. Opt. Soc. Am. A 28(11), 2359–2375 (2011).
22. F. Okano, J. Arai, H. Hoshino, and I. Yuyama, "Three-dimensional video system based on integral photography," Opt. Eng. 38(6), 1072–1077 (1999).
23. J. H. Park, K. Hong, and B. Lee, "Recent progress in three-dimensional information processing based on integral imaging," Appl. Opt. 48(34), H77–H94 (2009).
24. R. Martinez-Cuenca, G. Saavedra, M. Martinez-Corral, and B. Javidi, "Progress in 3-D multiperspective display by integral imaging," Proc. IEEE 97(6), 1067–1077 (2009).
25. R. Yang, X. Huang, S. Li, and C. Jaynes, "Toward the light field display: autostereoscopic rendering via a cluster of projectors," IEEE Trans. Vis. Comput. Graph. 14(1), 84–96 (2008).
26. B. Javidi, S. H. Hong, and O. Matoba, "Multidimensional optical sensor and imaging system," Appl. Opt. 45(13), 2986–2994 (2006).
27. H. Liao, N. Hata, S. Nakajima, M. Iwahara, I. Sakuma, and T. Dohi, "Surgical navigation by autostereoscopic image overlay of integral videography," IEEE Trans. Inf. Technol. Biomed. 8(2), 114–121 (2004).
28. H. Deng, Q. H. Wang, Y. H. Tao, D. H. Li, and F. N. Wang, "Realization of undistorted and orthoscopic integral imaging without black zone in real and virtual fields," J. Disp. Technol. 7(5), 255–258 (2011).
29. J. Y. Son, W. H. Son, S. K. Kim, K. H. Lee, and B. Javidi, "Three-dimensional imaging for creating real-world-like environments," Proc. IEEE 101(1), 190–205 (2013).
30. Y. Igarashi, H. Murata, and M. Ueda, "3D display system using a computer generated integral photography," Jpn. J. Appl. Phys. 17(9), 1683–1684 (1978).
31. A. Stern and B. Javidi, "Three-dimensional image sensing, visualization, and processing using integral imaging," Proc. IEEE 94(3), 591–607 (2006).
32. C. G. Luo, C. C. Ji, F. N. Wang, Y. Z. Wang, and Q. H. Wang, "Crosstalk-free integral imaging display with wide viewing angle using periodic black mask," J. Disp. Technol. 8(11), 634–638 (2012).
33. R. Martínez-Cuenca, H. Navarro, G. Saavedra, B. Javidi, and M. Martinez-Corral, "Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system," Opt. Express 15(24), 16255–16260 (2007).
34. C. W. Chen, Y. P. Huang, P. Y. Hsieh, T. H. Jen, Y. C. Chang, Y. R. Su, M. Cho, X. Xiao, and B. Javidi, "Enlarged viewing angle of integral imaging system by liquid crystal prism," Proc. SID Symp. Dig. 44(1), 231–234 (2013).
35. H. Navarro, J. C. Barreiro, G. Saavedra, M. Martínez-Corral, and B. Javidi, "High-resolution far-field integral-imaging camera by double snapshot," Opt. Express 20(2), 890–895 (2012).
36. R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, "Extended depth-of-field 3-D display and visualization by combination of amplitude-modulated microlenses and deconvolution tools," J. Disp. Technol. 1(2), 321–327 (2005).
37. S. W. Min, J. Kim, and B. Lee, "New characteristic equation of three-dimensional integral imaging system and its applications," Jpn. J. Appl. Phys. 44(2), L71–L74 (2005).
38. C. G. Luo, Q. H. Wang, H. Deng, X. X. Gong, L. Li, and F. N. Wang, "Depth calculation method of integral imaging based on Gaussian beam distribution model," J. Disp. Technol. 8(2), 112–116 (2012).
39. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005), Chap. 5.
40. F. Okano, J. Arai, and M. Kawakita, "Wave optical analysis of integral method for three-dimensional images," Opt. Lett. 32(4), 364–366 (2007).
41. F. A. Jenkins and H. E. White, Fundamentals of Optics, 4th ed., S. Grall, ed. (McGraw-Hill, 2001), Part 1.
42. M. Born and E. Wolf, Principles of Optics, 7th ed. (Cambridge University, 1999), Chap. VIII.
43. J. S. Jang and B. Javidi, "Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes," Opt. Lett. 28(20), 1924–1926 (2003).
44. J. E. Greivenkamp, "Airy disk," in Field Guide to Geometrical Optics (SPIE, 2004).
45. B. Javidi, F. Okano, and J. Y. Son, Three-Dimensional Imaging, Visualization, and Display (Springer, 2009).
46. G. Park, J. H. Jung, K. Hong, Y. Kim, Y. H. Kim, S. W. Min, and B. Lee, "Multi-viewer tracking integral imaging system and its viewing zone analysis," Opt. Express 17(20), 17895–17908 (2009).
47. ITU-R Rec. BT.1438, Subjective assessment of stereoscopic television pictures (2000).
48. M. Cho and B. Javidi, "Optimization of 3D integral imaging system parameters," J. Disp. Technol. 8(6), 357–360 (2012).
49. D. Shin, M. Daneshpanah, and B. Javidi, "Generalization of three-dimensional N-ocular imaging systems under fixed resource constraints," Opt. Lett. 37(1), 19–21 (2012).


1. Introduction
Integral imaging is a 3D technique that was proposed by Lippmann in 1908 [1]. It provides full-parallax, continuous-viewing 3D images and requires neither special glasses nor coherent light, unlike multi-view auto-stereoscopic 3D displays or holography [2–7]. Integral imaging has attracted extensive attention for research and application in the area of 3D sensing and display.
However, integral imaging displays are affected by diffraction and aberration effects in both the optical pickup and display processes. To eliminate the effects of diffraction and aberration in the optical pickup process, computer-generated integral imaging can be applied [30]. It generates the elemental images with computer graphics techniques from the parameters of a virtual micro-lens array (MLA), without a real optical system. Because of its flexibility and diffraction-free pickup, computer-generated integral imaging has been adopted in many fields, such as entertainment and medical science. In such cases, the deterioration introduced in the pickup process can be ignored, and only the display process needs to be considered when analyzing the viewing performance of the system. In this paper, we analyze the depth of field (DOF) using computer-generated integral imaging.
The main parameters used to evaluate the performance of an integral imaging display are the DOF, the viewing resolution, and the viewing angle [31–36]. In this paper, we focus on the analysis of the DOF of integral imaging displays using wave optics. First, we analyze the number of correspondence pixels that are responsible for reconstructing the same image point in 3D space. Then, by analyzing the wave propagation and light intensity distribution with multiple micro-lenses, we perform the DOF calculation based on hyperbola fitting and the minimum angular resolution of human eyes. By taking the diffraction effect into consideration and superposing the light intensity of multiple micro-lenses, we obtain more accurate DOF results than the previous methods for the integral imaging display. Experimental results are also given to verify the accuracy of the proposed method.
This paper is organized as follows. In section 2, we review two existing DOF calculation methods, based on geometrical optics and on a Gaussian beam distribution model [37,38]. In section 3, we propose a DOF calculation method using wave optics [39,40]. In section 4, we calculate several groups of DOF values using the proposed method and analyze the variations of the DOF with different system parameters. In section 5, the accuracy of the method is verified by comparing the experimental results of the proposed method with those of the previous methods. In section 6, we summarize the main results and conclude with future directions.

2. Previous DOF calculation methods
One of the most important parameters that set 3D displays apart from 2D displays is the DOF. To understand the DOF of integral imaging displays, Min et al. [37] performed an analysis using geometrical optics, without considering the diffraction caused by the MLA. Because the integrated image goes out of focus when it is located away from the central depth plane (CDP), the DOF can be set as the distance between the rear and front marginal depth planes, where the ray-optical focusing error occurs due to the overlap of the image pixels:

$$\mathrm{DOF_{Geom}} = \frac{2 p_d l^2}{pg}, \qquad (1)$$

where p and p_d represent the pitch of the MLA and the pixel size of the display, respectively, g represents the gap between the display and the MLA, and l represents the conjugate distance defined by the Gauss lens law [41].
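The geometric estimate above can be sketched numerically. This is a minimal illustration assuming the reconstructed form of Eq. (1), DOF_Geom = 2·p_d·l²/(p·g), with hypothetical display parameters (not the paper's Table values):

```python
def dof_geometrical(p, p_d, g, f):
    """Geometric-optics DOF estimate (assumed form of Eq. (1)); lengths in mm."""
    l = f * g / (g - f)        # conjugate distance from the Gauss lens law 1/g + 1/l = 1/f
    pixel_image = p_d * l / g  # magnified pixel size on the CDP
    dz = pixel_image * l / p   # defocus at which rays from adjacent lenses separate by one image pixel
    return 2 * dz              # rear plus front marginal half-depths

# hypothetical RPII parameters: 1.0 mm lens pitch, 0.1 mm display pixels
print(round(dof_geometrical(p=1.0, p_d=0.1, g=3.7, f=3.3), 2))  # -> 50.37
```

The formula is monotone in every parameter, which is one reason the geometric model cannot reproduce the non-monotonic behavior reported later for the wave-optics DOF.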
A DOF calculation method based on a Gaussian beam distribution model was also proposed [38]. In that model, the light emanating from a single pixel of the display is treated as a Gaussian beam whose waist is located at the CDP. Taking into account the minimum angular resolution of human eyes (α_e) [42], the DOF is obtained by locating the rear and front marginal depth planes, where the Gaussian beam radius reaches the spot size the eye can just resolve:

$$w(z) = w_0\sqrt{1+\left(\frac{z-l}{z_R}\right)^2} = (d+l-z)\tan\frac{\alpha_e}{2}, \qquad (2)$$

where w_0 and z_R are the waist radius and Rayleigh range derived from the geometric relationships of the display, and d refers to the viewing distance between the observer and the CDP. These DOF calculation methods are easy to use but suffer from low accuracy, because both neglect the diffraction effect and take only a single micro-lens into consideration. Even though the second method tries to approximate the actual light intensity distribution in the image space, its Gaussian beam functions are derived from the geometric relationships of the integral imaging display, which limits the accuracy of the resulting DOF values.

3. DOF calculation method based on wave optics
Here, we propose another approach for DOF calculation using wave optics. First, we analyze the number of correspondence pixels that are responsible for reconstructing the same image point in 3D space. Then, by analyzing the wave propagation and light intensity distribution with multiple micro-lenses, we perform the DOF calculation based on hyperbola fitting and the minimum angular resolution of human eyes.
Note that for an integral imaging display, the pixel size of the display device strongly influences the DOF [37,38]. In our analysis, however, we regard the display pixels as ideal points for simplicity. We consider only square micro-lenses in this paper, but it is not difficult to extend the proposed method to other micro-lens shapes. In addition, our study is based on resolution priority integral imaging (RPII) displays [43]; the derivations for depth priority integral imaging (DPII) displays [43] are similar. For an RPII display, the correspondence pixels are located on different elemental images and are imaged by their corresponding micro-lenses to reconstruct the same image point on the conjugate plane, as shown in Fig. 1. Suppose the MLA contains M × N micro-lenses, and denote the central micro-lens as the 0th micro-lens. To describe the light propagation in the image space, we set up a coordinate system xyz whose z axis coincides with the optical axis of the 0th micro-lens and whose origin is located at the center of that lens. A plane coordinate system x0y0 is also used to describe the MLA plane (z = 0).

3.1 Number of correspondence pixels of the RPII display
We take an image point A located on the z axis as an example. Assuming the wth elemental image is the critical one that contains a correspondence pixel for image point A, the geometric relationships give

$$w = \left\lfloor \frac{l}{2g} \right\rfloor, \qquad (3)$$

where ⌊x⌋ denotes the greatest integer less than or equal to x, f and g are defined as the focal length and the gap between the display and the MLA, respectively, and l = fg/(g − f) by the Gauss lens law. The number of correspondence pixels (H × V) for image point A is then given by

$$H = \min\left(2w+1,\ \min(M,N)\right), \qquad V = \min\left(2w+1,\ \max(M,N)\right), \qquad (4)$$

where min(M, N) and max(M, N) give the smaller and the larger of M and N, respectively.
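The counting rule can be sketched in a few lines. The form w = ⌊l/(2g)⌋ is an assumed reconstruction of Eq. (3), but it can be sanity-checked against the parameters quoted later for Fig. 3 (f = 4.0 mm, g = 5.0 mm), which should yield H × V = 5 × 5:

```python
import math

def correspondence_pixels(f, g, M, N):
    """Assumed reconstruction of Eqs. (3)-(4); f, g in mm, M x N lenses."""
    l = f * g / (g - f)              # conjugate distance (Gauss lens law)
    w = math.floor(l / (2 * g))      # index of the critical elemental image
    H = min(2 * w + 1, min(M, N))    # truncated by the MLA size if necessary
    V = min(2 * w + 1, max(M, N))
    return w, H, V

print(correspondence_pixels(4.0, 5.0, 110, 110))  # -> (2, 5, 5)
```

With f = 4.0 and g = 5.0 the conjugate distance is l = 20 mm, giving w = 2 and 2w + 1 = 5 correspondence pixels per dimension, consistent with the example in section 3.3.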
Among the H × V correspondence pixels on the display panel, the coordinates (x_D^mn, y_D^mn) of an arbitrary correspondence pixel D_mn can be expressed, from the geometry shown in Fig. 1, as

$$x_D^{mn} = \frac{l+g}{l}\,mp, \qquad y_D^{mn} = \frac{l+g}{l}\,np.$$

After obtaining the number of correspondence pixels, we analyze the light intensity distribution in the image space of the RPII display.

3.2 Light intensity distribution of the RPII display
For the RPII display shown in Fig. 2, the pupil function of the central micro-lens can be given as

$$p_0(x_0, y_0) = \mathrm{rect}\left(\frac{x_0}{p}\right)\mathrm{rect}\left(\frac{y_0}{p}\right). \qquad (5)$$

The pupil functions of the other micro-lenses can be expressed as

$$p_{mn}(x_0, y_0) = p_0(x_0 - mp,\ y_0 - np). \qquad (6)$$

Since each correspondence pixel is a δ function (point source), the phase transformation of the corresponding micro-lens is given by

$$t_{mn}(x_0, y_0) = \exp\left\{-\frac{ik}{2f}\left[(x_0 - mp)^2 + (y_0 - np)^2\right]\right\}. \qquad (7)$$

For a single correspondence pixel D_mn(x_D^mn, y_D^mn), we can obtain the light intensity distribution at an arbitrary point C(x, y, z) on a certain depth plane by using the paraxial approximation and Fresnel diffraction theory:

$$I_{mn}(x,y,z) = \left|\frac{1}{i\lambda z}\iint p_{mn}\, t_{mn} \exp\left\{\frac{ik}{2g}\left[(x_0 - x_D^{mn})^2 + (y_0 - y_D^{mn})^2\right]\right\} \exp\left\{\frac{ik}{2z}\left[(x - x_0)^2 + (y - y_0)^2\right]\right\} dx_0\, dy_0\right|^2, \qquad (8)$$

where λ is the mean wavelength and k = 2π/λ is the wave number. For multiple correspondence pixels, the intensity distributions are superimposed linearly. Therefore, the light intensity distribution of the RPII display can be expressed as

$$I(x,y,z) = \sum_{m}\sum_{n} I_{mn}(x,y,z). \qquad (9)$$
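The superposition can be evaluated numerically. The following is a 1-D sketch of the reconstructed Eqs. (5)–(9) with the parameters quoted for Fig. 3; the pixel positions and equation forms are assumptions rather than the paper's exact implementation, and constant prefactors are dropped:

```python
import numpy as np

# All lengths in mm; parameters follow the Fig. 3 example (f = 4, g = 5, p = 0.8).
lam = 5.5e-4                       # mean wavelength
k = 2 * np.pi / lam
f, g, p = 4.0, 5.0, 0.8
l = f * g / (g - f)                # conjugate plane at z = 20 mm
n_lens = 2                         # lenses m = -2..2, i.e. H = 5
x0 = np.linspace(-(n_lens + 0.5) * p, (n_lens + 0.5) * p, 4001)  # MLA plane samples
dx0 = x0[1] - x0[0]

def intensity(x, z):
    """Incoherent sum over correspondence pixels of |Fresnel integral|^2."""
    I = np.zeros_like(x)
    for m in range(-n_lens, n_lens + 1):
        xs = m * p * (l + g) / l                   # pixel that images to the on-axis point A
        aperture = np.abs(x0 - m * p) <= p / 2     # rect pupil of the m-th lens
        u0 = aperture * np.exp(1j * k * (x0 - xs) ** 2 / (2 * g)) \
                      * np.exp(-1j * k * (x0 - m * p) ** 2 / (2 * f))
        fresnel = np.exp(1j * k * (x[:, None] - x0[None, :]) ** 2 / (2 * z))
        u = (fresnel * u0[None, :]).sum(axis=1) * dx0
        I += np.abs(u) ** 2                        # incoherent superposition, Eq. (9)
    return I

x = np.linspace(-1.0, 1.0, 401)
I_cdp = intensity(x, l)            # sharply peaked at the image point x = 0
I_off = intensity(x, 26.0)         # broader spot away from the CDP
print(x[np.argmax(I_cdp)])
```

At z = l the quadratic phases cancel exactly (1/g + 1/l = 1/f) and each lens contributes a diffraction-limited spot at x = 0; away from the conjugate plane the five contributions separate and the pattern widens, which is the behavior Fig. 3 illustrates.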

3.3 DOF of the RPII display
In the following, we consider only one dimension for simplicity; the discussion for the other dimension is identical because the micro-lenses are square. For a given RPII display, the diffraction intensity patterns on different depth planes can be obtained by using Eqs. (8) and (9); Figs. 3(a) and 3(b) show the patterns at two depth planes as examples. For a given diffraction intensity pattern, we replace the entire pattern with a square light spot that contains a substantial percentage (84%) of the whole pattern energy [44]. The pattern is thus simplified to a square light spot with half side length r(z), determined by

$$\int_{-r(z)}^{r(z)}\int_{-r(z)}^{r(z)} I(x,y,z)\, dx\, dy = 0.84 \iint I(x,y,z)\, dx\, dy. \qquad (10)$$

The discrete data of the half side length r(z) are then fitted to a hyperbola whose waist is located on the conjugate plane, as denoted by the red lines in Fig. 4. Taking into consideration the minimum angular resolution of human eyes (α_e), we can calculate the DOF from the intersections of the hyperbola with the margin lines of α_e (blue lines in Fig. 4). Here, d represents the viewing distance between the eye and the conjugate plane. After obtaining the hyperbola function, we can deduce the DOF as follows. The hyperbola fitting function of the light beam can be written as

$$r(z) = a\sqrt{1+\frac{(z-l)^2}{b^2}}, \qquad (11)$$

where a and b are given by the fitting results. The margin lines of the minimum angular resolution of human eyes (α_e) are given by

$$r(z) = (d + l - z)\tan\frac{\alpha_e}{2}. \qquad (12)$$

Combining Eqs. (11) and (12), we can determine the locations z_front and z_rear of the rear and front marginal depth planes. Therefore, the DOF of the RPII display can be defined as the distance between the two marginal depth planes:

$$\mathrm{DOF_{Wave}} = z_{\mathrm{rear}} - z_{\mathrm{front}}. \qquad (13)$$
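The fit-and-intersect step can be sketched as follows. The r(z) samples below are synthetic stand-ins (not measured diffraction data), and the margin-line form r = (d + l − z)·tan(α_e/2) is an assumption consistent with the reconstruction above; combining it with the hyperbola gives a quadratic in u = z − l:

```python
import numpy as np

l, d = 20.0, 2500.0                      # conjugate plane, viewing distance (mm)
alpha_e = np.deg2rad(1.662e-2)           # minimum angular resolution of the eye
t = np.tan(alpha_e / 2)

z = np.linspace(l - 10.0, l + 10.0, 21)
r_data = 0.02 * np.sqrt(1 + ((z - l) / 4.0) ** 2)   # hypothetical half widths r(z)

# Least-squares fit of r^2 = c0 + c1 (z - l)^2, linear in c0 = a^2, c1 = (a/b)^2.
A = np.vstack([np.ones_like(z), (z - l) ** 2]).T
c0, c1 = np.linalg.lstsq(A, r_data ** 2, rcond=None)[0]
a_fit, b_fit = np.sqrt(c0), np.sqrt(c0 / c1)

# Intersections of Eq. (11) with Eq. (12): a quadratic in u = z - l.
coef = [a_fit ** 2 / b_fit ** 2 - t ** 2, 2 * t ** 2 * d, a_fit ** 2 - t ** 2 * d ** 2]
u_front, u_rear = np.sort(np.real(np.roots(coef)))
dof_wave = u_rear - u_front              # Eq. (13): distance between marginal planes
print(round(dof_wave, 1))
```

Because the margin lines converge toward the eye, the front marginal plane lies slightly farther from the waist than the rear one, so the DOF is not exactly symmetric about the conjugate plane.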

4. DOF calculation results based on wave optics
After obtaining the DOF calculation formula, we can use it to analyze the variation of the DOF with different parameters. In the calculation process, we assume that the number of correspondence pixels (H × V) always satisfies H = V = 2w + 1, i.e., the MLA contains enough micro-lenses that the correspondence pixels are never truncated at its boundary. Note that H × V is determined by the focal length f and the gap g according to Eqs. (3) and (4): once f and g are given, H × V is fixed. The relationship between the viewing distance d and DOF_Wave is given by Eq. (13), which shows that DOF_Wave increases monotonically as d increases. Therefore, we only discuss the variations of DOF_Wave with the focal length f, the gap g, and the MLA pitch p in this paper. Table 1 lists the parameters used in this section. The minimum angular resolution of human eyes is set to α_e = 1.662 × 10⁻²° [42], and the mean wavelength is λ = 5.5 × 10⁻⁴ mm. DOF_Wave values have been calculated with different specifications according to Eq. (13); Tables 2–4 show the variations of DOF_Wave with the focal length f, the gap g, and the MLA pitch p, respectively.
From the results shown above, we can see that the values of DOF_Wave depend strongly on the focal length f, the gap g, and the micro-lens pitch p. The variations of DOF_Wave are not monotonic as these parameters change, and optimal DOF_Wave values can be determined by multi-parameter optimization [48,49].
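One reason the dependence on f is not simple is that f changes both the conjugate distance and the number of contributing lenses at once. This hypothetical sweep (using the reconstructed relations for l and w from section 3.1, not the full diffraction pipeline) illustrates how the inputs to the DOF calculation shift with f at the Table 2 gap of g = 8.0 mm:

```python
import math

g, M = 8.0, 110                      # gap (mm) and lens count per side (Table 2 / Table 5 values)
for f in (6.0, 6.5, 7.0, 7.5):       # hypothetical focal lengths in mm
    l = f * g / (g - f)              # conjugate distance (Gauss lens law)
    w = math.floor(l / (2 * g))      # critical elemental-image index
    H = min(2 * w + 1, M)            # correspondence pixels along one axis
    print(f"f = {f:.1f} mm: l = {l:.1f} mm, H = {H}")
```

As f approaches g, the conjugate plane moves away rapidly and H grows in integer jumps, so the superposed diffraction pattern, and hence DOF_Wave, need not vary monotonically with f.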

5. Experimental results
According to the above analyses, we designed and implemented an experiment to show the accuracy of the proposed method. To avoid light diffraction and aberration effects in the pickup process, we used Autodesk 3ds Max 2012 to generate the elemental image array, and the obtained elemental image array was printed on a piece of high-quality photo paper with a Brother HL-4150CDN color printer to implement the optical display. The reconstructed 3D images were then captured by a Canon EOS 5D Mark II camera at a viewing distance of d = 2500 mm. Table 5 shows the parameters used in the pickup and display experiments. To verify the proposed DOF calculation method, we compared its DOF values with those calculated by the geometrical optics and Gaussian beam methods, as shown in Table 6. Here, DOF_Geom, DOF_Gauss, and DOF_Wave refer to the DOF derived by the geometrical optics, Gaussian beam, and wave optics methods, respectively. The pickup process was conducted in a computer, free of diffraction and aberration effects, as shown in Fig. 5. The parameters used in the pickup process are listed in Table 5; the generated elemental image array contains 110 × 110 elemental images, each with a resolution of 40 × 40 pixels, as shown in Fig. 6(a). In the optical 3D display process, shown in Fig. 6(b), the elemental image array was printed on a piece of paper (serving as our display) and placed behind an MLA with parameters identical to the one used in the pickup process, at a separation of g = 3.7 mm. The other parameters are given in Table 5. Figure 7 shows the captured display results.
From the experimental results, we can see that the reconstructed 3D "florets" number 1, 2, 6, and 7, which lie within the theoretically predicted ranges of DOF_Gauss or DOF_Geom but outside the range of DOF_Wave, appear blurry. In contrast, "florets" number 3, 4, and 5, which lie within the predicted range of DOF_Wave (and also within the ranges of DOF_Gauss and DOF_Geom), appear clearer and smoother. Moreover, comparing Fig. 7(b) with Fig. 7(c) shows that 3D "florets" number 3, 4, and 5 are reconstructed more faithfully than the remaining ones, with less distortion.
These results demonstrate that multiple micro-lenses and diffraction strongly affect the DOF of an RPII display. Even though the pixel size of the display has been neglected, the proposed DOF calculation method based on wave optics is more accurate than both the geometrical optics and the Gaussian beam methods.

6. Conclusion
In this paper, we analyzed the DOF of integral imaging displays with multiple micro-lenses using wave optics. We first derived the light intensity distribution with multiple micro-lenses, and then calculated the DOF by combining hyperbola fitting of the diffraction intensity patterns with the minimum angular resolution of human eyes. With given system parameters, we determined several groups of DOF values and analyzed their variations under different conditions. Finally, optical experiments confirmed the accuracy of the proposed method. In future work, other parameters such as the viewing resolution and viewing angle will be studied, as well as the pixel size of the display panel. These analyses can help researchers better understand and design integral imaging displays.

Fig. 1. Number of correspondence pixels to reconstruct image point A in the RPII display.

Fig. 3. Diffraction intensity patterns at the depth planes z = 20.0 mm (the conjugate plane) and z = 26.0 mm, with parameters f = 4.0 mm, g = 5.0 mm, p = 0.8 mm, and the number of correspondence pixels H × V = 5 × 5.

Fig. 4. Analysis of the DOF of the RPII display based on wave optics, taking into account the minimum angular resolution of human eyes.

Table 1. Parameters used in the DOF calculation

λ — mean wavelength
l — conjugate distance of the MLA
α_e — minimum angular resolution of human eyes
f — focal length of the micro-lens
DOF_Wave — DOF derived by the wave optics method
d — viewing distance between the eye and the conjugate plane
g — gap between the display and the MLA
M × N — total number of micro-lenses
H × V — number of correspondence pixels

Table 2. Variation of DOF_Wave with different focal length f
Parameters: g = 8.0 mm, p = 0.8 mm, d = 2500 mm, α_e = 1.662 × 10⁻²°, λ = 5.5 × 10⁻⁴ mm

Fig. 7. (a) Different perspectives of the reconstructed 3D image, (b) enlarged view of the center viewpoint, and (c) 2D image of the original 3D "florets" number 4 for reference.