Concept of coherence aperture and pathways toward white light high-resolution correlation imaging

Self-interference correlation imaging is a recently discovered method that takes advantage of holographic reconstruction when spatially incoherent light is used. Although the temporal coherence of light significantly influences the resolution of the method, it has not yet been studied either theoretically or experimentally. We present the first systematic study of the resolution of broadband correlation imaging based on the concept of coherence-induced diffraction. We show that the physical limits of the resolution are reached in a non-dispersive experiment and that they can be examined by means of the coherence aperture, whose width depends on the coherence length of light and the optical path difference of the interfering waves. As the main result, the optimal configuration of the non-dispersive experimental system is found in which the sub-diffraction image resolution previously demonstrated for monochromatic light can be retained even when white light is used. Dispersion effects that prevent reaching the physical resolution limits are discussed and the dispersion sensitivity of the currently available experiments is examined. The proposed concept of the coherence aperture is verified experimentally and its generalization to the concept of the dispersion-induced aperture is suggested. As a challenge for future research, possible methods of dispersion elimination are outlined that allow the design of advanced optical systems enabling implementation of high-resolution white light correlation imaging.


Introduction
In recent years, imaging science has been enriched by new significant trends benefiting from advanced technologies for the shaping and detection of light and powerful numerical methods providing digital versions of the holographic imaging [1]. A spatial light modulator (SLM) has been successfully used in imaging experiments as a versatile diffractive element allowing shaping, splitting or spatial filtering of light waves. Its integration has given rise to a variety of new imaging techniques, such as the spiral phase contrast imaging [2] or the spatial light interference microscopy [3]. The SLM has also been utilized in digital holographic imaging based on the optical recording and numerical reconstruction of holograms taken in a coherent light [4].
Recently, considerable effort has been focused on methods allowing exploitation of the advantages of the holographic imaging when using a spatially incoherent light. In these methods, the image has been reconstructed from the correlation records acquired in special configurations including two-channel holographic microscopy [5], scanning holography [6] or Fresnel incoherent correlation holography (FINCH) [7]. A considerable application potential has been found especially in the self-interference correlation methods, which can be advantageously implemented in a simple and extremely stable common-path interferometer [8]. In these methods, two steps are necessary for image formation. Initially, the correlation holographic records of the three-dimensional (3D) object are captured in a spatially incoherent light and then processing of the records and the numerical image reconstruction are performed. A combination of the optical and numerical stages of the experiments provided new unexpected imaging options that were discovered in appropriate theoretical models and then demonstrated experimentally. In [9], the basic simulation model was proposed revealing a connection between the geometry of the experiment, the shape of the correlation records and the two-point resolution assuming a perfect temporal coherence of the used light. In [9,10], the violation of the Lagrange invariant was demonstrated and used for the discovery of the sub-diffraction resolution in nearly monochromatic light [11]. The role of the spatial coherence in the recording and reconstruction of the holograms was also investigated. It was verified that the imaging is realized in a mixed coherence regime, where the point correlation records are mutually uncorrelated, while their reconstruction is fully coherent [9]. This property was used for the first implementation of the spiral edge contrast enhancement using incoherent light [12].
In all studies demonstrating the performance of the incoherent correlation imaging, the high temporal coherence of light was required. In the experiments, the narrow spectral filters (SFs) with a width of several nanometers were used, which significantly attenuated the signal and deteriorated the signal-to-noise ratio. In this paper, the first detailed study of the broadband correlation imaging is presented, which explains a complicated connection of the image resolution with the temporal coherence of light, the optical path difference (OPD) of interfering waves and the dispersion effects. The concept of the point spread function (PSF) representing the basis of the standard imaging theory is substantially modified in the case of the correlation imaging, because the structure of the point image becomes dependent on the temporal coherence of light. The geometric aperture of the lens, which is important for the image resolution of the standard imaging, must be replaced by a more versatile aperture function involving the properties of the point correlation records. In the model of the incoherent correlation imaging, the observed object is composed of an infinite number of point sources that emit mutually uncorrelated light. Light waves originating from separate object points are captured by the optical system and each of them is divided into the signal and reference waves. These waves are spatially correlated and if their OPD does not exceed the coherence length (CL) of light, they interfere. The resolution of the reconstructed image is then directly determined by the lateral size of the interference pattern, which depends on the CL and the OPD and in current experiments also on the dispersion. The main objective of the paper is to explore the physical limits of the resolution that are reached in systems with eliminated dispersion. 
In this case, the lateral bounding of the correlation record is significantly influenced by the temporal coherence of light and therefore an envelope of the point correlation record can be interpreted as the coherence-induced aperture. This concept is used to explore the physical resolution limits of the broadband correlation imaging in both the standard geometry [11] and the dual lens configuration [13,14] and to reveal the pathways toward the high-resolution white light imaging. The inclusion of dispersion effects into the concept of coherence-induced aperture is also outlined as a challenge for further research together with the methods of dispersion elimination in advanced experiments.

Correlation imaging in different regimes of the temporal coherence of light and dispersion
The concept of coherence-induced diffraction proposed in the paper is applicable to all modes of correlation imaging but for its detailed elaboration the conditions of the temporal coherence of light and dispersion must be specified for the particular experiment. In this regard, four basic experimental regimes can be distinguished, which are briefly discussed.

Quasi-monochromatic imaging
In all theoretical studies on the principle of the correlation imaging, the monochromatic light with an infinite CL was assumed and also experiments demonstrating the high-resolution imaging were performed with nearly monochromatic light. This was achieved by using narrow SFs with the bandwidth not exceeding 10 nm [11,15]. The monochromatic light has a perfect temporal coherence, so that an infinite coherence aperture (CA) is obtained in the proposed concept of the coherence-induced diffraction. In this case, the resolution is limited by the aperture of the point correlation record whose size is determined by the geometrical overlapping of the signal and reference waves as shown in figure 1(a). A detailed study of this case was presented in [15], where the system with a perfect correlation overlapping was also designed.

Broadband imaging implemented by an ideal dispersion-free system
If a broadband light is used in correlation imaging, the image quality is affected not only by the temporal coherence of light, but also by the OPD and the dispersion effects. Because the dispersion is a technical problem that can be avoided by an optimal design of optical components, a dispersion-free system can be used as an appropriate approach for exploring the physical aspects of broadband correlation imaging. The analysis is then greatly simplified because the OPD is independent of the wavelength and the physical limits of the resolution can be found by applying the concept of the CA (figure 1(b)).

Broadband imaging affected by the spatial light modulator dispersion
In the correlation imaging, the records of the object are captured in an optical system that uses a microscope objective (MO) for collimation of light and the SLM for splitting of light waves. In the current experiments, the achromatic objective is used so that the dispersion caused by the diffraction of light at the SLM remains uncompensated and significantly influences the achievable resolution. For such experiments, the concept of the CA is still applicable but the calculation of the width of the aperture is more complicated because the OPD depends on the wavelength. In this case, the size of the aperture and therefore also the resolution are affected by the dispersion characteristics of the SLM as shown in figure 1(c). This problem deserves particular attention because the basic experimental configurations of the correlation imaging have different sensitivities to the SLM dispersion, as will be shown later in this paper. In [13,14], the double lens configuration was proposed, which provided a favorable OPD in comparison with the previously used standard method. The advantages of the dual lens geometry were demonstrated on low-resolution imaging using light with the spectral width of 80 nm, in which the resolution target with 10 lines per mm was close to the resolution limit [13]. As will be shown, the experiment must be optimally configured and the SLM dispersion compensated to fully exploit the advantages of the optimal OPD for the broadband high-resolution imaging.

Broadband imaging in feasible systems with compensated dispersion
Since the Abbe numbers of refractive and diffractive optical elements have opposite signs, an optimized system for the correlation imaging can be designed, in which the chromatic aberration of the refractive elements compensates the SLM dispersion caused by diffraction. Such a system is a challenge for further research, because it would allow implementation of the high-resolution white light imaging theoretically predicted for the non-dispersive system. This overview describes a sequence of research activities needed to explore the broadband correlation imaging and to ensure its applicability. The main aim of the paper is to establish a theoretical background of the correlation imaging, to explain the basic connection between the temporal coherence of light and the resolution and to determine the physical resolution limits in non-dispersive imaging. A simple analysis of the dispersion is also included and the procedures for its further detailed examination are suggested.

Concept of the coherence-induced aperture
Influence of the temporal coherence of light on the resolution can be explained when the correlation record of a point source that emits light with a specified spectrum is described in a non-dispersive approach assuming the stationarity and ergodicity of light. The divergent spherical wave emitted by a point source is captured by the optical system and divided into the signal and reference waves, which can be written as

$$ E_{s_j} = e_s\, \mathrm{FT}\{u(\nu) \exp[i(\varphi_s + \vartheta_j)]\}, \quad (1) $$
$$ E_r = e_r\, \mathrm{FT}\{u(\nu) \exp(i\varphi_r)\}, \quad (2) $$

where FT denotes the Fourier transform, e_s and e_r are the amplitudes, u is a random spectral component with frequency ν and ϑ_j, j = 1, 2, 3, are constant phase shifts used to remove the holographic twin image [8]. The phases φ_s and φ_r are given as φ_j = 2πνD_j/c, j = s, r, where c denotes the phase velocity of light in vacuo, and D_s and D_r are the optical paths of the signal and reference waves between the object and detection points given by the position vectors r and r_0, respectively. The intensity of the point correlation records is obtained by the time averaging,

$$ I_j = \langle |E_{s_j} + E_r|^2 \rangle. \quad (3) $$

As the Fourier components of light that belong to different frequencies are uncorrelated, we can write

$$ \langle u(\nu)\, u^*(\nu') \rangle = |g(\nu)|^2\, \delta(\nu - \nu'), \quad (4) $$

where δ and g denote the Dirac delta function and the power spectrum of the used source, respectively. The interference term carrying information about the recorded object point then can be written as the Fourier transform

$$ \langle E_{s_j} E_r^* \rangle \propto e_s e_r\, e^{i\vartheta_j} \int |g(\nu)|^2 \exp(i 2\pi\nu T)\, \mathrm{d}\nu, \quad (5) $$

where T = D/c is a time delay related to the OPD given as a difference of the optical paths of the signal and reference waves, D = D_s − D_r. If the interfering waves do not have any initial time delay, the point correlation records created in the broadband light can be written in the form

$$ I_j = \int |g(\nu)|^2\, I_j(\nu)\, \mathrm{d}\nu, \quad (6) $$

where I_j(ν) represents the monochromatic interference patterns given as

$$ I_j(\nu) = e_s^2 + e_r^2 + 2 e_s e_r \cos(2\pi\nu D/c + \vartheta_j). \quad (7) $$

To analyze the performance of the broadband correlation imaging, the spectral properties of light and the OPD must be specified.
In this paper, a Gaussian spectrum with the bandwidth 2Δν and the central frequency ν_0 is assumed,

$$ |g(\nu)|^2 = \exp\left[ -\frac{(\nu - \nu_0)^2}{\Delta\nu^2} \right]. \quad (8) $$

In the non-dispersive approach, the OPD is frequency independent, ∂D/∂ν = 0, and ⟨E_{s_j} E_r*⟩ is calculated analytically. The broadband point correlation records are given as

$$ I_j = e_s^2 + e_r^2 + 2 e_s e_r A_C \cos(\Phi + \vartheta_j), \quad (9) $$

where A_C denotes the degree of coherence depending on the CL and the OPD and Φ is the phase difference of interfering waves. In the experiments, I_j is successively recorded with the phase settings ϑ_j = 0, 2π/3 and 4π/3, and the holographic twin image is removed [8]. In this way, the point correlation records can be arranged into the form of the pupil function t of a diffractive lens originally defined in [9] for the monochromatic light. In the case of the broadband light it can be expressed as

$$ t = A_C \exp(i\Phi). \quad (10) $$

The phase difference Φ is quadratic in the radial spatial coordinate r and can be expressed by means of the focal length of the diffractive lens f_l depending on the shape of the signal and reference waves [9],

$$ \Phi = \frac{k_0 r^2}{2 f_l}, \quad (11) $$

where k_0 = 2πν_0/c. The lens is laterally bounded by the aperture A_C whose shape is influenced by the dependence of D on the spatial coordinates and can be written as

$$ A_C = \exp\left( -\frac{\pi D^2}{L^2} \right), \quad (12) $$

where L = c/(√π Δν) is the CL given by the Gaussian spectrum. This aperture is not given by geometrical constraints but is induced by the partial temporal coherence of light. Hence, it is called the CA throughout this paper. In the case of the monochromatic light with an infinite CL, the width of the CA is also infinite and the diffractive lens is bounded by the geometrical aperture (GA) given by the overlapping of interfering waves [15]. The size of the GA can be simply determined from geometrical constraints on the amplitudes e_s and e_r. As shown in (6), the CA can also be explained as a consequence of an incoherent superposition of the monochromatic point records I_j(ν) modified by the power spectrum of the source, |g(ν)|².
The superposition is illustrated in figure 2, which demonstrates a connection between the monochromatic and broadband imaging modes.
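The connection between the spectral width, the CL and the coherence aperture can be sketched numerically. Below is a minimal Python illustration of the Gaussian-spectrum model above; the helper names are ours and the parameter values are only illustrative:

```python
import numpy as np

C = 3.0e8  # vacuum speed of light (m/s)

def coherence_length(lambda0, full_width):
    """CL of the Gaussian spectrum, L = c/(sqrt(pi)*dnu), where dnu is the
    frequency half-width corresponding to the full spectral width 2*dlambda."""
    dlam = full_width / 2.0
    dnu = C * dlam / lambda0**2
    return C / (np.sqrt(np.pi) * dnu)

def coherence_aperture(D, L):
    """Gaussian envelope A_C = exp(-pi*D^2/L^2) bounding the point record;
    D is the OPD of the interfering waves."""
    return np.exp(-np.pi * (D / L)**2)

# a 20 nm wide spectrum at 632 nm gives a CL of a few tens of micrometres
L = coherence_length(632e-9, 20e-9)
print(L * 1e6)                     # CL in micrometres
print(coherence_aperture(L, L))    # envelope well below 1 once the OPD reaches L
```

For a spectral width of a few nanometres the same helper returns a CL of hundreds of micrometres, which is why narrow spectral filters preserve the monochromatic behaviour of the records.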

Physical resolution limits
The PSF representing the normalized intensity of the point image is of fundamental importance for evaluation of the resolution achieved in the optical system of specified parameters. In the correlation imaging based on the numerical reconstruction of the image, the PSF is given by the Fresnel transform of the pupil function (10) obtained from the holographic records of the object, I_N ∝ |FrT{t}|². The size of the diffraction limited image spot is then inversely proportional to the size of the CA given by (12) that laterally bounds the point correlation records. Because the width of the CA depends on both the CL and the OPD, a connection between the temporal coherence of light, the experiment geometry and the image resolution can be established by equation (12). The interconnection of L and D with the resolution is important for the experiments. It shows that the physical limits of the resolution obtained in a non-dispersive model of the imaging are not uniquely determined by the temporal coherence of light but also depend on the OPD, which may be favorably influenced by an appropriate geometry of the interfering waves. In this paper, two basic experimental configurations with the significantly different OPD will be examined and their physical resolution limits achievable in the broadband light determined.
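The reciprocal relation between the aperture width and the reconstructed spot can be illustrated numerically. The sketch below is our own simplification: a one-dimensional Gaussian aperture and a plain FFT standing in for the Fresnel transform; it shows that halving the aperture width doubles the width of the point image:

```python
import numpy as np

def focal_spot_std(w, span=20e-3, n=4096):
    """Standard deviation of the focal intensity |FT{A}|^2 for a Gaussian
    aperture A = exp(-x^2/w^2) of half-width w (in FFT frequency units)."""
    x = np.linspace(-span / 2, span / 2, n, endpoint=False)
    aperture = np.exp(-(x / w)**2)
    spot = np.abs(np.fft.fftshift(np.fft.fft(aperture)))**2
    f = np.fft.fftshift(np.fft.fftfreq(n, d=x[1] - x[0]))
    mean = np.sum(f * spot) / np.sum(spot)
    return np.sqrt(np.sum((f - mean)**2 * spot) / np.sum(spot))

wide, narrow = focal_spot_std(1e-3), focal_spot_std(0.5e-3)
print(narrow / wide)   # ~2: halving the aperture doubles the spot width
```

The same scaling holds for the coherence aperture: shortening L narrows A_C and therefore broadens the reconstructed PSF.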

Resolution limits in the standard configuration.
In this paper, the geometry originally proposed in [7,8] is regarded as the standard configuration. It operates with the interfering waves illustrated in figure 3(a). The observed 3D object illuminated by incoherent light is located near the focal plane of the MO so that the light waves emitted by individual object points are nearly collimated when hitting the SLM. It acts as a beam splitter (BS) that transmits part of the incident energy unchanged and transforms the remaining energy as a lens with the focal length f_d. In this way, each incident wave is divided into the reference and signal waves with the plane and spherical wavefronts. Three correlation records of the individual object points are created by interference of the plane and phase shifted spherical waves and by their processing, the pupil function of the diffractive lens (10) is obtained for each point of the object [9]. If the CA is examined for an axial object point placed near the focal plane of the collimating lens, the OPD can be written as

$$ D = \frac{r^2 \Delta_2}{2 f_l^2}, \quad (13) $$

where f_l = f_d − Δ₂ is the focal length of the diffractive lens, and r and Δ₂ denote the radial coordinate and the distance between the SLM and the CCD, respectively. The lens is bounded by the CA whose half-width r̄ can be defined by a decrease 1/e of the Gaussian profile (12). The numerical CA of the diffractive lens used for the image reconstruction is then given by NA_C = r̄/|f_l| and can be written as

$$ \mathrm{NA}_C = \left( \frac{2L}{\sqrt{\pi}\, \Delta_2} \right)^{1/2}. \quad (14) $$

When using the nearly monochromatic light, the CA exceeds the GA and the image aperture is determined by the overlapping area of interfering waves [15]. The numerical GA then can be expressed as

$$ \mathrm{NA}_G = \frac{R}{\max\{f_d, |f_d - \Delta_2|\}}, \quad (15) $$

where 2R is the size of the active area of the SLM. In specific settings, the OPD and the CL decide whether the CA or the GA determines the diffraction limit of imaging. To evaluate the image resolution, the effective numerical aperture is introduced as NA_E = min{NA_C, NA_G}.
Because the image is reconstructed coherently, the maximal resolved spatial frequency is determined as

$$ \nu_{\max} = \frac{m\, \mathrm{NA}_E}{\lambda_0}, \quad (16) $$

where λ₀ is the central wavelength and m is the lateral magnification. If the object is placed near the focal plane, the lateral magnification is given as m = Δ₂/f₀, where f₀ is the focal length of the used microscope objective (MO) [9]. As is obvious from (14)-(16), the CCD position Δ₂ is a crucial parameter influencing the resolution in both the monochromatic and broadband light. For the monochromatic light, the resolution is given by (16), where NA_E = NA_G is used. The dependence of the monochromatic resolution on the CCD position Δ₂ is illustrated by the dashed line in figure 4. The best resolution at the position (a) twice exceeds the diffraction resolution limit of the used MO, ν_max = 2ν₀, where ν₀ = NA₀/λ₀ and NA₀ = R/f₀ [11]. When light with a wider spectral width is used, the point correlation records are bounded by the CA and the resolution is influenced by the temporal coherence of light. The deterioration of the resolution manifests itself differently depending on the CCD setting. The strongest degradation of the resolution of the broadband imaging occurs just at the CCD position where the best monochromatic resolution is achieved. For the spectral width of 20 nm, the monochromatic resolution corresponding to the position (a) is reduced to the value given by the position (b) in figure 4. The dependence of the image resolution on the CCD position deserves a more detailed discussion for the setting Δ₂ > 2f_d. For the monochromatic light, the dependence of the resolution on Δ₂ can be written as ν_max ∝ Δ₂/|f_d − Δ₂|, so that the resolution decreases with increasing Δ₂. As is evident from (14) and (16), ν_max ∝ (LΔ₂)^{1/2} for the broadband light, so that the resolution increases with increasing Δ₂.
For light with specified spectral width, a CCD position can be found where the CA has the same size as the GA and equal resolution limits are reached for both monochromatic and broadband light. In figure 4, the dependence of the image resolution on the CCD position is illustrated by the solid lines for the central wavelength 632 nm and the spectral widths 20, 50 and 100 nm. The positions (c) and (d) indicate the settings where the GA has exactly the same size as the CA corresponding to the spectral width of 20 nm. Applying the condition NA_C = NA_G, the shortest CCD position can be determined for which the monochromatic resolution is maintained even when the spectral width of light is extended,

$$ (f_d - \Delta_2)^2 = \frac{\sqrt{\pi}\, R^2 \Delta_2}{2L}. \quad (17) $$

The distance Δ₂ is very large for small L, so that it is difficult to implement such a setting for broadband light with a short CL. The interconnection of the temporal coherence of light and the resolution can be used for determination of the permissible CL that allows one to reach the required resolution limits. The monochromatic resolution is given by the experimental parameters and the CCD position as ν_max = Δ₂NA_G/(f₀λ₀). The CL for which the resolution is preserved even if the temporal coherence of light is reduced can be estimated as

$$ L \geq \frac{\sqrt{\pi}\, \Delta_2\, \mathrm{NA}_G^2}{2}. \quad (18) $$

For the parameters of the experiment NA₀ = 0.28, f₀ = 20 mm and Δ₂ = 800 mm, the CL with L > 100 µm is required. This CL corresponds to the spectral width 2Δλ < 4 nm for the central wavelength λ₀ = 632 nm. It is important to note that the image resolution was examined in the non-dispersive model and should be considered as the physical resolution limit of the method. In the current experiments, the achromatic MO is usually used and the diffractive dispersion of the SLM remains uncompensated.
As will be shown later, the standard configuration has a weak sensitivity to dispersion so that the theoretically predicted CCD settings, in which the monochromatic resolution is either lost or retained, can be successfully verified even when the system with the SLM dispersion is used.
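The behaviour of the standard configuration discussed above can be sketched numerically. The relations used below are the ones as we have reconstructed them (NA_C = (2L/(√π Δ₂))^{1/2}, NA_G = R/max{f_d, |f_d − Δ₂|}, ν = Δ₂ min{NA_C, NA_G}/(f₀λ₀)); the parameter values follow the text and the helper names are ours:

```python
import numpy as np

# Parameters quoted in the text (metres); helper names are ours.
NA0, f0, fd, lam0 = 0.28, 20e-3, 400e-3, 632e-9
R = NA0 * f0                      # radius of the illuminated SLM area

def resolution(delta2, L=np.inf):
    """Resolved object frequency (lines/m) at CCD distance delta2 for
    coherence length L (L = inf reproduces the monochromatic curve)."""
    na_g = R / max(fd, abs(fd - delta2))                 # geometrical aperture
    na_c = np.sqrt(2 * L / (np.sqrt(np.pi) * delta2)) if np.isfinite(L) else np.inf
    return delta2 * min(na_c, na_g) / (f0 * lam0)

# monochromatic: best resolution at delta2 = 2*fd, then decreasing
nu_best = resolution(2 * fd)
nu0 = NA0 / lam0                  # diffraction limit of the MO
print(nu_best / nu0)              # ~2: sub-diffraction resolution

# broadband (L ~ 23 um for a 20 nm spectrum): grows as sqrt(L*delta2)
L20 = 22.5e-6
print(resolution(1.6, L20) > resolution(0.8, L20))

# permissible CL at delta2 = 2*fd: L >= sqrt(pi)*delta2*NA_G^2/2
na_g = R / fd
L_min = np.sqrt(np.pi) * 2 * fd * na_g**2 / 2
print(L_min * 1e6)                # above 100 micrometres, i.e. a few nm bandwidth
```

The two curves reproduce the qualitative behaviour of figure 4: the monochromatic resolution peaks at Δ₂ = 2f_d and then decays, while the broadband resolution grows with the CCD distance until the CA reaches the GA.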

Resolution limits in the dual lens configuration.
In this modification of the correlation imaging, the SLM operates as a dual lens with two different focal lengths f₁ and f₂, so that both the signal and reference waves are spherical (figure 3(b)). This geometry was proposed in [13,14], where the OPD was analyzed and its benefits were demonstrated on a low-resolution imaging. In this paper, a possibility to use the dual lens system for the high-resolution broadband imaging is examined. The physical limits allowing the sub-diffraction resolution in white light are discovered and the reasons that prevent achievement of the high resolution in currently used systems operating with the broadband light are explained.
The analysis is again based on the concept of the coherence-induced diffraction, which requires calculation of the OPD and the focal length of the diffractive lens f_l. Assuming that R²/f_j² ≪ 1, j = 1, 2, these quantities can be written as [13]

$$ D = \frac{r^2 \Delta_2}{2} \left( \frac{1}{\Delta f_1^2} - \frac{1}{\Delta f_2^2} \right), \quad (19) $$
$$ f_l = \frac{\Delta f_1 \Delta f_2}{f_2 - f_1}, \quad (20) $$

where Δf_j = |Δ₂ − f_j|, j = 1, 2. The numerical CA can be determined from (12) similarly as in the standard case,

$$ \mathrm{NA}_C = G \left( \frac{2L}{\sqrt{\pi}\, \Delta_2} \right)^{1/2}, \quad (21) $$

where

$$ G = \left( \frac{|f_2 - f_1|}{|f_1 + f_2 - 2\Delta_2|} \right)^{1/2}. \quad (22) $$

The image resolution is still given by (16), but the effective numerical aperture is defined as NA_E = min{NA_C, NA_G}. Equation (21) shows that the numerical CA of the dual lens system is determined as the numerical CA of the standard system multiplied by the factor G, which becomes infinite if Δf₁ = Δf₂ = Δf. In this case D = 0, and the size of the CA exceeds the GA for an arbitrary temporal coherence of light. This situation occurs with the CCD setting

$$ \Delta_{2W} = \frac{f_1 + f_2}{2}. \quad (23) $$

The correlation records captured at the optimal CCD position allow image reconstruction with the resolution given as

$$ \nu_W = \frac{2 \nu_0}{1 + \Delta f/\Delta_{2W}}, \quad (24) $$

where ν₀ denotes the diffraction resolution limit of the used MO. Equation (24) determines the physical resolution limit ν_W of the dual lens system, which can be achieved regardless of the temporal coherence of light, and therefore also for the white light. In figure 5, the point B corresponds to the best resolution for the broadband light. If the monochromatic light with the perfect temporal coherence is used, the optimal CCD setting is given as

$$ \Delta_{2M} = \frac{2 f_1 f_2}{f_1 + f_2}, \quad (25) $$

and the best resolution can be written as ν_M = 2ν₀ (point A in figure 5). The ratio of the highest resolution attainable in monochromatic and white light is therefore determined as

$$ \frac{\nu_M}{\nu_W} = 1 + \frac{\Delta f}{\Delta_{2W}}. \quad (26) $$

The ratio ν_M/ν_W ≥ 1 shows the degree of deterioration of the resolution which occurs when the white light is used instead of the monochromatic light. It depends on the geometry of the interfering waves, especially on the distance Δf between the CCD and the focal points of the lenses created by the SLM. Degradation of the resolution is weak if Δf is short.
In figure 5, the parameters Δf = 124.5 mm and Δ₂W = 524.5 mm are used, so that according to (26) the white light resolution ν_W is 1.2 times lower than the monochromatic resolution ν_M. In the limiting case Δf → 0, the monochromatic sub-diffraction resolution ν_M = 2ν₀ is fully retained even when white light is used. In this limit, Δ₂W = Δ₂M, and the optimal CCD position is the same for both the monochromatic and white light. In real experiments, the smallest value of Δf has certain technical limits [13], so that the CCD must be carefully positioned. The records taken with the broadband light at the position optimal for monochromatic light result in a significant loss of the resolution, as shown in figure 5.
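The trade-off can be checked with the numbers quoted above. A small sketch assuming the relations as we have reconstructed them (Δ₂W = (f₁ + f₂)/2, Δ₂M = 2f₁f₂/(f₁ + f₂), ν_M/ν_W = 1 + Δf/Δ₂W); variable names are ours:

```python
# Values quoted in the text (millimetres); relations as reconstructed above.
df = 124.5          # distance between the CCD and the SLM-lens focal points
delta2_w = 524.5    # optimal CCD position for white light: (f1 + f2)/2
f1, f2 = delta2_w - df, delta2_w + df   # implied SLM focal lengths

ratio = 1.0 + df / delta2_w             # nu_M / nu_W
delta2_m = 2 * f1 * f2 / (f1 + f2)      # optimal CCD position for monochromatic light

print(round(ratio, 2))     # ~1.24: white light resolution roughly 1.2x lower
print(round(delta2_m, 1))  # shorter than delta2_w
```

The two optima differ, which is why broadband records taken at the monochromatic optimum lose resolution; as Δf → 0 the two positions merge and the ratio tends to 1.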

Dispersion sensitivity of basic experiments
The physical resolution limits of the broadband correlation imaging depend on the CL and the OPD and can be reached only in the dispersion-free experiment. In the available experiments, the achromatic MO is used in combination with the SLM, so that the diffractive dispersion affects the quality of the reconstructed image. The concept of the coherence-induced diffraction used in this paper can be extended also to dispersion effects. In this approach, the resolution is determined by the dispersion-induced aperture whose width depends on the CL of light, the OPD and the dispersion properties of the SLM. To define the dispersion aperture mathematically, the integration (5) must be performed with the time delay T depending on the frequency. If a broadband light is used, the OPD of the interfering waves differs for separate frequency components and the dispersion properties of the experimental system must be considered when expressing D. Determination of the basic relations between the resolution and dispersion is beyond the scope of this paper and will be addressed elsewhere together with compensation of the SLM dispersion by refractive optics. Here, the spectral changes of the OPD are analyzed and used to evaluate the dispersion sensitivity of the basic experimental configurations of the correlation imaging.
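The generalized aperture can be evaluated numerically by performing the integration (5) with an arbitrary, possibly frequency dependent, delay. The sketch below (our own helper names and illustrative values) checks that for a frequency independent delay the numeric integral reproduces the Gaussian envelope exp(−πD²/L²):

```python
import numpy as np

C = 3.0e8                       # speed of light (m/s)
lam0 = 632e-9
nu0 = C / lam0
dnu = 7.5e12                    # frequency half-width of a ~20 nm Gaussian spectrum
L = C / (np.sqrt(np.pi) * dnu)  # coherence length, ~23 um

def aperture(T_of_nu, n=20001):
    """|integral |g|^2 exp(i 2 pi nu T(nu)) dnu| normalized to 1 at T = 0."""
    nu = np.linspace(nu0 - 6 * dnu, nu0 + 6 * dnu, n)
    g2 = np.exp(-((nu - nu0) / dnu)**2)
    phase = np.exp(1j * 2 * np.pi * nu * T_of_nu(nu))
    return abs(np.sum(g2 * phase)) / np.sum(g2)

D = L                                          # a fixed OPD equal to the CL
a_num = aperture(lambda nu: D / C + 0 * nu)    # non-dispersive: T constant
print(a_num, np.exp(-np.pi))                   # numeric vs analytic envelope
```

With a frequency dependent T(ν) the same integral yields the dispersion-induced aperture, which no longer has the simple Gaussian form (12).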

Dispersion sensitivity of the standard configuration.
In the standard geometry, the images of individual object points are reconstructed from the holographic records created by interference of the spherical signal wave and the reference plane wave. The interfering waves are generated by the SLM that acts as a diffraction splitter. A portion of the collimated input wave is released unchanged and a part of its energy is focused by the diffractive lens formed on the SLM (figure 3(a)). The SLM lens is designed in such a manner that the focal length f_d is created for the central wavelength λ₀. If the SLM is illuminated by the broadband light, the dispersion caused by diffraction occurs and the focal length of the SLM lens varies with the wavelength as f_dλ₀/λ. The OPD then depends on λ and Δ₂ and its maximal value can be determined as

$$ D_{\max}(\lambda, \Delta_2) = \frac{r_m^2 \Delta_2}{2 (f_d \lambda_0/\lambda - \Delta_2)^2}, \quad (27) $$

where r_m denotes the radius of the overlapping area of the interfering waves. The sensitivity of the experimental setup to the SLM dispersion can be assessed by the change of the OPD corresponding to the spectral interval Δλ,

$$ \Delta D_d = |D_{\max}(\lambda_0 + \Delta\lambda, \Delta_2) - D_{\max}(\lambda_0 - \Delta\lambda, \Delta_2)|. \quad (28) $$

In figure 6, the dependence of the OPD on Δ₂ is shown for the focal length of the SLM lens f_d = 400 mm and the central wavelength λ₀ = 632 nm (solid red line). The dispersion sensitivity ΔD_d is indicated by the error bars defined by the blue dashed lines corresponding to the wavelengths λ = λ₀ ± 50 nm. As is obvious, both the OPD for the central wavelength and the dispersion sensitivity increase with the increasing distance Δ₂. At the position Δ₂ = 2f_d, which provides the sub-diffraction resolution in monochromatic light, the OPD is more than three times larger compared to the OPD at Δ₂ = f_d/2, where the monochromatic imaging is not optimal. The dispersion effects can be estimated using the ratio of the change of the OPD caused by Δλ, ΔD_d, and the OPD corresponding to λ₀, D₀ ≡ D_max(λ₀),

$$ Q = \frac{\Delta D_d}{D_0}. \quad (29) $$

In the standard configuration, the OPD takes large values for the central wavelength so that the best monochromatic resolution is significantly reduced when light with decreased temporal coherence is used.
The dispersion effects do not affect this situation significantly because Q ≪ 1 for Δλ ≪ λ₀. As will be shown later, the theoretical predictions obtained in the dispersion-free model for light with slightly reduced temporal coherence can be verified experimentally even if the standard configuration with uncompensated SLM dispersion is used.

Dispersion sensitivity of the dual lens configuration.
In the dual lens geometry, both interfering waves are focused by the SLM lenses whose focal lengths vary with the wavelength as f_jλ₀/λ, j = 1, 2. The wavelength dependent OPD then can be written as

$$ D(\lambda, \Delta_2) = \frac{r^2 \Delta_2}{2} \left[ \frac{1}{(f_1 \lambda_0/\lambda - \Delta_2)^2} - \frac{1}{(f_2 \lambda_0/\lambda - \Delta_2)^2} \right]. $$

For correct use of this relation, the ray heights r₁ and r₂ of the interfering waves at the SLM must be determined so that they remain within the active area of the modulator. The dependence of the OPD on the detection position Δ₂ is illustrated in figure 7 for the central wavelength λ₀ = 632 nm (red solid line) and the wavelengths λ = λ₀ ± 50 nm (blue dashed lines). With the CCD settings Δ₂ < f₁ or Δ₂ > f₂, the OPD is relatively large for the central wavelength λ₀ but the dispersion sensitivity shown by the error bars is low. If the CCD is placed between the focal points of the SLM lenses, the situation is completely different. The OPD becomes very small for λ₀ and goes to zero at the optimal CCD setting, while the dispersion sensitivity is maximal there. To achieve the physical resolution limits of the broadband correlation imaging demonstrated in figure 5, the SLM dispersion must be eliminated.
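The maximal dispersion sensitivity at the optimal dual lens setting can be illustrated numerically. The sketch below uses our reconstruction of the dispersive OPD, D(λ) = (r²Δ₂/2)[1/(f₁λ₀/λ − Δ₂)² − 1/(f₂λ₀/λ − Δ₂)²], with the parameter values used in figure 5; helper names are ours:

```python
import numpy as np

lam0 = 632e-9

# dual lens parameters from the text (metres): f1 = 400 mm, f2 = 649 mm,
# CCD at the white light optimum delta2 = (f1 + f2)/2 = 524.5 mm
f1, f2, delta2, r = 0.4, 0.649, 0.5245, 2e-3

def opd(lam):
    """Dual lens OPD at radius r; SLM focal lengths scale as f*lam0/lam
    (our reconstruction of the dispersive model)."""
    g1 = f1 * lam0 / lam - delta2
    g2 = f2 * lam0 / lam - delta2
    return 0.5 * r**2 * delta2 * (1 / g1**2 - 1 / g2**2)

print(abs(opd(lam0)))          # zero at the central wavelength
print(abs(opd(lam0 + 50e-9)))  # ~1e-4 m: dispersion dominates off-center

# with a white light CL of a few micrometres the coherence envelope collapses
L_white = 4.5e-6
env = np.exp(-np.pi * (opd(lam0 + 50e-9) / L_white)**2)
print(env)
```

The envelope collapse illustrates why the dual lens optimum, although perfect in the non-dispersive model, is useless for white light until the SLM dispersion is compensated.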

Experimental results
To verify the physical resolution limits theoretically predicted in a dispersion-free approach, the standard configuration with a weak sensitivity to the SLM dispersion was used. In the setup illustrated in figure 3, the MO Melles Griot (f₀ = 20 mm, NA₀ = 0.28) and the SLM Hamamatsu (800 × 600 pixels) creating the lens with the focal length f_d = 400 mm were used. A halogen lamp (HL) supplemented by suitable spectral filters (SFs) was used as the light source. The experimental demonstration of the resolution illustrated by the theoretical curves in figure 4 was performed. In particular, the resolution decay in the broadband light corresponding to the points (a) and (b) and the resolution recovery achieved by the proper system adjustment (c) and (d) were successfully verified by both the point correlation records and the imaging of the USAF resolution targets. In the first part of the experiment, the correlation records of the point sources were captured for the spectral half-widths of 1 and 20 nm and the CCD settings marked in figure 4. As is apparent from figure 8, the CA induced by the broadband light (Δλ = 20 nm) is apparently narrower than the GA corresponding to the nearly monochromatic light (Δλ = 1 nm). The bounding of the correlation records by the coherence-induced aperture causes a decrease in resolution when the light with a lower temporal coherence is used. This is clearly demonstrated by reconstruction of the USAF resolution targets shown in the upper right part of figure 8. The correlation records captured at Δ₂ = 1650 mm show that the CA has the same size as the GA (figures 8(c) and (d)). The image resolution then remains unchanged even when the broadband light is used instead of the nearly monochromatic light. Preservation of the resolution is evident from the USAF resolution targets illustrated in the right lower part of figure 8, which were reconstructed in light with Δλ = 1 nm and 20 nm.
These experimental results are in a good agreement with the theoretical prediction demonstrated by points (c) and (d) in figure 4.

Conclusions
In this paper, the connection between the temporal coherence of light and the resolution in the common-path incoherent correlation imaging was thoroughly examined. A general concept of the CA was proposed and used for the first time to examine the physical resolution limits of the broadband correlation imaging in dependence on the CL of light and the OPD of interfering waves. The basic experimental configurations were investigated and the physical resolution limits and the dispersion sensitivity discussed. As the main result, the avenues towards the white light incoherent correlation holographic imaging with sub-diffraction resolution were revealed. Further specific findings can be summarized as follows:
• The standard configuration of the correlation imaging has a low sensitivity to the dispersion but cannot be used for the high-resolution broadband imaging. Due to the coherence-induced aperture, the sub-diffraction resolution can be maintained only for light with nearly perfect temporal coherence that corresponds to the spectral width of about 4 nm for common experimental parameters.
• The dual lens configuration offers a high potential for the broadband correlation imaging but also has a high sensitivity to dispersion. The physical limits allow the possibility of preserving the monochromatic sub-diffraction resolution even when white light is used provided dispersion effects are eliminated.
• The proposed concept of the CA has a general validity and its correctness was verified experimentally. In the following research it will be used both for determining the dispersion limits of the resolution and the design of a dispersion-free experiment.
In the design of the dual lens dispersion-free experimental setup providing high-resolution white light correlation imaging, a mutual compensation of the dispersion caused by refraction and diffraction can be utilized. The hyperchromatic objective used in spectroscopy may be useful for this purpose but for a perfect elimination of the SLM dispersion, the special correcting optics has been designed, whose properties and benefits will be discussed elsewhere.