Abstract
The transverse spatial coherence of an optical beam is altered as the light traverses a dense, multiple-scattering, random dielectric medium. Measurements show the predicted initial increase of the decoherence (or decorrelation) rate with spatial separation, followed by saturation at large separation. This behavior is well described by a Boltzmann-like transport equation for the Wigner function of the wave field. By contrast, the standard model for decoherence (developed for quantum wave fields), which is based on the Fokker-Planck phase-space diffusion equation, contains no saturation and fails at all separations to describe the observed behavior.
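For orientation, a minimal sketch of the contrast between the two models, in illustrative notation not taken from the paper: write \(\Gamma(\Delta x, z)\) for the mutual coherence function at transverse separation \(\Delta x\) and propagation depth \(z\), \(\ell_{s}\) for the scattering mean free path, \(\Lambda\) for a phase-space diffusion coefficient, and \(p(\Delta x)\) for the normalized Fourier transform of the single-scattering phase function, with \(p(0)=1\) and \(p(\Delta x)\to 0\) at large \(\Delta x\). The separation dependence of the decoherence rate then takes schematically the two forms

\[
-\frac{\partial_{z}\,\Gamma(\Delta x, z)}{\Gamma(\Delta x, z)} \;=\;
\begin{cases}
\Lambda\,\Delta x^{2}, & \text{Fokker--Planck: grows without bound,}\\[4pt]
\ell_{s}^{-1}\bigl[\,1 - p(\Delta x)\,\bigr], & \text{Boltzmann-like transport: saturates at } \ell_{s}^{-1}.
\end{cases}
\]

Since \(1 - p(\Delta x) \propto \Delta x^{2}\) at small \(\Delta x\), the diffusion form can be viewed as the small-separation expansion of the transport form, which is why the two predictions differ most markedly in the large-separation, saturated regime.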
Received 2 February 1999
DOI: https://doi.org/10.1103/PhysRevLett.82.4807
©1999 American Physical Society