Abstract
We show that applying a noise-reduction algorithm to a discretized time series increases its average error, compared to the original series. We find that adding external noise comparable to the discretization step before noise reduction limits the increase of the average error and improves the estimation of Lyapunov exponents.
Received 28 February 2001
DOI: https://doi.org/10.1103/PhysRevE.64.046211
©2001 American Physical Society
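The procedure the abstract summarizes can be illustrated with a toy sketch. This is not the paper's algorithm: the logistic-map series, the rounding-based discretization, the uniform dither of width equal to the discretization step, and the moving-average smoother standing in for a noise-reduction step are all illustrative assumptions.

```python
import numpy as np

def logistic_series(n, r=4.0, x0=0.4):
    """Generate a chaotic logistic-map time series (toy data)."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def discretize(x, step):
    """Round the series to a grid of spacing `step` (discretization)."""
    return np.round(x / step) * step

def smooth(x, w=3):
    """Moving average: a stand-in for a noise-reduction algorithm."""
    return np.convolve(x, np.ones(w) / w, mode="same")

rng = np.random.default_rng(0)
x = logistic_series(2000)
step = 0.05  # discretization step (assumed value)

xq = discretize(x, step)
# Add external uniform noise comparable to the discretization step
# before noise reduction, as the abstract describes.
xd = xq + rng.uniform(-step / 2, step / 2, x.size)

# Average (RMS) error of each noise-reduced series vs. the original.
err_plain = np.sqrt(np.mean((smooth(xq) - x) ** 2))
err_dither = np.sqrt(np.mean((smooth(xd) - x) ** 2))
print(err_plain, err_dither)
```

Comparing `err_plain` and `err_dither` on such synthetic data is one way to probe the effect the abstract reports; the paper's actual experiments use a proper nonlinear noise-reduction scheme and Lyapunov-exponent estimation, neither of which is reproduced here.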