Neural Networks

Volume 21, Issues 2–3, March–April 2008, Pages 222-231

2008 Special Issue
A new nonlinear similarity measure for multichannel signals

https://doi.org/10.1016/j.neunet.2007.12.039

Abstract

We propose a novel similarity measure, called the correntropy coefficient, that is sensitive to higher-order moments of the signal statistics and is based on a similarity function called the cross-correntropy. Cross-correntropy nonlinearly maps the original time series into a high-dimensional reproducing kernel Hilbert space (RKHS). The correntropy coefficient computes the cosine of the angle between the transformed vectors. Preliminary experiments with simulated data and multichannel electroencephalogram (EEG) signals recorded during behaviour studies elucidate the performance of the new measure versus the well-established correlation coefficient.

Introduction

Quantification of dynamic interdependence in multidimensional complex systems with spatial extent provides very useful insight into their spatio-temporal organization. In practice, the underlying system dynamics are not accessible directly. Only the observed time series can be used to decide whether two signals collected from the system are statistically independent, and to elucidate any hidden relationship between them. Extracting such information becomes more difficult if the underlying dynamic system is nonlinear or if the couplings among the subsystems are nonlinear and nonstationary.

There has been extensive research aimed at detecting the underlying relationships in multidimensional dynamic systems. The classical methodology employs a linear approach, in particular cross-correlation and coherence analysis (Shaw, 1981). Cross-correlation measures the linear correlation between two signals in the time domain, while the coherence function quantifies the linear association in the frequency domain as the squared magnitude of the cross-spectral density divided by the product of the two autospectra. There have been several extensions of correlation beyond a single pair of time series, such as directed coherence, the directed transfer function and partial directed coherence (Pereda, Quian Quiroga, & Bhattacharya, 2005). Unfortunately, linear methods only capture linear relationships between the time series, and might fail to detect nonlinear interdependencies between the underlying dynamic subsystems.
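As a quick illustration (not code from the article), the sketch below shows how a zero-lag linear correlation can report essentially no dependence even when one signal is a deterministic, purely nonlinear function of the other. The signals and the `pearson` helper are hypothetical choices for demonstration only:

```python
import numpy as np

def pearson(x, y):
    """Zero-lag cross-correlation coefficient (a purely linear measure)."""
    x = x - x.mean()
    y = y - y.mean()
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

t = np.linspace(-1.0, 1.0, 1000)
x = t           # one observed signal
y = t ** 2      # deterministically (but nonlinearly) coupled to x

# The coupling is perfect, yet the linear measure reports (almost) none of it:
# by symmetry of the quadratic map, the covariance between x and y vanishes.
print(pearson(x, y))
```

Because the quadratic coupling is even while `x` is symmetric about zero, the covariance cancels and the coefficient is numerically near zero, illustrating the failure mode described above.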

Nonlinear measures include mutual information and state-space methods. One technique is the generalized mutual information function (Pompe, 1993). However, a large quantity of noise-free stationary data is required to estimate such information-theoretic measures, which restricts their application in practice. Another method is phase synchronization, in which the instantaneous phase is computed using the Hilbert transform and interdependence is specified in terms of time-dependent phase locking (Rosenblum et al., 1996). The state-space methodologies include the similarity index and the synchronization likelihood. The similarity-index technique and its modifications compute the ratio of average distances between index points, their nearest neighbours and their mutual nearest neighbours (Arnhold et al., 1999, Quiroga et al., 2000). Stam et al. proposed the synchronization likelihood to offer a straightforward normalized estimate of the dynamic coupling between interacting systems (Stam & van Dijk, 2002). Several drawbacks are associated with these techniques based on state-space embedding: estimating the embedding dimension of time series corrupted by measurement noise for a valid reconstruction, choosing a suitable neighbourhood size and finding a constant number of nearest neighbours are a few of the many constraints that severely affect the estimation accuracy.

In this paper, we introduce a novel functional measure, called the correntropy coefficient, to characterize dynamic interdependencies between interacting systems. Correntropy is a new concept that quantifies similarity based on a reproducing kernel Hilbert space method (Santamaria, Pokharel, & Principe, 2006); it is sensitive to both the higher-order statistical distribution information and the temporal structure of the random process. Correntropy can be applied either to a single time series, where it is called the autocorrentropy, or to a pair of scalar random processes, where it is called the cross-correntropy. In this paper, we work with the centered cross-correntropy, which implicitly subtracts the mean of the nonlinearly transformed signal. The correntropy coefficient is defined as the normalized centered cross-correntropy. If two random variables or two time series are independent, the correntropy coefficient becomes zero; if the two are the same, it attains its maximum value of 1; it reaches −1 when the two random variables point in opposite directions. Hence, the correntropy coefficient is a suitable interdependence measure for interacting dynamic systems.

The paper is organized as follows. In Section 2, we briefly introduce the newly proposed correntropy concept and present the method of the correntropy coefficient in detail. We also explore the correntropy coefficient from a geometrical perspective, along with other relevant issues, in Section 3. Experiments with the correntropy coefficient on simulated data and real EEG signals are presented in Section 4. We conclude the work in Section 5.

Method

In functional analysis, a symmetric positive definite kernel is a special type of bivariate function. The most widely used kernel in machine learning and in nonlinear data representation is the Gaussian kernel, given by $\kappa(x,y)=\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left\{-\frac{(x-y)^2}{2\sigma^2}\right\}$, where $\sigma$ is the kernel width. According to Mercer's theorem (Mercer, 1909) of Hilbert space analysis, a symmetric positive definite kernel function possesses an eigendecomposition $\kappa(x,y)=\sum_{n=1}^{\infty}\lambda_n\varphi_n(x)\varphi_n(y)=\langle\Phi(x),\Phi(y)\rangle$, where $\Phi:x\mapsto\{\sqrt{\lambda_n}\,\varphi_n(x)\}_{n=1}^{\infty}$.
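As a concrete sketch (not code from the article), the correntropy coefficient can be estimated from two finite time series using the Gaussian kernel above. The estimator below centers the cross-correntropy by subtracting the all-pairs kernel mean from the paired-samples kernel mean, then normalizes in cosine fashion, consistent with the definitions in the text; the kernel width `sigma` and the helper names are assumptions for illustration:

```python
import numpy as np

def gaussian_kernel(u, v, sigma):
    """kappa(x, y) = exp(-(x - y)^2 / (2 sigma^2)) / (sqrt(2 pi) sigma)."""
    return np.exp(-(u - v) ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def centered_cross_correntropy(x, y, sigma):
    """Sample estimate: kernel mean over paired samples minus mean over all pairs."""
    paired = gaussian_kernel(x, y, sigma).mean()
    all_pairs = gaussian_kernel(x[:, None], y[None, :], sigma).mean()
    return paired - all_pairs

def correntropy_coefficient(x, y, sigma=1.0):
    """Normalized centered cross-correntropy: cosine of the angle in the induced RKHS."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    vxy = centered_cross_correntropy(x, y, sigma)
    vxx = centered_cross_correntropy(x, x, sigma)
    vyy = centered_cross_correntropy(y, y, sigma)
    return vxy / np.sqrt(vxx * vyy)
```

For identical time series the estimate equals 1 exactly, matching the maximum stated in the Introduction; for independent series it approaches 0 as the sample size grows. The kernel width controls the scale at which higher-order statistics are weighted, so in practice it must be chosen with care.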

Discussion

In this section, we explore more details about the correntropy coefficient both in theoretical analysis and practical implementation.

Experiments

We test the correntropy coefficient on a simulated data set and on real-world EEG signals in this section.

Conclusions

In this paper, we propose the correntropy coefficient as a novel nonlinear interdependence measure. Through a positive definite kernel function, the correntropy coefficient implicitly maps the original random variables or time series into an infinite-dimensional reproducing kernel Hilbert space that is uniquely induced by the centered cross-correntropy function, and it essentially computes the cosine of the angle between the two transformed vectors. Orthogonality in the RKHS H_U corresponds to independence of the original random variables.

Acknowledgments

This work was partially supported by NSF grant ECS-0601271, a Graduate Alumni Fellowship from the University of Florida, and a research scholarship from the RIKEN Brain Science Institute.

References (22)

  • D. Prichard et al., Generating surrogate data for time series with several simultaneously measured variables, Physical Review Letters (1994)
    An abbreviated version of some portions of this article appeared in Xu, Bakardjian, Cichocki, and Principe (2007) as part of the IJCNN 2007 Conference Proceedings, published under IEEE copyright.