
On directed information theory and Granger causality graphs

Published in: Journal of Computational Neuroscience

Abstract

Directed information theory deals with communication channels with feedback. When applied to networks, a natural extension based on causal conditioning is needed. We show here that measures built from directed information theory in networks can be used to assess Granger causality graphs of stochastic processes. We further show that directed information theory includes measures such as the transfer entropy, and that it provides the adequate information-theoretic framework for neuroscience applications such as connectivity-inference problems.
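To make the transfer entropy mentioned above concrete, the following is a minimal sketch (not the authors' estimator) of a plug-in estimate for discrete-valued sequences with history length one, i.e. T(X→Y) = Σ p(y_{t+1}, y_t, x_t) log [p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t)]. The function name `transfer_entropy` and the toy one-step-delay coupling are illustrative assumptions only:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of the transfer entropy T(X -> Y), in bits,
    for discrete-valued sequences x, y of equal length, history length 1.
    Illustrative sketch; real data would need longer histories and
    bias correction."""
    n = len(y) - 1
    joint = Counter()     # counts of (y_{t+1}, y_t, x_t)
    pair_yx = Counter()   # counts of (y_t, x_t)
    pair_yy = Counter()   # counts of (y_{t+1}, y_t)
    single_y = Counter()  # counts of y_t
    for t in range(n):
        joint[(y[t + 1], y[t], x[t])] += 1
        pair_yx[(y[t], x[t])] += 1
        pair_yy[(y[t + 1], y[t])] += 1
        single_y[y[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in joint.items():
        p_joint = c / n
        p_full = c / pair_yx[(y0, x0)]               # p(y_{t+1} | y_t, x_t)
        p_self = pair_yy[(y1, y0)] / single_y[y0]    # p(y_{t+1} | y_t)
        te += p_joint * log2(p_full / p_self)
    return te

# Toy check: y copies x with a one-step delay, so information flows X -> Y only.
random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
print(transfer_entropy(x, y))  # close to 1 bit
print(transfer_entropy(y, x))  # close to 0
```

On this toy pair the estimate recovers the known asymmetry: roughly one bit in the coupled direction, near zero in the reverse direction (up to the small positive bias inherent to plug-in estimators).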



Acknowledgements

The authors thank Dr Mark McDonnell, from the University of South Australia, Adelaide, for his thorough reading of the paper and his constructive remarks. This work was supported by ANR project NeuroFet, CNRS PEPS SolCaus, CNRS PIR Neuroinformatique, PPF ISSO U. Nice (France).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Pierre-Olivier Amblard.

Additional information

Action Editor: Alexander G. Dimitrov


About this article

Cite this article

Amblard, P.-O., & Michel, O. J. J. On directed information theory and Granger causality graphs. Journal of Computational Neuroscience, 30, 7–16 (2011). https://doi.org/10.1007/s10827-010-0231-x
