• Open Access

Transient Chaotic Dimensionality Expansion by Recurrent Networks

Christian Keup, Tobias Kühn, David Dahmen, and Moritz Helias
Phys. Rev. X 11, 021064 – Published 25 June 2021

Abstract

Neurons in the brain communicate with spikes, which are discrete events in time and value. Functional network models, in contrast, often employ rate units that are continuously coupled by analog signals. Is there a qualitative difference implied by these two forms of signaling? We develop a unified mean-field theory for large random networks to show that first- and second-order statistics in rate and binary networks are in fact identical if rate neurons receive the right amount of noise. Their response to presented stimuli, however, can be radically different. We quantify these differences by studying how nearby state trajectories evolve over time, asking to what extent the dynamics is chaotic. Chaos in the two models is found to be qualitatively different. In binary networks, we find a network-size-dependent transition to chaos and a chaotic submanifold whose dimensionality expands stereotypically with time, while rate networks with matched statistics are nonchaotic. Dimensionality expansion in chaotic binary networks aids classification in reservoir computing, and optimal performance is reached within about a single activation per neuron: a fast mechanism for computation that we also demonstrate in spiking networks. A generalization of this mechanism extends to rate networks in their respective chaotic regimes.

  • Received 7 May 2020
  • Revised 18 March 2021
  • Accepted 23 April 2021

DOI:https://doi.org/10.1103/PhysRevX.11.021064

Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.

Published by the American Physical Society

Physics Subject Headings (PhySH)

Statistical Physics & Thermodynamics

Authors & Affiliations

Christian Keup1,2,*,†, Tobias Kühn1,2,3,*, David Dahmen1, and Moritz Helias1,4

  • 1Institute of Neuroscience and Medicine (INM-6) and Institute for Advanced Simulation (IAS-6) and JARA Institut Brain Structure-Function Relationships (INM-10), Jülich Research Centre, Jülich, Germany
  • 2RWTH Aachen University, Aachen, Germany
  • 3Laboratoire de Physique de l’ENS, Laboratoire MSC de l’Université de Paris, CNRS, Paris, France
  • 4Department of Physics, Faculty 1, RWTH Aachen University, Aachen, Germany

  • *C. K. and T. K. contributed equally to this work.
  • †Corresponding author. c.keup@fz-juelich.de

Popular Summary

The electrical signals in densely connected networks of neurons, such as in the brain, tend to be strongly chaotic. How, then, can these circuits reliably process information? Past investigations have mainly studied weakly chaotic activity. Here, we demonstrate a universal mechanism that explains how even strongly chaotic network activity can support powerful computations. Our calculations are based on a novel unified theoretical framework that allows us to compare different models of neural networks.

Concretely, we apply this framework to two common classes of models: binary neurons, which switch between a pulse-emitting and nonemitting state, and rate neurons, which discard details about individual pulses and describe just the number of pulses per second. These models implement two different assumptions about the substrate of computation, either single pulses or the rate of pulses.
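The contrast between the two model classes can be sketched in a few lines of code. This is only an illustration, not the paper's exact models: the network size, coupling scale, threshold, and Euler time step below are our own arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                          # network size (illustrative choice)
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))    # random couplings, scaled with N

def binary_step(s):
    """Binary neuron: pulse-emitting (1) or silent (0), set by thresholding its summed input."""
    return (J @ (2 * s - 1) > 0).astype(int)

def rate_step(x, dt=0.1):
    """Rate neuron: analog state, Euler step of dx/dt = -x + J @ tanh(x)."""
    return x + dt * (-x + J @ np.tanh(x))

s = rng.integers(0, 2, N)    # discrete state: individual pulses matter
x = rng.normal(size=N)       # continuous state: only the pulse rate matters
s, x = binary_step(s), rate_step(x)
```

The binary update keeps only the on/off character of pulses, while the rate equation evolves a smooth analog signal under the same random couplings.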

We calculate the transition to chaos in binary networks and show that each chaotic binary network corresponds to an equivalent rate network with the same activity statistics, but with nonchaotic dynamics. The activity in strongly chaotic regimes of binary and nonbinary network models promotes the separability of different input stimuli. This is because state trajectories for different inputs diverge from one another in a stereotypical way. Binary networks and pulse-coupled networks offer a particularly fast separation.
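A minimal numerical sketch of this trajectory divergence, assuming a simple parallel-update binary network with Gaussian random couplings (the paper analyses asynchronous dynamics with carefully matched statistics; all parameter values here are our own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))    # random couplings

def step(s):
    # Parallel threshold update of all binary neurons, using a +/-1 input convention.
    return (J @ (2 * s - 1) > 0).astype(int)

s_a = rng.integers(0, 2, N)
s_b = s_a.copy()
s_b[0] ^= 1                  # perturb: flip a single neuron in the copy

hamming = [1]                # distance between the two trajectories over time
for _ in range(10):
    s_a, s_b = step(s_a), step(s_b)
    hamming.append(int(np.sum(s_a != s_b)))
```

In the chaotic regime, the single-flip difference spreads through the network, so trajectories started from nearly identical states become increasingly distinguishable; it is this divergence that makes different input stimuli easier to separate downstream.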

The theoretical framework can serve as a bridge between many types of existing neural-network models. Furthermore, our results provide predictions for experimental recordings in brain circuits and invite research on the use of chaotic dynamics as a resource of computation in artificial neural networks.

Issue

Vol. 11, Iss. 2 — April - June 2021
