Abstract
Neurons in the brain communicate with spikes, which are discrete events in time and value. Functional network models often employ rate units that are continuously coupled by analog signals. Is there a qualitative difference implied by these two forms of signaling? We develop a unified mean-field theory for large random networks to show that first- and second-order statistics in rate and binary networks are in fact identical if rate neurons receive the right amount of noise. Their response to presented stimuli, however, can be radically different. We quantify these differences by studying how nearby state trajectories evolve over time, asking to what extent the dynamics is chaotic. Chaos in the two models is found to be qualitatively different. In binary networks, we find a network-size-dependent transition to chaos and a chaotic submanifold whose dimensionality expands stereotypically with time, while rate networks with matched statistics are nonchaotic. Dimensionality expansion in chaotic binary networks aids classification in reservoir computing and optimal performance is reached within about a single activation per neuron; a fast mechanism for computation that we demonstrate also in spiking networks. A generalization of this mechanism extends to rate networks in their respective chaotic regimes.
- Received 7 May 2020
- Revised 18 March 2021
- Accepted 23 April 2021
DOI: https://doi.org/10.1103/PhysRevX.11.021064
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.
Popular Summary
The electrical signals in densely connected networks of neurons, such as in the brain, tend to be strongly chaotic. How, then, can these circuits reliably process information? Past investigations have mainly studied weakly chaotic activity. Here, we demonstrate a universal mechanism that explains how even strongly chaotic network activity can support powerful computations. Our calculations are based on a novel unified theoretical framework, which allows us to compare different models of neural networks.
Concretely, we apply this framework to two common classes of models: binary neurons, which switch between a pulse-emitting and nonemitting state, and rate neurons, which discard details about individual pulses and describe just the number of pulses per second. These models implement two different assumptions about the substrate of computation, either single pulses or the rate of pulses.
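The two model classes above can be made concrete with a minimal sketch. The code below is illustrative only, not the paper's actual simulations: the network size `N`, gain `g`, and update schemes are hypothetical choices. It contrasts binary neurons, which switch between two states via a hard threshold on their summed input, with rate neurons, whose continuous states are coupled through a smooth transfer function.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                     # network size (illustrative)
g = 2.0                                     # coupling gain (illustrative)
J = rng.normal(0.0, g / np.sqrt(N), (N, N)) # random coupling matrix

# Binary neurons: asynchronous +/-1 updates via a hard threshold,
# i.e. the unit only signals whether it emits a pulse or not.
s = rng.choice([-1.0, 1.0], N)
for _ in range(10 * N):                     # roughly 10 updates per neuron
    i = rng.integers(N)
    s[i] = 1.0 if J[i] @ s >= 0.0 else -1.0

# Rate neurons: continuous states coupled by an analog transfer
# function (tanh), describing pulse rates rather than single pulses.
x = rng.normal(0.0, 1.0, N)
dt = 0.1
for _ in range(1000):
    x += dt * (-x + J @ np.tanh(x))         # standard rate dynamics
```

The binary update discards everything about the input except its sign, whereas the rate update retains the full analog signal; this is the modeling distinction the popular summary refers to.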
We calculate the transition to chaos in binary networks and show that each chaotic binary network corresponds to an equivalent rate network with the same activity statistics, but with nonchaotic dynamics. The activity in strongly chaotic regimes of binary and nonbinary network models promotes the separability of different input stimuli. This is because state trajectories for different inputs diverge from one another in a stereotypical way. Binary networks and pulse-coupled networks offer a particularly fast separation.
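The divergence of nearby state trajectories can be probed with a simple numerical experiment. The sketch below is an assumption-laden toy, not the paper's analysis: it uses parallel threshold updates and tracks the normalized Hamming distance between a binary network state and a copy with a single neuron flipped. In a chaotic regime, such a one-neuron perturbation spreads through the network over a few update steps.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500                                       # network size (illustrative)
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)) # random couplings

def step(s):
    # Parallel threshold update of all binary units at once.
    return np.where(J @ s >= 0.0, 1.0, -1.0)

s = rng.choice([-1.0, 1.0], N)
s2 = s.copy()
s2[0] *= -1.0                                 # flip a single neuron

dist = []
for _ in range(10):
    s, s2 = step(s), step(s2)
    dist.append(float(np.mean(s != s2)))      # normalized Hamming distance

print(dist)
```

How quickly `dist` grows is one way to quantify the stereotypical divergence of trajectories; trajectories seeded with different inputs separating in this manner is what makes them easier to classify downstream.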
The theoretical framework can serve as a bridge between many types of existing neural-network models. Furthermore, our results provide predictions for experimental recordings in brain circuits and invite research on the use of chaotic dynamics as a resource for computation in artificial neural networks.