The interplay between randomness and optimization has long been a major theme in the design of neural networks [3]. In the last 15 years, the success of reservoir computing (RC) has shown that, in many scenarios, the algebraic structure of the recurrent component matters far more than the precise fine-tuning of its weights. As long as the recurrent part of the network possesses a form of fading memory of the input, and its neurons have sufficiently heterogeneous activations, the resulting dynamics suffice to efficiently process many spatio-temporal signals. Even though it is now feasible to fully optimize deep recurrent networks, doing so still requires a great deal of experience and practice, not to mention vast computational resources, which limits their applicability in resource-constrained settings (e.g., embedded systems) or in areas where time is of key importance (e.g., online systems). Not surprisingly, then, RC remains a powerful tool for quickly solving dynamical problems, and it has become invaluable for modeling and analysis in neuroscience.
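To make the core idea concrete, the following is a minimal sketch of an echo state network in plain NumPy: the input and recurrent weights are drawn at random and left untouched, and only a linear readout is fitted. The sizes, scalings, and the toy sine-prediction task are illustrative assumptions, not a prescription taken from any of the papers in this issue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
n_inputs, n_reservoir, n_steps = 1, 100, 1000

# Fixed random weights: only the linear readout is trained.
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.normal(0, 1, (n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.1 * np.arange(n_steps))[:, None]
y = np.roll(u, -1, axis=0)

# Collect the reservoir states driven by the input.
x = np.zeros(n_reservoir)
states = np.empty((n_steps, n_reservoir))
for t in range(n_steps):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Train the readout by ridge regression (the only optimized part).
washout, ridge = 100, 1e-6
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_reservoir), S.T @ Y)
print("train MSE:", np.mean((S @ W_out - Y) ** 2))
```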

Ten years after the last special issue entirely dedicated to the topic [2], this issue aims to provide an up-to-date overview of (some of) the latest developments in the field. Recently, Goudarzi and Teuscher listed a series of 11 questions that will drive research in RC going forward [1]. Although we cannot cover all of them in a single issue, many of these questions are addressed in the articles collected here, which we believe give a good overview of the diversity and vitality of the field. Overall, we hope the issue will be of interest to the readers of Cognitive Computation.

In particular, we selected ten papers to appear in this special issue. All of them went through at least two rounds of revision by two to four expert reviewers. One paper, coauthored by one of the guest editors, underwent an independent review process to guarantee fairness. The articles are organized into three parts. The first third of the issue is dedicated to the study of delay-line architectures, interest in which has recently been spurred by the possibility of implementing them on non-conventional computing hardware, most notably photonic computers. The second part investigates theoretical aspects of RC models, and the third is devoted to innovative formulations for designing architectures for learning and recognition tasks.

The first four papers of the special issue are dedicated to photonic RC and time-delay architectures:

  • In ‘Online training for high-performance analogue readout layers in photonic reservoir computers,’ Antonik et al. propose the use of online training algorithms when exploiting analogue readouts in photonic RC. Their simulations show that online algorithms can benefit the task, particularly because they make it possible to include nonlinearities in the readout.

  • In ‘A Multiple-Input Strategy to Efficient Integrated Photonic Reservoir Computing,’ Katumba et al. explore how the performance and power efficiency of optical reservoirs change when the input is fed to more than a single node. Their results further extend the applicability of such architectures.

  • In ‘Real-time audio processing with a cascade of discrete-time delay line based reservoir computers,’ Keuninckx et al. present a compelling use case for delay-line RC models in real-time audio processing. Specifically, their experiments on guitar amplifier distortion show the technique to be viable even on today’s computing hardware.

  • Finally, ‘Reservoir computing with an ensemble of time-delay reservoirs,’ by Ortín and Pesquera, investigates ways of building ensemble models with time-delay reservoirs, based on both decoupled neurons and neurons coupled by feedback. These architectures boost the processing speed of the models with respect to the standard RC approach (a simplified sketch of a single-node delay-line reservoir follows this list).
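For readers unfamiliar with this family of architectures, here is a deliberately simplified, discrete-time sketch of a single-node delay-line reservoir: one nonlinear node is time-multiplexed over a set of virtual nodes by a fixed random input mask, and each virtual node mixes the masked input with the state fed back by the delay line. All parameter values are illustrative assumptions, and the sketch corresponds to the fully decoupled regime; physical implementations also couple adjacent virtual nodes through the system’s inertia.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters for a single-node delay-line reservoir.
n_virtual = 50          # virtual nodes along the delay line
eta, gamma = 0.5, 0.05  # feedback strength and input scaling
mask = rng.uniform(-1, 1, n_virtual)  # fixed random input mask

def delay_reservoir(u):
    """Each input sample is multiplexed over the virtual nodes; every
    virtual node combines the masked input with its own state from the
    previous pass through the delay line (decoupled regime)."""
    x = np.zeros(n_virtual)
    states = np.empty((len(u), n_virtual))
    for k, u_k in enumerate(u):
        x = np.tanh(eta * x + gamma * mask * u_k)
        states[k] = x
    return states  # a linear readout is then trained on these states

states = delay_reservoir(np.sin(0.1 * np.arange(500)))
print(states.shape)  # (500, 50)
```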

The second part of the issue is dedicated to theoretical models of RC:

  • In ‘Echo State Property of Deep Reservoir Computing Networks,’ Gallicchio and Micheli provide necessary and sufficient conditions for the echo state property of multi-layered RC models. They show that stacking multiple layers of fixed recurrent neurons drives the network towards less stable regimes, but the resulting models are able to analyze a signal at different temporal resolutions (a sketch of the single-layer quantities these conditions generalize follows this list).

  • Nikiforou et al., in ‘An investigation of the dynamical transitions in harmonically driven random networks of firing-rate neurons,’ study the behavior of continuous-time recurrent networks of firing-rate neurons when subjected to harmonically oscillating stimuli. Their findings shed light on the dynamics and structure of this class of neural networks.
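The deep-network conditions derived by Gallicchio and Micheli generalize well-known single-layer quantities: for a standard ESN, a spectral radius below one is the usual necessary condition associated with the echo state property (for zero input), while a largest singular value below one is a standard sufficient condition. The sketch below merely computes these single-layer quantities for a stack of fixed recurrent matrices; the matrix sizes and target radius are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def scale_to_spectral_radius(W, rho=0.9):
    """Rescale a recurrent weight matrix so its spectral radius is rho."""
    return W * (rho / max(abs(np.linalg.eigvals(W))))

# Hypothetical two-layer stack of fixed recurrent matrices.
layers = [scale_to_spectral_radius(rng.normal(0, 1, (100, 100)))
          for _ in range(2)]
for i, W in enumerate(layers):
    rho = max(abs(np.linalg.eigvals(W)))  # usual necessary condition: rho < 1
    sigma = np.linalg.norm(W, 2)          # standard sufficient condition: sigma < 1
    print(f"layer {i}: spectral radius {rho:.3f}, top singular value {sigma:.3f}")
```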

Finally, the third part of the issue is devoted to the design of echo state networks (ESNs):

  • In ‘Training Echo State Networks with Regularization Through Dimensionality Reduction,’ Løkse et al. train ESNs by first projecting the internal states onto a lower-dimensional space. The resulting gains in predictive performance are then validated with visualization tools from chaos theory (a sketch of such a reduced readout appears after this list).

  • Mayer and Yu, in ‘Orthogonal Echo State Networks and stochastic evaluations of likelihoods,’ use ESNs with orthogonal weight matrices to build probabilistic likelihood estimates of time series. They analyze and evaluate several parameters influencing the behavior of such networks.

  • Wootton et al., in ‘Optimizing Echo State Networks for Static Pattern Recognition,’ put forth the idea of applying ESNs to static tasks by letting them run until they reach stable outputs. These ESNs outperformed other machine learning techniques, even when the network was still in an unstable state.

  • Finally, in ‘Reservoir computing with both neuronal intrinsic plasticity and multi-clustered structure,’ Xue et al. combine two previously proposed ways of enhancing ESNs, namely multi-clustered reservoir topologies and intrinsic plasticity. Their results show that both techniques benefit the performance of the models.
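As a rough illustration of the idea behind the first paper in this part, the following sketch trains a linear readout on a PCA projection of previously collected reservoir states. It is only a sketch in the spirit of Løkse et al.: the helper name, the component count, the ridge coefficient, and the randomly generated stand-ins for states and targets are all assumptions, and the paper’s actual procedure differs in its details.

```python
import numpy as np

def reduced_readout(states, targets, n_components=20, ridge=1e-6):
    """Fit a linear readout on a PCA projection of reservoir states
    (hypothetical helper, in the spirit of Løkse et al.)."""
    mu = states.mean(axis=0)
    X = states - mu
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    P = Vt[:n_components].T                     # principal directions
    Z = X @ P                                   # reduced states
    W_out = np.linalg.solve(Z.T @ Z + ridge * np.eye(n_components),
                            Z.T @ targets)
    return mu, P, W_out

# Usage with randomly generated stand-ins for states and targets:
rng = np.random.default_rng(3)
states = rng.normal(size=(500, 100))   # 500 time steps, 100 neurons
targets = rng.normal(size=(500, 1))
mu, P, W_out = reduced_readout(states, targets)
pred = (states - mu) @ P @ W_out
print(pred.shape)  # (500, 1)
```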

We would like to thank the editor in chief of the journal, Amir Hussain, for his strong support in organizing and publishing this special issue; all the authors who contributed to the issue; and the anonymous reviewers who helped us evaluate and improve the quality of the submissions (listed in alphabetical order): Piotr Antonik, Dorian Aur, Omri Barak, Daniel Brunner, Yanne Chembo, Danilo Comminiello, Claudio Gallicchio, Lorenzo Livi, Sigurd Løkse, Ali Rodan, Manuel Roveri, Miguel Soriano, David Sussillo, Guy Van Der Sande, Thomas Van Vaerenbergh, and Adam Wootton.