Distance-Based Delays in Echo State Networks

  • Conference paper
  • First Online:
Intelligent Data Engineering and Automated Learning – IDEAL 2022 (IDEAL 2022)

Abstract

Physical reservoir computing, a paradigm bearing the promise of energy-efficient high-performance computing, has attracted much attention in recent years. We argue, though, that the effect of signal propagation delay on reservoir task performance, one of the most central aspects of physical reservoirs, is still insufficiently understood in a more general learning context. Such physically imposed delay has been found to play a crucial role in some specific physical realizations, such as integrated photonic reservoirs. While delays at the readout layer and input of Echo State Networks (ESNs) have been successfully exploited before to improve performance, to our knowledge this feature has not been studied in a more general setting. We introduce inter-node delays, based on physical distances, into ESNs as model systems for physical reservoir computing. We propose a novel ESN design that includes variable signal delays along the connections between neurons, comparable to varying axon lengths in biological neural networks or varying-length delay lines in physical systems. We compare this setup against conventional ESNs and find that incorporating variable inter-node delays significantly improves reservoir performance on the NARMA-10 benchmark task.
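The core idea of the abstract can be illustrated with a minimal sketch. This is a toy reconstruction, not the authors' implementation: it assumes neurons placed at random 2-D positions, each connection's Euclidean distance quantised to an integer delay in time steps, and a state update that reads each presynaptic state from a ring buffer at its connection-specific delay. All sizes, scalings, and the delay quantisation scheme are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50          # reservoir size (illustrative)
max_delay = 5   # longest signal delay in time steps (assumption)

# Random reservoir weights, rescaled to a spectral radius of 0.9
W = rng.uniform(-1.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, N)

# Distance-based delays: neurons at random 2-D positions, with each
# pairwise Euclidean distance quantised to an integer number of steps.
pos = rng.uniform(0.0, 1.0, (N, 2))
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
delays = np.floor(dist / dist.max() * (max_delay - 1)).astype(int)

def run_reservoir(inputs, delays):
    """Drive the reservoir and return the state trajectory."""
    history = np.zeros((max_delay, N))   # ring buffer of past states
    states = np.empty((len(inputs), N))
    cols = np.arange(N)[None, :]
    for t, u in enumerate(inputs):
        # delayed[i, j] = x_j[t - 1 - delays[i, j]]: each connection
        # sees the presynaptic state at its own distance-based delay
        delayed = history[(t - 1 - delays) % max_delay, cols]
        x = np.tanh(W_in * u + np.einsum("ij,ij->i", W, delayed))
        history[t % max_delay] = x
        states[t] = x
    return states

states = run_reservoir(rng.uniform(0.0, 0.5, 200), delays)
```

Setting all delays to zero recovers the conventional ESN update, so the same routine can serve as the baseline for a comparison like the one described above.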

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 860949.

Author information

Corresponding author

Correspondence to Stefan Iacob.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Iacob, S., Freiberger, M., Dambre, J. (2022). Distance-Based Delays in Echo State Networks. In: Yin, H., Camacho, D., Tino, P. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2022. IDEAL 2022. Lecture Notes in Computer Science, vol 13756. Springer, Cham. https://doi.org/10.1007/978-3-031-21753-1_21

  • DOI: https://doi.org/10.1007/978-3-031-21753-1_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-21752-4

  • Online ISBN: 978-3-031-21753-1

  • eBook Packages: Computer Science, Computer Science (R0)
