
Memory Capacity of Input-Driven Echo State Networks at the Edge of Chaos

Conference paper

Artificial Neural Networks and Machine Learning – ICANN 2014 (ICANN 2014)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8681)

Abstract

Reservoir computing provides a promising approach to efficient training of recurrent neural networks by exploiting the computational properties of the reservoir structure. Various approaches, ranging from suitable initialization to reservoir optimization by training, have been proposed. In this paper we take a closer look at short-term memory capacity, introduced by Jaeger for echo state networks. Memory capacity has recently been investigated with respect to criticality, the so-called edge of chaos, where the network switches from a stable to an unstable dynamic regime. We calculate the memory capacity of the networks for various input data sets, both random and structured, and show how the data distribution affects network performance. We also investigate the effect of reservoir sparsity in this context.
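Jaeger's short-term memory capacity is defined as MC = Σ_k r²(u(t−k), y_k(t)), the sum over delays k of the squared correlation between the delayed input and the output of a linear readout trained to reconstruct it. The following is a minimal illustrative sketch of that measurement, not the paper's experimental setup: reservoir size, spectral radius, input range, and the number of delays are arbitrary choices made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (not taken from the paper)
N = 100           # reservoir neurons
rho = 0.9         # spectral radius (< 1: ordered regime)
T_wash, T = 200, 2000
max_delay = 40    # number of delays k over which MC_k is summed

# Random reservoir, rescaled to the target spectral radius
W = rng.normal(size=(N, N))
W *= rho / max(abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.1, 0.1, size=N)

# i.i.d. uniform input, as in Jaeger's memory-capacity setting
u = rng.uniform(-1, 1, size=T_wash + T)

# Run the reservoir with tanh neurons; discard the washout period
x = np.zeros(N)
X = np.empty((T, N))
for t in range(T_wash + T):
    x = np.tanh(W @ x + w_in * u[t])
    if t >= T_wash:
        X[t - T_wash] = x

# For each delay k, fit a linear readout reproducing u(t - k)
# and accumulate the squared correlation r^2 (= MC_k)
mc = 0.0
for k in range(1, max_delay + 1):
    target = u[T_wash - k : T_wash - k + T]
    w_out, *_ = np.linalg.lstsq(X, target, rcond=None)
    y = X @ w_out
    r = np.corrcoef(y, target)[0, 1]
    mc += r ** 2

print(f"MC = {mc:.2f} (bounded above by the reservoir size N = {N})")
```

Each MC_k lies in [0, 1], so the total is bounded by the number of delays summed and, in theory, by the reservoir size N; sweeping rho toward and past the edge of chaos is how the capacity-versus-criticality curves studied in the paper are obtained.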



References

  1. Bertschinger, N., Natschläger, T.: Real-time computation at the edge of chaos in recurrent neural networks. Neural Computation 16(7), 1413–1436 (2004)


  2. Boedecker, J., Obst, O., Lizier, J., Mayer, N., Asada, M.: Information processing in echo state networks at the edge of chaos. Theory in Biosciences 131, 205–213 (2012)


  3. Büsing, L., Schrauwen, B., Legenstein, R.: Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons. Neural Computation 22(5), 1272–1311 (2010)


  4. Hermans, M., Schrauwen, B.: Memory in linear recurrent neural networks in continuous time. Neural Networks 23, 341–355 (2010)


  5. Huebner, U., Abraham, N., Weiss, C.: Dimensions and entropies of chaotic intensity pulsations in a single-mode far-infrared NH3 laser. Physical Review A 40(11), 6354–6365 (1989)


  6. Jaeger, H.: Short term memory in echo state networks. Tech. Rep. GMD Report 152, German National Research Center for Information Technology (2002)


  7. Jaeger, H.: Echo state network. Scholarpedia 2(9) (2007)


  8. Legenstein, R., Maass, W.: What makes a dynamical system computationally powerful? In: New Directions in Statistical Signal Processing: From Systems to Brain, pp. 127–154. MIT Press (2007)


  9. Lukoševičius, M., Jaeger, H.: Reservoir computing approaches to recurrent neural network training. Computer Science Review 3(3), 127–149 (2009)


  10. Ozturk, M., Xu, C., Principe, J.: Analysis and design of echo state networks. Neural Computation 19, 111–138 (2006)


  11. Rodan, A., Tiňo, P.: Minimum complexity echo state network. IEEE Transactions on Neural Networks 21(1), 131–144 (2011)


  12. Schrauwen, B., Büsing, L., Legenstein, R.: On computational power and the order-chaos phase transition in reservoir computing. In: Advances in Neural Information Processing Systems, pp. 1425–1432 (2009)


  13. Sprott, J.: Chaos and Time-Series Analysis. Oxford University Press (2003)


  14. Verstraeten, D., Dambre, J., Dutoit, X., Schrauwen, B.: Memory versus non-linearity in reservoirs. In: International Joint Conference on Neural Networks, pp. 1–8 (2010)


  15. White, O., Lee, D., Sompolinsky, H.: Short-term memory in orthogonal neural networks. Physical Review Letters 92(14), 148102 (2004)




Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Barančok, P., Farkaš, I. (2014). Memory Capacity of Input-Driven Echo State Networks at the Edge of Chaos. In: Wermter, S., et al. Artificial Neural Networks and Machine Learning – ICANN 2014. ICANN 2014. Lecture Notes in Computer Science, vol 8681. Springer, Cham. https://doi.org/10.1007/978-3-319-11179-7_6


  • DOI: https://doi.org/10.1007/978-3-319-11179-7_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-11178-0

  • Online ISBN: 978-3-319-11179-7

  • eBook Packages: Computer Science (R0)
