Part of the book series: Lecture Notes in Computer Science (LNAI, volume 4693)

Abstract

Elman presented a network with a context layer for time-series processing. The context layer holds the output of the hidden layer, and this output is fed back into the hidden layer at the next step of the time-series computation. In this paper, the context layer is reformed into an internal memory layer, which receives connections from the hidden layer whose weights form the internal memory. A new learning algorithm, called time-delayed back-propagation learning, is developed for this internal memory. The ability of the network with the internal memory layer, and the states organized in the internal memory, are demonstrated by applying the network to a simple sinusoidal time-series.
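
To make the recurrence concrete, below is a minimal sketch (not the authors' implementation) of the Elman-style context-layer network that the abstract starts from, written in Python/NumPy. The layer sizes, sigmoid activation, and weight initialization are assumptions; the comments mark where the paper's internal-memory variant differs.

    # Sketch of an Elman network: the context layer stores the previous
    # hidden output and feeds it back to the hidden layer at the next step.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    n_in, n_hidden, n_out = 1, 8, 1   # assumed sizes for a scalar time-series

    W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))       # input -> hidden
    W_ctx = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # context -> hidden
    W_out = rng.normal(scale=0.5, size=(n_out, n_hidden))     # hidden -> output

    def forward(sequence):
        """Run the network over a time-series, producing one output per step."""
        context = np.zeros(n_hidden)              # context layer starts empty
        outputs = []
        for x_t in sequence:
            x_t = np.atleast_1d(x_t)
            # Hidden activation uses the current input plus the context,
            # i.e. the hidden output from the previous time step.
            hidden = sigmoid(W_in @ x_t + W_ctx @ context)
            outputs.append(W_out @ hidden)
            # Elman's network copies the hidden output into the context layer;
            # the paper instead connects the hidden layer to an internal memory
            # layer through trainable weights (not reproduced here).
            context = hidden
        return np.array(outputs)

    # Untrained forward pass over a simple sinusoid, the kind of series used in
    # the paper's demonstration; time-delayed back-propagation is not shown.
    t = np.linspace(0.0, 2.0 * np.pi, 50)
    predictions = forward(np.sin(t))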

References

  1. Elman, J.L.: Finding Structure in Time. Cognitive Science 14, 179–211 (1990)

  2. Elman, J.L.: Learning and Development in Neural Networks: The Importance of Starting Small. Cognition 48, 71–99 (1993)

  3. Koskela, T., Lehtokangas, M., Saarinen, J., Kaski, K.: Time Series Prediction with Multilayer Perceptron, FIR and Elman Neural Networks. In: Proc. of the World Congress on Neural Networks, pp. 491–496. INNS Press, San Diego, USA (1996)

  4. Cholewo, T.J., Zurada, J.M.: Sequential Network Construction for Time Series Prediction. In: Proc. of the IEEE Intl. Joint Conf. on Neural Networks, Houston, Texas, USA, pp. 2034–2039. IEEE Computer Society Press, Los Alamitos (1997)

  5. Giles, C.L., Lawrence, S., Tsoi, A.C.: Noisy Time Series Prediction Using a Recurrent Neural Network and Grammatical Inference. Machine Learning 44(1–2), 161–183 (2001)

  6. Iwasa, K., Deguchi, T., Ishii, N.: Acquisition of the Time-Series Information in the Network with Internal Memory (in Japanese). IEICE Technical Report NC 2001–71, pp. 7–12 (2001)

  7. Deguchi, T., Ishii, N.: Delayed Learning on Internal Memory Network and Organizing Internal States. In: Wang, J., Yi, Z., Zurada, J.M., Lu, B.-L., Yin, H. (eds.) ISNN 2006. LNCS, vol. 3971, pp. 502–508. Springer, Heidelberg (2006)

  8. Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning internal representations by error propagation. In: Rumelhart, D.E., McClelland, J.L. (eds.) Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1, pp. 318–362. MIT Press, Cambridge, MA (1986)


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Deguchi, T., Ishii, N. (2007). Delayed Learning and the Organized States. In: Apolloni, B., Howlett, R.J., Jain, L. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2007. Lecture Notes in Computer Science, vol 4693. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74827-4_128

  • DOI: https://doi.org/10.1007/978-3-540-74827-4_128

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74826-7

  • Online ISBN: 978-3-540-74827-4
