Optimal asymptotic learning rate: Macroscopic versus microscopic dynamics

Todd K. Leen, Bernhard Schottky, and David Saad
Phys. Rev. E 59, 985 – Published 1 January 1999

Abstract

We investigate the asymptotic dynamics of on-line learning for neural networks, and provide an exact solution to the network dynamics at late times under various annealing schedules. The dynamics is solved using two different frameworks: the master equation and order parameter dynamics, which concentrate on microscopic and macroscopic parameters, respectively. The two approaches provide complementary descriptions of the dynamics. Optimal annealing rates and the corresponding prefactors are derived for soft committee machine networks with hidden layers of arbitrary size.
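The annealed schedules analyzed in the paper follow the classic η(t) = η₀/t decay of the learning rate at late times. As a minimal illustrative sketch (not the paper's soft committee machine setting; the function name and the 1-D quadratic loss are hypothetical choices for illustration), annealed on-line learning on a noisy quadratic looks like:

```python
import random

def online_sgd_annealed(n_steps=20000, eta0=1.0, w0=5.0, seed=0):
    """On-line SGD on the noisy 1-D quadratic loss (w - x)^2 / 2 with
    examples x ~ N(0, 1), using the annealed schedule eta(t) = eta0/(t+1).
    Illustrative sketch only; the paper treats networks with hidden layers."""
    rng = random.Random(seed)
    w = w0
    for t in range(n_steps):
        x = rng.gauss(0.0, 1.0)      # draw one random training example
        grad = w - x                 # stochastic gradient of (w - x)^2 / 2
        eta = eta0 / (t + 1)         # annealed learning rate
        w -= eta * grad              # on-line (single-example) update
    return w                         # approaches the optimum w* = 0
```

With η₀ = 1 on this toy problem the iterate reduces to the running sample mean, so the excess error decays as 1/t; the paper derives the analogous optimal decay rates and prefactors for the multi-parameter network case.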

  • Received 1 June 1998

DOI:https://doi.org/10.1103/PhysRevE.59.985

©1999 American Physical Society

Authors & Affiliations

Todd K. Leen1, Bernhard Schottky2, and David Saad2

  • 1Department of Computer Science and Engineering, Oregon Graduate Institute of Science and Technology, P.O. Box 91000, Portland, Oregon 97291-1000
  • 2Neural Computing Research Group, University of Aston, Birmingham B4 7ET, United Kingdom

Issue

Vol. 59, Iss. 1 — January 1999
