Abstract
In this paper, we present experiments comparing different training algorithms for Radial Basis Function (RBF) neural networks. In particular, we compare the classical training procedure, which consists of unsupervised training of the centers followed by supervised training of the output weights, with the fully supervised training by gradient descent proposed recently in some papers. We conclude that fully supervised training generally performs better. We also compare batch training with online training and conclude that online training reduces the number of iterations required.
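The fully supervised scheme the abstract refers to updates all RBF parameters (centers, widths, and output weights) by gradient descent on the squared error, instead of fixing the centers with an unsupervised step first. A minimal sketch of batch gradient-descent training of a Gaussian RBF network is given below; the toy regression data, learning rate, and network size are illustrative assumptions, not the paper's experimental setup (which uses UCI datasets):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D regression problem (illustrative only).
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(np.pi * X[:, :1]) * np.cos(np.pi * X[:, 1:])

M = 10                                              # number of RBF units
C = X[rng.choice(len(X), M, replace=False)].copy()  # centers initialized on data points
sigma = np.full(M, 0.5)                             # Gaussian widths
W = rng.normal(scale=0.1, size=(M, 1))              # output weights
b = np.zeros(1)
lr = 0.05

# Mean squared error before training, for comparison.
Phi0 = np.exp(-((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
              / (2.0 * sigma ** 2))
mse0 = float(((Phi0 @ W + b - y) ** 2).mean())

for epoch in range(500):                            # batch gradient descent
    diff = X[:, None, :] - C[None, :, :]            # (N, M, D)
    d2 = (diff ** 2).sum(axis=2)                    # squared distances
    Phi = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian activations
    out = Phi @ W + b
    err = out - y                                   # (N, 1)

    # Gradients of the mean squared error w.r.t. every parameter
    # (this is what makes the training "fully supervised").
    dW = Phi.T @ err / len(X)
    db = err.mean(axis=0)
    dPhi = (err @ W.T) * Phi                        # (N, M)
    dC = (dPhi[:, :, None] * diff).sum(axis=0) / (len(X) * sigma[:, None] ** 2)
    dsig = (dPhi * d2).sum(axis=0) / (len(X) * sigma ** 3)

    W -= lr * dW
    b -= lr * db
    C -= lr * dC
    sigma = np.maximum(sigma - lr * dsig, 1e-3)     # keep widths positive

mse = float(((Phi @ W + b - y) ** 2).mean())
```

The online variant mentioned in the abstract would instead update the parameters after each training pattern (or small subset of patterns), which typically lets the error drop in fewer passes over the data.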
Keywords
- Radial Basis Function
- Gradient Descent
- Training Algorithm
- Radial Basis Function Neural Network
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Torres-Sospedra, J., Hernández-Espinosa, C., Fernández-Redondo, M. (2006). An Experimental Study on Training Radial Basis Functions by Gradient Descent. In: Schwenker, F., Marinai, S. (eds) Artificial Neural Networks in Pattern Recognition. ANNPR 2006. Lecture Notes in Computer Science, vol 4087. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11829898_8
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37951-5
Online ISBN: 978-3-540-37952-2