Abstract
A novel multistage feedforward network is proposed for efficiently solving difficult classification tasks. The standard Radial Basis Functions (RBF) architecture is modified to alleviate two potential drawbacks, namely the ‘curse of dimensionality’ and the limited discriminatory capacity of the linear output layer. The first goal is accomplished by feeding the hidden-layer output into a module performing Principal Component Analysis (PCA). The second is met by replacing the simple linear combiner of the standard architecture with a Multilayer Perceptron (MLP). Simulation results for the 2-spirals problem and Peterson–Barney vowel classification are reported, showing high classification accuracy with fewer parameters than existing solutions.
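The abstract describes a three-stage pipeline: Gaussian RBF hidden units, a PCA projection of their activations, and an MLP in place of the linear output layer. A minimal numpy sketch of a forward pass through such a pipeline is given below; the paper does not specify these details, so the center-selection rule, the shared RBF width, the number of retained principal components, and the MLP sizes are all illustrative assumptions.

```python
import numpy as np

def rbf_hidden(X, centers, width):
    # Gaussian RBF activations: one column per center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def pca_project(H, n_components):
    # Project (centered) hidden activations onto the leading
    # principal components, reducing the RBF-layer dimensionality
    Hc = H - H.mean(axis=0)
    _, _, Vt = np.linalg.svd(Hc, full_matrices=False)
    return Hc @ Vt[:n_components].T

def mlp_forward(Z, W1, b1, W2, b2):
    # Nonlinear output stage: tanh hidden layer + softmax outputs,
    # replacing the linear combiner of a standard RBF network
    A = np.tanh(Z @ W1 + b1)
    logits = A @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))          # toy 2-D inputs
centers = X[::5]                      # assumption: centers by subsampling
H = rbf_hidden(X, centers, width=1.0) # (40, 8) RBF activations
Z = pca_project(H, n_components=3)    # (40, 3) compressed features

# Untrained MLP weights, for shape illustration only
W1 = rng.normal(size=(3, 5)); b1 = np.zeros(5)
W2 = rng.normal(size=(5, 2)); b2 = np.zeros(2)
P = mlp_forward(Z, W1, b1, W2, b2)    # (40, 2) class probabilities
```

In practice the RBF centers, PCA basis, and MLP weights would each be fitted (e.g. centers by clustering, the MLP by backpropagation); the sketch only shows how the three stages compose.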
Ciocoiu, I.B. Hybrid Feedforward Neural Networks for Solving Classification Problems. Neural Processing Letters 16, 81–91 (2002). https://doi.org/10.1023/A:1019755726221