
Node splitting: A constructive algorithm for feed-forward neural networks

Neural Computing & Applications

Abstract

A constructive algorithm is proposed for feed-forward neural networks which uses node splitting in the hidden layers to build large networks from smaller ones. The small network forms an approximate model of a set of training data, and the split creates a larger, more powerful network which is initialised with the approximate solution already found. Because the smaller network cannot fully model the system that generated the data, the weight vectors of those hidden nodes covering regions of the input space where more detail is required begin to oscillate. These nodes are split in two using principal component analysis, so that the new nodes cover the two main modes of the oscillating vector. Candidate nodes are selected either by this principal component analysis of the oscillating weight vectors or by examining the Hessian matrix of second derivatives of the network error with respect to the weights.
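The splitting step described above can be illustrated with a short sketch. The following Python fragment is a minimal, hypothetical illustration rather than the paper's implementation: it applies principal component analysis to the recent weight-update vectors of one hidden node and places two child nodes on either side of the parent along the first principal component. The function name, the `step` parameter, and the halving of the outgoing weights are assumptions made for the example.

```python
import numpy as np

def split_oscillating_node(w_in, w_out, update_history, step=0.5):
    """Hypothetical sketch of splitting one hidden node into two.

    w_in           : (d,)   incoming weight vector of the hidden node
    w_out          : (k,)   outgoing weights from the node
    update_history : (T, d) recent weight-update vectors for this node,
                     whose oscillation signals that more detail is needed
    step           : scale of the displacement along the principal
                     direction (an assumption for this sketch)
    """
    # Principal component analysis of the oscillating updates: the leading
    # eigenvector of their covariance gives the main direction in which the
    # weight vector is being pulled back and forth.
    deviations = update_history - update_history.mean(axis=0)
    cov = deviations.T @ deviations / max(len(update_history) - 1, 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    direction = eigvecs[:, -1]          # first principal component
    magnitude = np.sqrt(eigvals[-1])    # spread along that component

    # The two child nodes straddle the parent along the principal direction,
    # so each can cover one of the two main modes of the oscillation.
    w_in_a = w_in + step * magnitude * direction
    w_in_b = w_in - step * magnitude * direction

    # Halving the outgoing weights keeps the enlarged network's initial
    # output close to that of the parent node (a common choice for
    # approximately function-preserving splits, assumed here).
    return (w_in_a, 0.5 * w_out), (w_in_b, 0.5 * w_out)
```

In a constructive training loop of this kind, such a split would be applied to the nodes showing the strongest oscillation, after which training of the enlarged network resumes from the initialisation the split provides.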



Cite this article

Wynne-Jones, M. Node splitting: A constructive algorithm for feed-forward neural networks. Neural Comput & Applic 1, 17–22 (1993). https://doi.org/10.1007/BF01411371
