Abstract
This paper presents a study of the noise-tolerance characteristics of an adaptive learning algorithm for supervised neural networks. The algorithm allows existing knowledge to age out at a slow rate as a supervised neural network is gradually retrained with consecutive sets of new samples, modeling a change of application locality within a consistent environment. The algorithm uses the contour preserving classification algorithm to pre-process the training data, improving both classification accuracy and noise tolerance. The experimental results confirm the effectiveness of the algorithm and the improvement in noise tolerance.
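The core retraining loop described above mixes a slowly decaying subset of prior training samples with each new sample set. The following is a minimal sketch of that idea only, not the authors' actual procedure; the function name, the `decay_rate` parameter, and the uniform subsampling policy are all assumptions made for illustration.

```python
import random

def build_retraining_set(prior_samples, new_samples, decay_rate, rng=None):
    """Mix a decayed subset of the prior training set with new samples.

    Hypothetical sketch: at each retraining round a fraction
    (1 - decay_rate) of the prior samples is retained, so old knowledge
    ages out gradually over successive rounds rather than being
    discarded at once.
    """
    rng = rng or random.Random(0)
    # Number of prior samples to keep this round.
    keep = max(0, int(round(len(prior_samples) * (1.0 - decay_rate))))
    retained = rng.sample(list(prior_samples), keep)
    # The retrained network sees both aged prior knowledge and new data.
    return retained + list(new_samples)
```

Applied repeatedly, the prior set's contribution shrinks geometrically, which is one simple way to realize the "slow aging" behavior the abstract describes.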
© 2009 Springer-Verlag Berlin Heidelberg
Fuangkhon, P., Tanprasert, T. (2009). An Adaptive Learning Algorithm for Supervised Neural Network with Contour Preserving Classification. In: Deng, H., Wang, L., Wang, F.L., Lei, J. (eds) Artificial Intelligence and Computational Intelligence. AICI 2009. Lecture Notes in Computer Science, vol 5855. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-05253-8_43
DOI: https://doi.org/10.1007/978-3-642-05253-8_43
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-05252-1
Online ISBN: 978-3-642-05253-8