Abstract
We study the problem of learning from examples (supervised learning) through the lens of function approximation theory. When approximation problems are formulated as regularized minimization problems with kernel-based stabilizers, the solution can be derived in closed form as a linear combination of kernel functions, i.e., a one-hidden-layer feed-forward neural network. Building on Aronszajn's results on sums and products of kernels, we derive two new approximation schemas: the Sum Kernel Regularization Network and the Product Kernel Regularization Network. We present concrete applications of the derived schemas, demonstrate their performance experimentally, and compare them with classical solutions, which they outperform on many tasks.
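The closed-form solution mentioned in the abstract, a kernel expansion whose coefficients solve a regularized linear system, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Gaussian kernel choice, the toy data, and all parameter values are assumptions made for the example.

```python
import numpy as np

def gaussian_kernel(X, Y, width):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def fit_regularization_network(K, y, reg):
    # Coefficients c of the kernel expansion f(x) = sum_i c_i K(x, x_i),
    # obtained by solving the regularized linear system (K + reg*I) c = y.
    n = K.shape[0]
    return np.linalg.solve(K + reg * np.eye(n), y)

# Toy 1-D regression data (hypothetical, not from the paper's experiments).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)

# By Aronszajn's results, both the sum and the product of two kernels
# are again kernels, so either can serve as the network's kernel.
K1 = gaussian_kernel(X, X, width=0.5)
K2 = gaussian_kernel(X, X, width=2.0)
c_sum = fit_regularization_network(K1 + K2, y, reg=1e-3)   # Sum Kernel RN
c_prod = fit_regularization_network(K1 * K2, y, reg=1e-3)  # Product Kernel RN
```

Combining two kernel widths this way lets a single network mix a narrow (local) and a wide (smooth) component, which is the intuition behind the sum and product schemas.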
References
Cucker, F., Smale, S.: On the mathematical foundations of learning. Bulletin of the American Mathematical Society 39, 1–49 (2001)
Girosi, F.: An equivalence between sparse approximation and support vector machines. Technical report, Massachusetts Institute of Technology, A.I. Memo No. 1606 (1997)
Girosi, F., Jones, M., Poggio, T.: Regularization theory and neural networks architectures. Neural Computation 7(2), 219–269 (1995)
Poggio, T., Smale, S.: The mathematics of learning: Dealing with data. Notices of the AMS 50, 536–544 (2003)
Kudová, P.: Learning with kernel based regularization networks. In: Information Technologies - Applications and Theory, pp. 83–92 (2005)
Aronszajn, N.: Theory of reproducing kernels. Transactions of the AMS 68, 337–404 (1950)
Šidlofová, T.: Existence and uniqueness of minimization problems with Fourier based stabilizers. In: Proceedings of Compstat, Prague (2004)
Šámalová, T., Kudová, P.: Sum and product kernel networks. Technical report, Institute of Computer Science, AS CR (2005)
Prechelt, L.: PROBEN1 – a set of benchmarks and benchmarking rules for neural network training algorithms. Technical Report 21/94, Universitaet Karlsruhe (1994)
LAPACK: Linear algebra package, http://www.netlib.org/lapack/
PAPI: Performance appl. prog. interface, http://icl.cs.utk.edu/papi/
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Kudová, P., Šámalová, T. (2006). Sum and Product Kernel Regularization Networks. In: Rutkowski, L., Tadeusiewicz, R., Zadeh, L.A., Żurada, J.M. (eds) Artificial Intelligence and Soft Computing – ICAISC 2006. Lecture Notes in Computer Science, vol 4029. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11785231_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-35748-3
Online ISBN: 978-3-540-35750-6