Abstract
We propose a new method for optimizing the hyperparameters of a general Gaussian kernel in support vector regression. The hyperparameters are constrained to lie on a differentiable manifold, and the proposed optimization technique is a gradient-like descent algorithm adapted to the geometric structure of the manifold of symmetric positive-definite matrices. We compare the performance of our approach with classical support vector regression on real-world data sets. Experiments demonstrate that the optimization improves prediction accuracy and reduces the number of support vectors.
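To illustrate the kind of geometry-aware update the abstract describes, the sketch below performs one Riemannian gradient step on the manifold of symmetric positive-definite (SPD) matrices, using the affine-invariant exponential-map retraction so the kernel parameter matrix remains SPD. The kernel form, the gradient, and the step size are illustrative assumptions, not the paper's actual objective or algorithm.

```python
# Hedged sketch: a generalized Gaussian kernel parameterized by an SPD
# matrix M, and one gradient-like descent step that respects the SPD
# manifold's geometry. The gradient values below are hypothetical.
import numpy as np
from scipy.linalg import expm, sqrtm

def general_gaussian_kernel(x, y, M):
    """k(x, y) = exp(-(x - y)^T M (x - y)) with M symmetric positive-definite."""
    d = x - y
    return np.exp(-d @ M @ d)

def spd_gradient_step(M, euclid_grad, step=0.1):
    """Retract a descent step onto the SPD manifold via the matrix
    exponential: M_new = M^{1/2} expm(-step * M^{-1/2} G M^{-1/2}) M^{1/2}.
    Unlike a plain Euclidean update, this cannot leave the SPD cone."""
    G = 0.5 * (euclid_grad + euclid_grad.T)  # symmetrize the gradient
    M_half = np.real(sqrtm(M))
    M_half_inv = np.linalg.inv(M_half)
    inner = -step * (M_half_inv @ G @ M_half_inv)
    return M_half @ expm(inner) @ M_half

# Tiny usage example with a hypothetical objective gradient.
M = np.eye(2)
grad = np.array([[0.5, 0.1], [0.1, -0.3]])
M_new = spd_gradient_step(M, grad)
print(np.linalg.eigvalsh(M_new))  # eigenvalues stay strictly positive
```

Even with an indefinite Euclidean gradient, the update preserves positive-definiteness, which is the practical reason for working on the manifold rather than in the flat space of matrices.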
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Abdallah, F., Snoussi, H., Laanaya, H., Lengellé, R. (2013). Learning General Gaussian Kernel Hyperparameters for SVR. In: Nielsen, F., Barbaresco, F. (eds) Geometric Science of Information. GSI 2013. Lecture Notes in Computer Science, vol 8085. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40020-9_75
Print ISBN: 978-3-642-40019-3
Online ISBN: 978-3-642-40020-9
eBook Packages: Computer Science (R0)