An Improved SVM Based on Feature Extension and Feature Selection

Abstract:

Support Vector Machine (SVM) implicitly maps samples from a lower-dimensional feature space to a higher-dimensional space and obtains a non-linear classifier by optimizing a linear classifier in the higher-dimensional space. This paper proposes an improved SVM method based on feature extension and feature selection. The method explicitly maps the samples to a higher-dimensional feature space, performs feature selection in that space, and finally trains a linear classifier on the selected feature set. We explain why this technique improves generalization ability. Experimental results on benchmark datasets show that the improved SVM substantially decreases the error rate compared with other classifiers, which demonstrates the feasibility of the proposed method.
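The pipeline the abstract describes — explicit mapping to a higher-dimensional space, feature selection in that space, then a linear classifier — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the degree-2 polynomial expansion, the Fisher-style selection score, and the perceptron trainer are all assumptions standing in for the mapping, selection criterion, and linear learner, which this preview does not specify.

```python
import numpy as np

def expand_features(X):
    """Explicit feature extension: append every degree-2 product
    x_i * x_j (i <= j) to the original features."""
    n, d = X.shape
    quad = [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.hstack([X, np.column_stack(quad)])

def select_features(X, y, k):
    """Rank expanded features by a simple Fisher-style score
    (class-mean gap over overall spread) and keep the top k."""
    mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
    score = np.abs(mu0 - mu1) / (X.std(0) + 1e-12)
    return np.argsort(score, kind="stable")[::-1][:k]

def train_perceptron(X, y, epochs=100, lr=0.1):
    """Fit a linear classifier in the selected feature space."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # bias column
    t = np.where(y == 1, 1.0, -1.0)
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, ti in zip(Xb, t):
            if ti * (w @ xi) <= 0:           # misclassified -> update
                w += lr * ti * xi
    return w

# Toy XOR data: not linearly separable in the original 2-D space,
# but separable after the explicit quadratic extension.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0, 1, 1, 0])

Xe = expand_features(X)            # 2 -> 5 features: x1, x2, x1^2, x1*x2, x2^2
idx = select_features(Xe, y, k=3)  # keeps the discriminative x1*x2 cross-term
w = train_perceptron(Xe[:, idx], y)
Xb = np.hstack([Xe[:, idx], np.ones((len(X), 1))])
pred = (Xb @ w > 0).astype(int)
print(pred)  # prints [0 1 1 0], matching y
```

The XOR example shows why the explicit extension helps: no linear classifier separates the four points in the original space, but the cross-term x1*x2 created by the extension makes them linearly separable, and feature selection then keeps only the terms that discriminate between the classes.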

Pages: 128-132

Online since: June 2014
