Abstract
An auto-encoder is a special type of artificial neural network (ANN) used to learn informative features from data. In the literature, the generalization performance of several machine learning models has been improved either by using auto-encoder based features alone or by augmenting the original features with them. The random vector functional link (RVFL) network also uses two types of features, i.e., original features and randomized features, which makes it a special randomized neural network; these hybrid features improve its generalization performance. In this paper, we introduce the idea of using additional features in robust energy-based least squares twin support vector machines (RELS-TSVM) and least squares twin support vector machines (LSTSVM). We use a sparse auto-encoder with \(L_{1}\)-norm regularization to learn an auxiliary feature representation from the original feature space. These additional features are concatenated with the original features to obtain an extended feature space, over which the conventional RELS-TSVM and LSTSVM are trained. Experiments demonstrate that auto-encoder based features improve the generalization capability of the conventional RELS-TSVM and LSTSVM models. To examine the performance of the proposed classifiers, i.e., extended RELS-TSVM (ext-RELS-TSVM) and extended LSTSVM (ext-LSTSVM), experiments were conducted on 15 UCI binary datasets; the results show that the proposed classifiers achieve better generalization performance than the baseline classifiers.
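The pipeline in the abstract can be sketched in a few lines. The sketch below is an illustrative assumption, not the authors' exact method: it builds a randomized sparse auto-encoder (random hidden layer, then an \(L_{1}\)-regularized least-squares reconstruction solved by iterative soft-thresholding in the style of ISTA/FISTA), uses the learned decoder weights as an encoding projection, and concatenates the resulting auxiliary features with the original ones to form the extended feature space fed to the downstream classifier. All function names and hyperparameters here are hypothetical.

```python
import numpy as np

def soft_threshold(Z, t):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

def sparse_autoencoder_features(X, n_hidden=32, lam=0.01, n_iter=200, seed=0):
    """Learn auxiliary features with a randomized sparse auto-encoder.

    Random hidden layer H = tanh(X W); decoder B found by ISTA on
    (1/2)||H B - X||_F^2 + lam * ||B||_1; B^T then re-used as encoder.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((d, n_hidden))
    H = np.tanh(X @ W)                      # random hidden activations
    L = np.linalg.norm(H, 2) ** 2           # Lipschitz constant: sigma_max(H)^2
    B = np.zeros((n_hidden, d))
    for _ in range(n_iter):
        grad = H.T @ (H @ B - X)            # gradient of the smooth term
        B = soft_threshold(B - grad / L, lam / L)
    return np.tanh(X @ B.T)                 # auxiliary (auto-encoder based) features

def extend_features(X, **kw):
    """Extended feature space: original features + auto-encoder features."""
    return np.hstack([X, sparse_autoencoder_features(X, **kw)])
```

A classifier such as LSTSVM would then simply be trained on `extend_features(X_train)` instead of `X_train`, leaving the classifier itself unchanged.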
Acknowledgment
This work is supported by the National Supercomputing Mission under DST and MeitY, Govt. of India, under Grant No. DST/NSM/R&D HPC Appl/2021/03.29; by the Department of Science and Technology under the Interdisciplinary Cyber Physical Systems (ICPS) Scheme, Grant No. DST/ICPS/CPS-Individual/2018/276; and by the Mathematical Research Impact-Centric Support (MATRICS) scheme, Grant No. MTR/2021/000787. Mr. Ashwani Kumar Malik acknowledges the financial support (File No. 09/1022 (0075)/2019-EMR-I) provided as a scholarship by the Council of Scientific and Industrial Research (CSIR), New Delhi, India. We gratefully acknowledge the Indian Institute of Technology Indore for providing facilities and support.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Malik, A.K., Ganaie, M.A., Tanveer, M., Suganthan, P.N. (2023). Support Vector Machine Based Models with Sparse Auto-encoder Based Features for Classification Problem. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Lecture Notes in Computer Science, vol 13623. Springer, Cham. https://doi.org/10.1007/978-3-031-30105-6_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-30104-9
Online ISBN: 978-3-031-30105-6