
Support Vector Machine Based Models with Sparse Auto-encoder Based Features for Classification Problem

  • Conference paper
Neural Information Processing (ICONIP 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13623)


Abstract

An auto-encoder is a special type of artificial neural network (ANN) used to learn informative features from data. In the literature, the generalization performance of several machine learning models has been improved either by using auto-encoder based features or by using high-dimensional features (original + auto-encoder based features). The random vector functional link (RVFL) network also uses two types of features, i.e., original features and randomized features, which makes it a special randomized neural network. These hybrid features improve the generalization performance of the RVFL network. In this paper, we introduce the idea of using additional features in robust energy-based least squares twin support vector machines (RELS-TSVM) and least squares twin support vector machines (LSTSVM). We use a sparse auto-encoder with \(L_{1}\) norm regularization to learn an auxiliary feature representation from the original feature space. These additional features are concatenated with the original features to obtain an extended feature space, over which the conventional RELS-TSVM and LSTSVM are trained. Experiments demonstrate that auto-encoder based features improve the generalization capability of the conventional RELS-TSVM and LSTSVM models. To examine the performance of the proposed classifiers, i.e., extended RELS-TSVM (ext-RELS-TSVM) and extended LSTSVM (ext-LSTSVM), experiments were conducted on 15 UCI binary datasets, and the results show that the proposed classifiers have better generalization performance than the baseline classifiers.
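To illustrate the feature-extension idea described in the abstract, the following Python sketch trains a minimal single-hidden-layer sparse auto-encoder (here with an \(L_{1}\) penalty on the hidden activations) and concatenates the learned features with the original ones. All names, sizes, and hyper-parameters are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def train_sparse_autoencoder(X, n_hidden=8, l1=1e-3, lr=0.05, epochs=200, seed=0):
    """Single-hidden-layer auto-encoder with an L1 sparsity penalty on the
    hidden activations, trained by plain gradient descent (illustrative)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.1, size=(d, n_hidden))   # encoder weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.1, size=(n_hidden, d))   # decoder weights
    b2 = np.zeros(d)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)        # hidden (learned) features
        R = H @ W2 + b2                 # linear reconstruction
        E = R - X                       # reconstruction error
        # Gradients of mean squared loss + (l1/n) * sum(|H|)
        dR = 2 * E / n
        dW2 = H.T @ dR
        db2 = dR.sum(axis=0)
        dH = dR @ W2.T + l1 * np.sign(H) / n
        dZ = dH * (1 - H ** 2)          # tanh derivative
        dW1 = X.T @ dZ
        db1 = dZ.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1

def extend_features(X, W1, b1):
    """Concatenate original features with auto-encoder features."""
    H = np.tanh(X @ W1 + b1)
    return np.hstack([X, H])

# Usage: learn features, build the extended space, then train the
# classifier (e.g. LSTSVM or RELS-TSVM) on X_ext instead of X.
X = np.random.default_rng(1).normal(size=(100, 5))
W1, b1 = train_sparse_autoencoder(X)
X_ext = extend_features(X, W1, b1)
print(X_ext.shape)  # (100, 13): 5 original + 8 learned features
```

The extended matrix `X_ext` plays the role of the paper's extended feature space: any downstream classifier is trained on it in place of the original `X`. (The paper optimizes the sparse auto-encoder with FISTA rather than plain gradient descent; the simpler update is used here only to keep the sketch short.)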



Acknowledgment

This work is supported by the National Supercomputing Mission under DST and MeitY, Govt. of India, under Grant No. DST/NSM/R&D HPC Appl/2021/03.29, and by the Department of Science and Technology under the Interdisciplinary Cyber Physical Systems (ICPS) Scheme, Grant No. DST/ICPS/CPS-Individual/2018/276, and the Mathematical Research Impact-Centric Support (MATRICS) scheme, Grant No. MTR/2021/000787. Mr. Ashwani Kumar Malik acknowledges the financial support (File No. 09/1022(0075)/2019-EMR-I) provided as a scholarship by the Council of Scientific and Industrial Research (CSIR), New Delhi, India. We gratefully acknowledge the Indian Institute of Technology Indore for providing facilities and support.

Author information


Corresponding author

Correspondence to A. K. Malik.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Malik, A.K., Ganaie, M.A., Tanveer, M., Suganthan, P.N. (2023). Support Vector Machine Based Models with Sparse Auto-encoder Based Features for Classification Problem. In: Tanveer, M., Agarwal, S., Ozawa, S., Ekbal, A., Jatowt, A. (eds) Neural Information Processing. ICONIP 2022. Lecture Notes in Computer Science, vol 13623. Springer, Cham. https://doi.org/10.1007/978-3-031-30105-6_21

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-30105-6_21


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-30104-9

  • Online ISBN: 978-3-031-30105-6

  • eBook Packages: Computer Science; Computer Science (R0)
