
Optimality Conditions for Sparse Quadratic Optimization Problem

  • Conference paper
EngOpt 2018 Proceedings of the 6th International Conference on Engineering Optimization (EngOpt 2018)


Abstract

Sparse models are preferred in machine learning because of their computational efficiency and interpretability; they appear in many applications, such as Google PageRank, classification and regression problems, Principal Component Analysis (PCA), which identifies the most important features, and graphical models. In this study, we derive optimality conditions for a quadratic problem with a cardinality constraint that enforces a sparse solution. Our quadratic model is a special application of the ensemble pruning model in ensemble learning algorithms; here, we refer to our previous study on this application in ensemble selection for clustering problems. The quadratic model proposed in this study optimizes the trade-off between the accuracy and the diversity of the discriminant functions (classifiers) simultaneously, so that the best candidates of the ensemble are selected for the prediction step. Selecting the best classifiers in the ensemble is crucial for the overall performance of ensemble learning algorithms, since redundant or outlier solutions in the ensemble library decrease the overall prediction accuracy. To eliminate such candidates, both accuracy and diversity are taken into account when selecting the best subset of the ensemble. The cardinality constraint is further relaxed through various approximations, such as \(l_1\)-norm regularization and the Student's t log-likelihood approximation. Under these considerations and approximations, we establish optimality criteria for our quadratic optimization problem with a cardinality constraint.
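The paper's own formulation is only summarized in the abstract, but the \(l_1\)-norm relaxation it mentions can be illustrated generically: replace a cardinality constraint \(\|w\|_0 \le k\) on a quadratic objective with an \(l_1\) penalty, which yields a convex problem solvable by proximal gradient descent (ISTA). The sketch below is an assumption-laden illustration of that standard technique, not the authors' method; all function names are hypothetical.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrinks each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_quadratic_l1(Q, b, lam, step=None, iters=500):
    """ISTA for  min_w  0.5 w^T Q w - b^T w + lam * ||w||_1,
    an l1 relaxation of the cardinality-constrained problem ||w||_0 <= k.
    Q is assumed symmetric positive semidefinite."""
    n = Q.shape[0]
    if step is None:
        # Step size 1/L, with L the largest eigenvalue of Q
        # (the Lipschitz constant of the smooth part's gradient).
        step = 1.0 / np.linalg.eigvalsh(Q)[-1]
    w = np.zeros(n)
    for _ in range(iters):
        grad = Q @ w - b          # gradient of the smooth quadratic part
        w = soft_threshold(w - step * grad, step * lam)
    return w
```

For example, with `Q` the identity matrix the iteration reduces to soft-thresholding `b` directly, so entries of `b` smaller than `lam` in magnitude are set exactly to zero, which is the sparsifying effect the relaxation is meant to achieve.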



Author information

Corresponding author

Correspondence to Duygu Üçüncü.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Üçüncü, D., Akyüz, S., Gül, E., Weber, G.-W. (2019). Optimality Conditions for Sparse Quadratic Optimization Problem. In: Rodrigues, H., et al. EngOpt 2018 Proceedings of the 6th International Conference on Engineering Optimization. EngOpt 2018. Springer, Cham. https://doi.org/10.1007/978-3-319-97773-7_67


  • DOI: https://doi.org/10.1007/978-3-319-97773-7_67

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-97772-0

  • Online ISBN: 978-3-319-97773-7

  • eBook Packages: Engineering (R0)
