
Optimization Problems with Cardinality Constraints

Chapter in: Computational Intelligence in Optimization

Abstract

In this article, we review several hybrid techniques that can accurately and efficiently solve large optimization problems with cardinality constraints. Exact methods, such as branch-and-bound, require lengthy computations and are therefore infeasible in practice for large instances. As an alternative, this study focuses on approximate techniques that identify near-optimal solutions at a reduced computational cost. Most of the methods considered encode candidate solutions as sets. This representation, used in conjunction with specially devised search operators, is especially well suited to problems whose solution involves selecting an optimal subset of specified cardinality. The performance of these techniques is illustrated on optimization problems of practical interest arising in machine learning (pruning of ensembles of classifiers), quantitative finance (portfolio selection), time-series modeling (index tracking), and statistical data analysis (sparse principal component analysis).
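The set-based encoding mentioned above pairs naturally with recombination operators that always produce children of the required cardinality. The sketch below is a minimal, illustrative Python implementation of that idea, loosely in the spirit of set recombination: the function name, the repair step, and the toy portfolio example are assumptions made for illustration, not the authors' exact operators.

```python
import random


def set_crossover(parent_a, parent_b, k, universe, rng=random):
    """Recombine two subsets into a child of cardinality exactly k.

    Elements shared by both parents are inherited preferentially;
    remaining slots are filled from the symmetric difference and,
    if needed, from the rest of the universe (a simple repair step).
    """
    common = parent_a & parent_b
    # Keep at most k of the shared elements.
    child = set(rng.sample(sorted(common), min(len(common), k)))

    # Fill remaining slots from elements owned by exactly one parent.
    candidates = sorted((parent_a ^ parent_b) - child)
    rng.shuffle(candidates)
    for item in candidates:
        if len(child) == k:
            break
        child.add(item)

    # Repair: if the parents together supplied fewer than k distinct
    # elements, top up with random elements from the universe.
    spare = [x for x in universe if x not in child]
    rng.shuffle(spare)
    while len(child) < k and spare:
        child.add(spare.pop())

    return child


if __name__ == "__main__":
    # Toy example: portfolios holding exactly 3 out of 10 assets.
    rng = random.Random(0)
    assets = range(10)
    a, b = {0, 1, 2}, {2, 5, 7}
    print(set_crossover(a, b, k=3, universe=assets, rng=rng))
```

Because every child produced this way has exactly k elements, the cardinality constraint never needs to be enforced through penalty terms in the objective function.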





Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Ruiz-Torrubiano, R., García-Moratilla, S., Suárez, A. (2010). Optimization Problems with Cardinality Constraints. In: Tenne, Y., Goh, CK. (eds) Computational Intelligence in Optimization. Adaptation, Learning, and Optimization, vol 7. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12775-5_5

  • DOI: https://doi.org/10.1007/978-3-642-12775-5_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-12774-8

  • Online ISBN: 978-3-642-12775-5
