
Dynamic algorithm selection for Pareto optimal set approximation

Journal of Global Optimization

Abstract

This paper presents a meta-algorithm for approximating the Pareto optimal set of costly black-box multiobjective optimization problems given a limited budget of objective function evaluations. The key idea is to switch among different candidate algorithms during the optimization search, based on the performance each algorithm is predicted to achieve at that stage of the search. Algorithm performance is modeled with a machine learning technique using the information gathered so far, and the algorithm predicted to perform best is then run for a limited number of evaluations. The proposed approach is tested on several benchmark problems, and the results are compared against those obtained by running each of the candidate algorithms alone.
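
To make the selection loop concrete, the sketch below shows one way such a meta-algorithm could be organised; it is an illustrative assumption, not the authors' implementation. The candidate-algorithm interface (a run(archive, n_evals) method), the initial_sample, extract_features, and hypervolume helpers, and the choice of a random-forest regressor as the performance model are all hypothetical placeholders.

```python
# Minimal sketch of a dynamic algorithm-selection loop for expensive
# multiobjective optimization. All helpers referenced here (initial_sample,
# extract_features, hypervolume, and the algorithms' run() method) are
# hypothetical placeholders, not taken from the paper.
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def dynamic_selection(problem, algorithms, total_budget, slice_budget,
                      extract_features, hypervolume):
    """Repeatedly run the candidate algorithm whose performance gain over the
    next slice of the evaluation budget is predicted to be largest."""
    archive = problem.initial_sample()              # evaluated points so far
    model = RandomForestRegressor(n_estimators=200)
    X_hist, y_hist = [], []                         # (state features + choice) -> observed gain
    spent = len(archive)

    while spent < total_budget:
        state = extract_features(archive)           # summary of the current search state
        if X_hist:
            model.fit(np.vstack(X_hist), np.array(y_hist))
            scores = [model.predict(np.append(state, k).reshape(1, -1))[0]
                      for k in range(len(algorithms))]
            choice = int(np.argmax(scores))         # algorithm predicted to gain most
        else:
            choice = np.random.randint(len(algorithms))  # no training data yet

        before = hypervolume(archive)
        archive = algorithms[choice].run(archive, slice_budget)
        spent += slice_budget
        gain = hypervolume(archive) - before        # observed performance of this choice

        X_hist.append(np.append(state, choice))
        y_hist.append(gain)

    return archive
```

The essential design choice in this sketch is that the learned model maps a (search state, algorithm) pair to an expected gain in a quality indicator such as hypervolume, so the loop can switch to a different algorithm as the character of the search changes.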

Author information

Corresponding author

Correspondence to Ingrida Steponavičė.

About this article

Cite this article

Steponavičė, I., Hyndman, R.J., Smith-Miles, K. et al. Dynamic algorithm selection for Pareto optimal set approximation. J Glob Optim 67, 263–282 (2017). https://doi.org/10.1007/s10898-016-0420-x
