Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm

Journal of Global Optimization

A Correction to this article was published on 09 March 2018

Abstract

Many engineering problems require the optimization of expensive, black-box functions with multiple conflicting criteria, for which commonly used methods such as multiobjective genetic algorithms are inadequate. Several surrogate-based algorithms have been developed to tackle this problem, but these often have drawbacks such as requiring a priori knowledge of the output functions or a computational cost that scales exponentially with the number of objectives. In this paper a new algorithm, TSEMO, is proposed, which uses Gaussian processes as surrogates. The Gaussian processes are sampled using spectral sampling techniques, enabling Thompson sampling to be combined with the hypervolume quality indicator and NSGA-II to choose a new evaluation point at each iteration. The reference point required for the hypervolume calculation is estimated within TSEMO. Furthermore, a simple extension is proposed to carry out batch-sequential design. TSEMO was compared to ParEGO, an expected hypervolume implementation, and NSGA-II on nine test problems with a budget of 150 function evaluations. Overall, TSEMO shows promising performance: it is a simple algorithm that requires no a priori knowledge, reduces the hypervolume calculations so that their cost scales approximately linearly with the number of objectives, handles noise, and supports batch-sequential use.
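To make the loop described above concrete, the following is a minimal Python sketch of one TSEMO-style iteration. It is not the authors' implementation (the released TS-EMO toolbox, reference 58, is written in MATLAB): it assumes squared-exponential kernels with fixed hyperparameters, restricts the hypervolume computation to two objectives under minimization, uses a random candidate search as a stand-in for NSGA-II, and estimates the reference point with a crude range-based margin. All function names here (spectral_sample, pareto_mask, hypervolume_2d, tsemo_step) are illustrative, not from the paper.

```python
# Sketch of one TSEMO-style iteration (illustrative only; the authors'
# MATLAB code is at https://github.com/Eric-Bradford/TS-EMO).
import numpy as np


def spectral_sample(X, y, n_features=200, lengthscale=0.3, noise=1e-4, rng=None):
    """Draw one approximate GP posterior sample via random Fourier features
    (Rahimi and Recht, 2007): f(x) = phi(x)^T theta with Bayesian-linear theta."""
    rng = np.random.default_rng() if rng is None else rng
    W = rng.normal(0.0, 1.0 / lengthscale, size=(n_features, X.shape[1]))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    phi = lambda Z: np.sqrt(2.0 / n_features) * np.cos(np.atleast_2d(Z) @ W.T + b)
    P = phi(X)
    A = P.T @ P + noise * np.eye(n_features)      # scaled posterior precision
    theta = rng.multivariate_normal(np.linalg.solve(A, P.T @ y),
                                    noise * np.linalg.inv(A))
    return lambda Z: phi(Z) @ theta               # deterministic sampled function


def pareto_mask(F):
    """Boolean mask of the nondominated rows of F (minimization)."""
    return np.array([not np.any(np.all(F <= f, axis=1) & np.any(F < f, axis=1))
                     for f in F])


def hypervolume_2d(front, ref):
    """Hypervolume dominated by a 2-D point set w.r.t. reference point ref."""
    hv, f2_floor = 0.0, ref[1]
    for f1, f2 in sorted(map(tuple, front)):      # ascending first objective
        if f1 < ref[0] and f2 < f2_floor:         # nondominated, inside ref box
            hv += (ref[0] - f1) * (f2_floor - f2)
            f2_floor = f2
    return hv


def tsemo_step(X, Y, bounds, n_cand=2000, rng=None):
    """Sample each objective's GP, approximate the Pareto front of the samples,
    and return the candidate giving the largest hypervolume improvement."""
    rng = np.random.default_rng() if rng is None else rng
    samples = [spectral_sample(X, Y[:, j], rng=rng) for j in range(Y.shape[1])]
    cand = rng.uniform(bounds[0], bounds[1], size=(n_cand, X.shape[1]))
    F = np.column_stack([s(cand) for s in samples])
    m = pareto_mask(F)
    front, x_front = F[m], cand[m]
    # Crude stand-in for TSEMO's internal reference-point estimate.
    all_f = np.vstack([Y, front])
    ref = all_f.max(axis=0) + 0.1 * (all_f.max(axis=0) - all_f.min(axis=0))
    y_par = Y[pareto_mask(Y)]
    hv0 = hypervolume_2d(y_par, ref)
    gains = [hypervolume_2d(np.vstack([y_par, f]), ref) - hv0 for f in front]
    return x_front[int(np.argmax(gains))]
```

In a full loop the returned point is evaluated on the true objectives, appended to (X, Y), and fresh GP samples are drawn at the next iteration; replacing the random candidate search with an actual NSGA-II run over the sampled functions, and selecting several points from the sampled front per iteration, would mirror the sequential and batch-sequential designs described in the abstract.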


Change history

  • 09 March 2018

This note lists changes to the article by Bradford et al. During production a number of figures were placed in the wrong positions: they appear in the wrong sections with the wrong captions, although the in-text references still point to the correct figures.

References

  1. Farmani, R., Savic, D., Walters, G.: Evolutionary multi-objective optimization in water distribution network design. Eng. Optim. 37(2), 167–183 (2005)

  2. Censor, Y.: Pareto optimality in multiobjective problems. Appl. Math. Optim. 4(1), 41–59 (1977)

  3. Zitzler, E., Thiele, L.: Multiobjective evolutionary algorithms: a comparative case study and the strength Pareto approach. IEEE Trans. Evol. Comput. 3(4), 257–271 (1999)

  4. Ehrgott, M.: Multiobjective optimization. AI Mag. 29(4), 47 (2009)

  5. Hwang, C.-R.: Simulated annealing: theory and applications. Acta Appl. Math. 12(1), 108–111 (1988)

  6. Davis, L.: Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York (1991)

  7. Kennedy, J.: Particle swarm optimization. In: Sammut, C., Webb, G.I. (eds.) Encyclopedia of Machine Learning, pp. 760–766. Springer, Berlin (2011)

  8. Sacks, J., Welch, W.J., Mitchell, T.J., Wynn, H.P.: Design and analysis of computer experiments. Stat. Sci. 4(4), 409–423 (1989)

  9. Simpson, T.W., Booker, A.J., Ghosh, D., Giunta, A.A., Koch, P.N., Yang, R.-J.: Approximation methods in multidisciplinary analysis and optimization: a panel discussion. Struct. Multidiscipl. Optim. 27(5), 302–313 (2004)

  10. McKay, M.D., Beckman, R.J., Conover, W.J.: A comparison of three methods for selecting values of input variables in the analysis of output from a computer code. Technometrics 42(1), 55–61 (2000)

  11. Shahriari, B., Swersky, K., Wang, Z., Adams, R.P., de Freitas, N.: Taking the human out of the loop: a review of Bayesian optimization. Proc. IEEE 104(1), 148–175 (2016)

  12. Jones, D.R., Schonlau, M., Welch, W.J.: Efficient global optimization of expensive black-box functions. J. Glob. Optim. 13(4), 455–492 (1998)

  13. Močkus, J.: On Bayesian methods for seeking the extremum. In: Optimization Techniques IFIP Technical Conference, pp. 400–404. Springer (1975)

  14. Łaniewski-Wołłk, Ł.: Relative Expected Improvement in Kriging Based Optimization. arXiv preprint arXiv:0908.3321 (2009)

  15. Jones, D.R.: A taxonomy of global optimization methods based on response surfaces. J. Glob. Optim. 21(4), 345–383 (2001)

  16. Forrester, A.I., Keane, A.J.: Recent advances in surrogate-based optimization. Prog. Aerosp. Sci. 45(1), 50–79 (2009)

  17. Voutchkov, I., Keane, A.: Multi-objective optimization using surrogates. In: Tenne, Y., Goh, C.K. (eds.) Computational Intelligence in Optimization, pp. 155–175. Springer, Berlin (2010)

  18. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 6(2), 182–197 (2002)

  19. Knowles, J.: ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. IEEE Trans. Evol. Comput. 10(1), 50–66 (2006)

  20. Ponweiser, W., Wagner, T., Biermann, D., Vincze, M.: Multiobjective optimization on a limited budget of evaluations using model-assisted \(\mathcal{S}\)-metric selection. In: International Conference on Parallel Problem Solving from Nature, pp. 784–794. Springer (2008)

  21. Kushner, H.J.: A new method of locating the maximum point of an arbitrary multipeak curve in the presence of noise. J. Basic Eng. 86(1), 97–106 (1964)

  22. Keane, A.J.: Statistical improvement criteria for use in multiobjective design optimization. AIAA J. 44(4), 879–891 (2006)

  23. Emmerich, M., Klinkenberg, J.-W.: The computation of the expected improvement in dominated hypervolume of Pareto front approximations. Technical report, Leiden University (2008)

  24. Emmerich, M.: Single- and multi-objective evolutionary design optimization assisted by Gaussian random field metamodels. Ph.D. thesis, University of Dortmund (2005)

  25. Emmerich, M., Deutz, A., Klinkenberg, J.-W.: Hypervolume-based expected improvement: monotonicity properties and exact computation. In: IEEE Congress on Evolutionary Computation, pp. 2147–2154. IEEE (2011)

  26. Hupkens, I., Emmerich, M., Deutz, A.: Faster Computation of Expected Hypervolume Improvement. arXiv preprint arXiv:1408.7114 (2014)

  27. Emmerich, M., Yang, K., Deutz, A., Wang, H., Fonseca, C.M.: A multicriteria generalization of Bayesian global optimization. In: Pardalos, P., Zhigljavsky, A., Žilinskas, J. (eds.) Advances in Stochastic and Deterministic Global Optimization, pp. 229–242. Springer, Berlin (2016)

  28. Yang, K., Emmerich, M., Deutz, A., Fonseca, C.M.: Computing 3-D expected hypervolume improvement and related integrals in asymptotically optimal time. In: International Conference on Evolutionary Multi-Criterion Optimization, pp. 685–700. Springer (2017)

  29. Thompson, W.R.: On the likelihood that one unknown probability exceeds another in view of the evidence of two samples. Biometrika 25(3/4), 285–294 (1933)

  30. Zhigljavsky, A., Zilinskas, A.: Stochastic Global Optimization, vol. 9. Springer, Berlin (2007)

  31. Žilinskas, A.: A statistical model-based algorithm for ‘black-box’ multi-objective optimisation. Int. J. Syst. Sci. 45(1), 82–93 (2014)

  32. Helmdach, D., Yaseneva, P., Heer, P.K., Schweidtmann, A.M., Lapkin, A.: A multi-objective optimisation including results of life cycle assessment in developing bio-renewables-based processes. ChemSusChem 10(18), 3632–3643 (2017)

  33. Rasmussen, C.E.: Gaussian processes in machine learning. In: Bousquet, O., von Luxburg, U., Rätsch, G. (eds.) Advanced Lectures on Machine Learning. Lecture Notes in Artificial Intelligence, vol. 3176, pp. 63–71 (2004)

  34. Rasmussen, C.E., Williams, C.K.I.: Gaussian Processes for Machine Learning. MIT Press, Cambridge (2005)

  35. Ebden, M.: Gaussian processes: a quick introduction. arXiv preprint arXiv:1505.02965 (2015)

  36. Rasmussen, C.E.: Gaussian Processes for Machine Learning. MIT Press, Cambridge (2006)

  37. Yang, X., Maciejowski, J.M.: Fault tolerant control using Gaussian processes and model predictive control. Int. J. Appl. Math. Comput. Sci. 25(1), 133–148 (2015)

  38. Matérn, B.: Spatial Variation, vol. 36. Springer, Berlin (2013)

  39. Sundararajan, S., Keerthi, S.S.: Predictive approaches for choosing hyperparameters in Gaussian processes. Neural Comput. 13(5), 1103–1118 (2001)

  40. Hernández-Lobato, J.M., Hoffman, M.W., Ghahramani, Z.: Predictive entropy search for efficient global optimization of black-box functions. In: Ghahramani, Z., Welling, M., Cortes, C., Lawrence, N.D., Weinberger, K.Q. (eds.) Advances in Neural Information Processing Systems 27, vol. 1, pp. 918–926. MIT Press, Cambridge (2014)

  41. Bochner, S.: Lectures on Fourier Integrals, vol. 42. Princeton University Press, Princeton (1959)

  42. Rahimi, A., Recht, B.: Random features for large-scale kernel machines. In: Platt, J.C., Koller, D., Singer, Y., Roweis, S.T. (eds.) Advances in Neural Information Processing Systems 20, pp. 1177–1184. MIT Press, Cambridge (2007)

  43. Lázaro-Gredilla, M., Quiñonero-Candela, J., Rasmussen, C.E., Figueiras-Vidal, A.R.: Sparse spectrum Gaussian process regression. J. Mach. Learn. Res. 11(6), 1865–1881 (2010)

  44. Chapelle, O., Li, L.: An empirical evaluation of Thompson sampling. In: Shawe-Taylor, J., Zemel, R.S., Bartlett, P.L., Pereira, F., Weinberger, K.Q. (eds.) Advances in Neural Information Processing Systems 24, pp. 2249–2257. MIT Press, Cambridge (2011)

  45. Scott, S.L.: A modern Bayesian look at the multi-armed bandit. Appl. Stoch. Models Bus. Ind. 26(6), 639–658 (2010)

  46. Shahriari, B., Wang, Z., Hoffman, M.W., Bouchard-Côté, A., de Freitas, N.: An entropy search portfolio for Bayesian optimization. arXiv preprint arXiv:1406.4625 (2014)

  47. Kaufmann, E., Korda, N., Munos, R.: Thompson sampling: an asymptotically optimal finite-time analysis. In: International Conference on Algorithmic Learning Theory, pp. 199–213. Springer (2012)

  48. Agrawal, S., Goyal, N.: Thompson sampling for contextual bandits with linear payoffs. In: 30th International Conference on Machine Learning (ICML-13), pp. 127–135 (2013)

  49. Yahyaa, S., Manderick, B.: Thompson sampling for multi-objective multi-armed bandits problem. In: 2015 European Symposium on Artificial Neural Networks, pp. 47–52. Presses universitaires de Louvain (2015)

  50. Stein, M.: Large sample properties of simulations using Latin hypercube sampling. Technometrics 29(2), 143–151 (1987)

  51. Emmerich, M., Fonseca, C.: Computing hypervolume contributions in low dimensions: asymptotically optimal algorithm and complexity results. In: Takahashi, R.H.C., Deb, K., Wanner, E.F., Greco, S. (eds.) Evolutionary Multi-Criterion Optimization, pp. 121–135. Springer, Berlin (2011)

  52. Bader, J., Deb, K., Zitzler, E.: Faster hypervolume-based search using Monte Carlo sampling. In: Multiple Criteria Decision Making for Sustainable Energy and Transportation Systems: Proceedings of the 19th International Conference on Multiple Criteria Decision Making. pp. 313–326. Springer, Berlin (2010)

  53. Finkel, D.E.: DIRECT Optimization Algorithm User Guide. http://www4.ncsu.edu/~ctk/Finkel_Direct/DirectUserGuide_pdf.pdf. Accessed 26 Nov 2017 (2003)

  54. Lin, S.: NGPM a NSGA-II Program in Matlab. Codelooker. http://www.codelooker.com/dfilec/6987NGPMv11.4/NGPMmanualv1.4.pdf. Accessed 26 Nov 2017 (2011)

  55. Rasmussen, C.E., Nickisch, H.: The GPML Toolbox Version 3.5. http://mlg.eng.cam.ac.uk/carl/gpml/doc/manual.pdf. Accessed 26 Nov 2017 (2014)

  56. Yi, C.: Pareto Set. Matlab File Exchange. https://se.mathworks.com/matlabcentral/fileexchange/15181-pareto-set. Accessed 26 Nov 2017 (2014)

  57. Yi, C.: Hypervolume Indicator. Matlab File Exchange. https://se.mathworks.com/matlabcentral/fileexchange/19651-hypervolume-indicator. Accessed 26 Nov 2017 (2008)

  58. Bradford, E., Schweidtmann, A.M.: TS-EMO. GitHub. https://github.com/Eric-Bradford/TS-EMO. Accessed 29 Nov 2017 (2017)

  59. Schaffer, J.D.: Some Experiments in Machine Learning Using Vector Evaluated Genetic Algorithms. Vanderbilt University, Nashville (1985)

  60. Cristescu, C.: ParEGO_Eigen. GitHub. https://github.com/CristinaCristescu/ParEGO_Eigen/graphs/contributors. Accessed 26 Nov 2017 (2015)

  61. Gorissen, D., Couckuyt, I., Demeester, P., Dhaene, T., Crombecq, K.: A surrogate modeling and adaptive sampling toolbox for computer based design. J. Mach. Learn. Res. 11(Jul), 2051–2055 (2010)

  62. Cristescu, C., Knowles, J.: Surrogate-Based Multiobjective Optimization: ParEGO Update and Test

  63. Zitzler, E., Thiele, L., Laumanns, M., Fonseca, C.M., Da Fonseca, V.G.: Performance assessment of multiobjective optimizers: an analysis and review. IEEE Trans. Evol. Comput. 7(2), 117–132 (2003)

  64. Knowles, J., Thiele, L., Zitzler, E.: A tutorial on the performance assessment of stochastic multiobjective optimizers. Tik Rep. 214, 327–332 (2006)

  65. Riquelme, N., Von Lücken, C., Baran, B.: Performance metrics in multi-objective optimization. In: 2015 Latin American Computing Conference (CLEI), pp. 1–11. IEEE (2015)

  66. Ishibuchi, H., Masuda, H., Tanigaki, Y., Nojima, Y.: Modified Distance Calculation in Generational Distance and Inverted Generational Distance. Springer, Cham (2015)

  67. Zhou, A., Jin, Y., Zhang, Q., Sendhoff, B., Tsang, E.: Combining model-based and genetics-based offspring generation for multi-objective optimization using a convergence criterion. In: IEEE Congress on Evolutionary Computation, 2006 (CEC 2006), pp. 892–899. IEEE (2006)

  68. Zitzler, E.: Evolutionary Algorithms for Multiobjective Optimization: Methods and Applications. ETH Zurich, Zurich (1999)

  69. Jiang, S., Ong, Y.-S., Zhang, J., Feng, L.: Consistencies and contradictions of performance metrics in multiobjective optimization. IEEE Trans. Cybern. 44(12), 2391–2404 (2014)

  70. Fonseca, C.M., Fleming, P.J.: On the performance assessment and comparison of stochastic multiobjective optimizers. In: International Conference on Parallel Problem Solving from Nature, pp. 584–593. Springer (1996)

  71. Knowles, J.: A summary-attainment-surface plotting method for visualizing the performance of stochastic multiobjective optimizers. In: 5th International Conference on Intelligent Systems Design and Applications, 2005, pp. 552–557. IEEE (2005)

Acknowledgements

The research leading to these results has received funding from the European Research Council under the European Union’s H2020 Programme Grant Agreement No. 636820. Artur M. Schweidtmann thanks the Ernest-Solvay-Foundation and the ERASMUS+ program for a scholarship for his exchange at the University of Cambridge.

Author information

Corresponding author

Correspondence to Eric Bradford.

About this article

Cite this article

Bradford, E., Schweidtmann, A.M. & Lapkin, A. Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm. J Glob Optim 71, 407–438 (2018). https://doi.org/10.1007/s10898-018-0609-2
