
Transfer of Multi-objectively Tuned CMA-ES Parameters to a Vehicle Dynamics Problem

  • Conference paper
  • In: Evolutionary Multi-Criterion Optimization (EMO 2023)

Abstract

The trade-off between computational budget and the quality of the solutions found is crucial when dealing with expensive black-box optimization problems from industry. We show that, through multi-objective parameter tuning of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) on benchmark functions, different optimal algorithm configurations can be found for specific computational budgets and solution qualities. From the resulting Pareto front, tuned parameter sets are selected and transferred to a real-world optimization problem from vehicle dynamics, improving the solution quality and reducing the budget required. The benchmark functions used for tuning are selected based on their similarity to the real-world problem in terms of Exploratory Landscape Analysis features.
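
The tuning setup can be illustrated with a minimal sketch. This is not the authors' pipeline (which tunes the modular CMA-ES on selected BBOB functions); instead, it tunes two standard CMA-ES parameters (population size and initial step size) with Optuna against the two objectives of best fitness reached and evaluations spent, and collects the Pareto-optimal configurations. The sphere function, the parameter ranges, the problem dimension, and the evaluation limit below are illustrative assumptions.

    # Minimal sketch (illustrative assumptions, not the authors' exact setup):
    # multi-objective tuning of CMA-ES parameters with Optuna, trading off
    # solution quality against the evaluation budget actually consumed.
    import cma  # pycma
    import numpy as np
    import optuna


    def sphere(x):
        # Stand-in benchmark function; the paper uses BBOB functions selected
        # via Exploratory Landscape Analysis similarity.
        return float(np.sum(np.asarray(x) ** 2))


    def objective(trial):
        # Two tuned CMA-ES parameters; the ranges are assumptions.
        popsize = trial.suggest_int("popsize", 4, 64)
        sigma0 = trial.suggest_float("sigma0", 0.1, 2.0)
        es = cma.CMAEvolutionStrategy(
            5 * [1.0], sigma0,
            {"popsize": popsize, "maxfevals": 2000, "verbose": -9, "seed": 1},
        )
        es.optimize(sphere)
        # Objective 1: best fitness found; objective 2: evaluations spent.
        return float(es.result.fbest), float(es.result.evaluations)


    study = optuna.create_study(directions=["minimize", "minimize"])
    study.optimize(objective, n_trials=50)

    # Pareto-optimal (quality, budget) trade-offs; a configuration matching
    # the available budget could then be transferred to the real-world problem.
    for t in study.best_trials:
        print(t.params, t.values)

From such a Pareto front, a practitioner would pick the configuration whose budget matches what the expensive vehicle dynamics simulation allows; that selection and transfer step is what the paper evaluates.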

Notes

  1. HP Workstation Z4 G4, Intel Xeon W-2125 (4.00 GHz/4.50 GHz, 8.25 MB cache, 4 cores), 32 GB DDR4-2666 ECC SDRAM.

Acknowledgements

This paper was written as part of the project newAIDE under the consortium leadership of BMW AG with the partners Altair Engineering GmbH, divis intelligent solutions GmbH, MSC Software GmbH, Technical University of Munich, and TWT GmbH. The project is supported by the Federal Ministry for Economic Affairs and Climate Action (BMWK) on the basis of a decision of the German Bundestag.

The authors would like to thank Jacob de Nobel and Diederick Vermetten for their support with the modular CMA-ES implementation.

Author information

Corresponding author

Correspondence to André Thomaser.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Thomaser, A., Vogt, M.E., Kononova, A.V., Bäck, T. (2023). Transfer of Multi-objectively Tuned CMA-ES Parameters to a Vehicle Dynamics Problem. In: Emmerich, M., et al. Evolutionary Multi-Criterion Optimization. EMO 2023. Lecture Notes in Computer Science, vol 13970. Springer, Cham. https://doi.org/10.1007/978-3-031-27250-9_39

  • DOI: https://doi.org/10.1007/978-3-031-27250-9_39

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-27249-3

  • Online ISBN: 978-3-031-27250-9

  • eBook Packages: Computer Science, Computer Science (R0)
