Abstract
The Low-Memory Matrix Adaptation Evolution Strategy (LMMA-ES) is a recent variant of CMA-ES specifically designed for large-scale numerical optimization. In this paper, we investigate if and how gradient information can be included in this algorithm in order to enhance its performance. Furthermore, we consider the incorporation of Lévy flight, both to alleviate stability issues due to possibly unreliable gradient estimation and to promote better exploration. In total, we propose four new variants of LMMA-ES, making use of real or estimated gradients, with and without Lévy flight. We test the proposed variants on two neural network training tasks, one for image classification through the recently introduced Forward-Forward paradigm and one for a Reinforcement Learning problem, as well as on five benchmark functions for numerical optimization.
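The paper itself defines how Lévy flight is injected into LMMA-ES; as a generic illustration of the heavy-tailed steps involved, a Lévy-distributed perturbation is commonly drawn with Mantegna's algorithm. The sketch below is not the paper's exact update rule, and the function name and default stability index `beta=1.5` are assumptions:

```python
import numpy as np
from math import gamma, sin, pi

def levy_flight(dim, beta=1.5, rng=None):
    """Draw one heavy-tailed step via Mantegna's algorithm (illustrative).

    Large steps occur with non-negligible probability, which is what makes
    Levy flight attractive for escaping poor regions of the search space.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Scale of the numerator Gaussian, chosen so that u / |v|^(1/beta)
    # approximates a symmetric Levy-stable distribution with index beta.
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)  # scaled Gaussian numerator
    v = rng.normal(0.0, 1.0, dim)    # standard Gaussian denominator
    return u / np.abs(v) ** (1 / beta)
```

In an ES, such a vector would typically be added (suitably scaled) to a candidate solution to occasionally produce long exploratory jumps alongside the usual Gaussian mutations.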
Notes
- 1.
- 2. Note that this method can also be useful with real gradients: in certain circumstances, even real gradients may not help in finding better points in the search space.
- 3.
- 4.
- 5.
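The note above observes that a gradient, whether exact or estimated, may mislead the search. One generic way an ES mutation could be biased by a gradient, which is a sketch under assumptions and not the paper's actual variant (the function name and the mixing weight `alpha` are illustrative), is:

```python
import numpy as np

def gradient_biased_sample(x, sigma, grad, alpha=0.5, rng=None):
    """Sample an offspring around x, nudged along the negative gradient.

    An isotropic Gaussian mutation is mixed with the normalized descent
    direction; alpha controls how strongly the gradient biases the sample.
    With alpha = 0 this reduces to a plain Gaussian mutation, which is the
    safe fallback when the gradient is unreliable.
    """
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal(x.shape)            # random exploration direction
    d = -grad / (np.linalg.norm(grad) + 1e-12)  # normalized descent direction
    return x + sigma * (z + alpha * d)
```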
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Lunelli, R., Iacca, G. (2024). Low-Memory Matrix Adaptation Evolution Strategies Exploiting Gradient Information and Lévy Flight. In: Smith, S., Correia, J., Cintrano, C. (eds) Applications of Evolutionary Computation. EvoApplications 2024. Lecture Notes in Computer Science, vol 14634. Springer, Cham. https://doi.org/10.1007/978-3-031-56852-7_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-56851-0
Online ISBN: 978-3-031-56852-7
eBook Packages: Computer Science (R0)