Abstract
This chapter presents numerical results obtained with an implementation of the DEEPS algorithm, together with the evolution of the minimized function values along the iterations of the optimization process. The optimization process is shown to have two phases. In the first, the reduction phase, the function values are strongly reduced within a small number of iterations. In the second, the stalling phase, the function values decrease very slowly over a large number of iterations. For some problems, in the first phase the function values decrease steadily, with plateaus corresponding to the halving of the bounds of the domains in which the trial points are generated. This is the typical evolution of the function values along the iterations of DEEPS. Intensive numerical experiments on 140 unconstrained optimization problems, of which 16 are real applications, show that DEEPS is able to solve a large variety of problems with up to 500 variables. Comparisons with the Nelder-Mead algorithm show that DEEPS is more efficient and more robust.
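The plateau behavior described above can be illustrated with a generic random search that halves its sampling bounds when no trial point improves the current best value. This is only a minimal sketch under that assumption, not the DEEPS algorithm itself; the function name, parameters, and halving rule here are hypothetical choices for illustration.

```python
import random

def random_search_halving(f, x0, bound=1.0, n_trials=30, n_iter=200, tol=1e-8):
    """Minimal sketch of a random search with bound halving (NOT DEEPS).

    At each iteration, n_trials points are drawn uniformly in a box of
    half-width `bound` around the current best point. If none of them
    improves the best function value, the bound is halved; between
    halvings the best value tends to stay flat, which produces the
    plateaus described in the text.
    """
    best_x, best_f = list(x0), f(x0)
    for _ in range(n_iter):
        improved = False
        for _ in range(n_trials):
            trial = [xi + random.uniform(-bound, bound) for xi in best_x]
            ft = f(trial)
            if ft < best_f:
                best_x, best_f = trial, ft
                improved = True
        if not improved:
            bound *= 0.5  # no improvement: shrink the sampling box
            if bound < tol:
                break
    return best_x, best_f

# Usage: minimize a simple convex quadratic starting from (2, -3)
random.seed(0)
x, fx = random_search_halving(lambda v: sum(t * t for t in v), [2.0, -3.0])
```

Recording `best_f` at every iteration of such a scheme reproduces the staircase-like curve of steady reductions separated by plateaus at each bound halving.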
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Andrei, N. (2021). Numerical Results. In: A Derivative-free Two Level Random Search Method for Unconstrained Optimization. SpringerBriefs in Optimization. Springer, Cham. https://doi.org/10.1007/978-3-030-68517-1_4
DOI: https://doi.org/10.1007/978-3-030-68517-1_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-68516-4
Online ISBN: 978-3-030-68517-1
eBook Packages: Mathematics and Statistics (R0)