Part of the book series: SpringerBriefs in Optimization (BRIEFSOPTI)

Abstract

This chapter presents numerical results obtained with an implementation of the DEEPS algorithm, together with the evolution of the minimized function values along the iterations of the optimization process. It is shown that the optimization process has two phases. In the first, the reduction phase, the function values are strongly reduced within a small number of iterations. In the second, the stalling phase, the function values decrease very slowly over a large number of iterations. For some problems, in the first phase the function values decrease steadily, with plateaus corresponding to the halving of the bounds of the domains in which the trial points are generated. This is the typical evolution of the function values along the iterations of DEEPS. Intensive numerical experiments on 140 unconstrained optimization problems, 16 of which are real applications, show that DEEPS is able to solve a large variety of problems with up to 500 variables. Comparisons with the Nelder-Mead algorithm show that DEEPS is more efficient and more robust.
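
The two-phase behavior described above (rapid reduction followed by stalling, with plateaus at each halving of the trial-point domain) can be illustrated with a minimal two-level random search. The Python sketch below is an assumption-based illustration only: the level sizes, the factor-of-two halving rule, the wider second level, and the stopping tolerance are hypothetical choices made for the example, not the actual DEEPS parameters from the book.

```python
# Minimal sketch of a two-level random search with domain halving,
# loosely in the spirit of DEEPS as described above. All parameter
# choices here are illustrative assumptions, not the book's settings.
import numpy as np

def two_level_random_search(f, x0, radius=1.0, n_local=20, n_deep=5,
                            max_iter=1000, min_radius=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    x_best = np.asarray(x0, float)
    f_best = f(x_best)
    history = [f_best]                      # best value per iteration
    for _ in range(max_iter):
        # Level 1: trial points drawn uniformly in a box around x_best.
        trials = x_best + radius * rng.uniform(-1, 1, (n_local, x_best.size))
        # Level 2: a few points from a wider box, to help escape plateaus.
        deep = x_best + 4 * radius * rng.uniform(-1, 1, (n_deep, x_best.size))
        candidates = np.vstack([trials, deep])
        values = np.apply_along_axis(f, 1, candidates)
        i = values.argmin()
        if values[i] < f_best:
            x_best, f_best = candidates[i], values[i]
        else:
            radius *= 0.5                   # halving produces the plateaus
            if radius < min_radius:
                break
        history.append(f_best)
    return x_best, f_best, history

# Example: the Rosenbrock function in two variables.
rosen = lambda x: 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
x, fx, hist = two_level_random_search(rosen, [-1.2, 1.0])
print(fx, len(hist))
```

Plotting `hist` against the iteration counter reproduces the qualitative picture from the chapter: a steep initial drop, then long flat stretches punctuated by small improvements. For a rough head-to-head in the spirit of the chapter's Nelder-Mead comparisons, the same starting point can be passed to SciPy's implementation, e.g. `scipy.optimize.minimize(rosen, [-1.2, 1.0], method="Nelder-Mead")`, and the best function values compared.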

Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Andrei, N. (2021). Numerical Results. In: A Derivative-free Two Level Random Search Method for Unconstrained Optimization. SpringerBriefs in Optimization. Springer, Cham. https://doi.org/10.1007/978-3-030-68517-1_4
