Abstract
A new accelerated gradient method for finding the minimum of a function f(x) whose variables are unconstrained is investigated. The new algorithm can be stated as follows:

δx = −α g(x) + Σ_{i=1}^{k} β_i δx_i

where x is an n-vector, g(x) is the gradient of the function f(x), δx is the change in the position vector for the iteration under consideration, and δx_i is the change in the position vector for the ith previous iteration. The quantities α and β_i are scalars chosen at each step so as to yield the greatest decrease in the function; the scalar k denotes the number of past iterations remembered.
For k = 1, the algorithm reduces to the memory gradient method of Ref. 2; it contains at most two undetermined multipliers to be optimized by a two-dimensional search. For k = n − 1, the algorithm contains at most n undetermined multipliers to be optimized by an n-dimensional search.
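As a rough illustration only (not the authors' implementation), the step rule above can be sketched in Python. A crude grid search over α and the β_i stands in for the (k+1)-dimensional optimization performed at each step in the paper, and the function names here are hypothetical.

```python
import numpy as np
from itertools import product

def supermemory_gradient(f, grad, x0, k=2, iters=25):
    # One step:  dx = -alpha * g(x) + sum_i beta_i * dx_i,
    # where dx_i is the step taken i iterations ago (up to k remembered).
    # alpha and the beta_i are chosen by a coarse grid search here; the
    # paper instead solves a (k+1)-dimensional minimization at each step.
    x = np.asarray(x0, dtype=float)
    memory = []                            # most recent past step first
    alphas = np.linspace(0.0, 1.0, 21)
    betas = np.linspace(-0.5, 0.5, 11)
    for _ in range(iters):
        g = grad(x)
        best_dx, best_val = None, f(x)
        for a in alphas:
            for bs in product(betas, repeat=len(memory)):
                dx = -a * g + sum(b * m for b, m in zip(bs, memory))
                val = f(x + dx)
                if val < best_val:
                    best_dx, best_val = dx, val
        if best_dx is None:                # no improving step on the grid
            break
        x = x + best_dx
        memory = [best_dx] + memory[: k - 1]
    return x

# Example: minimize the quadratic f(x) = |x|^2 starting from (3, -2).
f = lambda x: float(np.dot(x, x))
grad = lambda x: 2.0 * x
x_min = supermemory_gradient(f, grad, np.array([3.0, -2.0]))
```

On this quadratic the grid contains α = 0.5, so the very first step −0.5 g(x) = −x lands exactly on the minimizer; on harder (nonquadratic) functions the coarse grid would need to be replaced by a genuine multidimensional line search, as in the paper.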
Two nonquadratic test problems are considered. For both problems, the memory gradient method and the supermemory gradient method are compared with the Fletcher-Reeves method and the Fletcher-Powell-Davidon method. A comparison with quasilinearization is also presented.
References
Cragg, E. E., and Levy, A. V., Gradient Methods in Mathematical Programming, Part 3, Supermemory Gradient Method, Rice University, Aero-Astronautics Report No. 58, 1969.
Miele, A., and Cantrell, J. W., Study on a Memory Gradient Method for the Minimization of Functions, Journal of Optimization Theory and Applications, Vol. 3, No. 6, 1969.
Fletcher, R., and Reeves, C. M., Function Minimization by Conjugate Gradients, Computer Journal, Vol. 7, No. 2, 1964.
Myers, G. E., Properties of the Conjugate Gradient and Davidon Methods, Journal of Optimization Theory and Applications, Vol. 2, No. 4, 1968.
Pearson, J. D., On Variable Metric Methods of Minimization, Research Analysis Corporation, Technical Paper No. RAC-TP-302, 1968.
Miele, A., Huang, H. Y., and Cantrell, J. W., Gradient Methods in Mathematical Programming, Part 1, Review of Previous Techniques, Rice University, Aero-Astronautics Report No. 55, 1969.
Additional information
Communicated by A. Miele
This research, supported by the Office of Scientific Research, Office of Aerospace Research, United States Air Force, Grant No. AF-AFOSR-828-67, is a condensation of the investigation described in Ref. 1. The authors are indebted to Professor Angelo Miele for stimulating discussions.
Cite this article
Cragg, E.E., Levy, A.V. Study on a supermemory gradient method for the minimization of functions. J Optim Theory Appl 4, 191–205 (1969). https://doi.org/10.1007/BF00930579