Abstract
The problem of globalizing the Newton method when the actual Hessian matrix is not used at every iteration is considered. A stabilization technique is studied that employs a new line search strategy for ensuring global convergence under mild assumptions. Moreover, an implementable algorithmic scheme is proposed, where the evaluation of the second derivatives is conditioned on the behavior of the algorithm during the minimization process and on the local convexity properties of the objective function. This is done in order to obtain significant computational savings while keeping the unavoidable degradation in convergence speed acceptable. The numerical results reported indicate that the method described may be employed advantageously in applications where the computation of the Hessian matrix is highly time consuming.
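The idea of reusing the Hessian for several iterations while safeguarding descent with a line search can be illustrated by the following minimal sketch. It is a generic Shamanskii-type scheme with a fixed reevaluation period `m` and an Armijo backtracking line search, not the authors' exact algorithm (in particular, their Hessian evaluations are triggered adaptively rather than periodically, and their line search strategy is new); all function and parameter names here are illustrative.

```python
import numpy as np

def newton_periodic_hessian(f, grad, hess, x0, m=5, tol=1e-8,
                            max_iter=200, c1=1e-4, rho=0.5):
    """Newton-type minimization reusing the Hessian for m iterations,
    globalized by an Armijo backtracking line search (generic sketch)."""
    x = x0.astype(float)
    H = None
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        if k % m == 0:
            H = hess(x)  # the expensive second-derivative evaluation
        # Solve H d = -g; fall back to steepest descent if the stale
        # Hessian is singular or yields a non-descent direction.
        try:
            d = np.linalg.solve(H, -g)
            if g @ d >= 0:
                d = -g
        except np.linalg.LinAlgError:
            d = -g
        # Armijo backtracking: accept a with f(x + a d) <= f(x) + c1 a g'd.
        a, fx = 1.0, f(x)
        while f(x + a * d) > fx + c1 * a * (g @ d):
            a *= rho
        x = x + a * d
    return x
```

The trade-off the abstract describes is visible here: a larger `m` saves Hessian evaluations at the cost of slower local convergence, while the line search preserves global convergence despite the stale curvature information.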
Lampariello, F., Sciandrone, M. Global Convergence Technique for the Newton Method with Periodic Hessian Evaluation. Journal of Optimization Theory and Applications 111, 341–358 (2001). https://doi.org/10.1023/A:1011934418390