
Global Convergence Technique for the Newton Method with Periodic Hessian Evaluation

Published in: Journal of Optimization Theory and Applications

Abstract

The problem of globalizing the Newton method when the actual Hessian matrix is not evaluated at every iteration is considered. A stabilization technique is studied that employs a new line search strategy to ensure global convergence under mild assumptions. Moreover, an implementable algorithmic scheme is proposed in which the evaluation of the second derivatives is conditioned on the behavior of the algorithm during the minimization process and on the local convexity properties of the objective function. This yields a significant computational saving while keeping the unavoidable degradation in convergence speed acceptable. The numerical results reported indicate that the method described may be employed advantageously in all applications where the computation of the Hessian matrix is highly time consuming.
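The idea the abstract describes can be illustrated by a minimal sketch: a Newton-type iteration that re-evaluates the Hessian only every m iterations and relies on a backtracking (Armijo) line search, with a steepest-descent fallback, to retain global convergence. This is an illustrative simplification, not the authors' exact algorithm: the paper's scheme decides adaptively when to refresh the Hessian, whereas the sketch below uses a fixed period m, and the function names and parameters are hypothetical.

```python
import numpy as np

def newton_periodic_hessian(f, grad, hess, x0, m=5, max_iter=300,
                            tol=1e-8, c1=1e-4, beta=0.5):
    """Newton-type method that evaluates the Hessian only every m
    iterations, globalized by an Armijo backtracking line search.
    Illustrative sketch only; the paper's scheme refreshes the
    Hessian adaptively rather than on a fixed schedule."""
    x = np.asarray(x0, dtype=float)
    H = None
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        if k % m == 0:                      # periodic Hessian evaluation
            H = hess(x)
        # Solve H d = -g; if the (possibly stale) Hessian is singular
        # or does not yield a descent direction, fall back to -g.
        try:
            d = np.linalg.solve(H, -g)
        except np.linalg.LinAlgError:
            d = -g
        if g @ d >= 0.0:
            d = -g
        # Armijo backtracking line search ensures sufficient decrease.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + c1 * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Rosenbrock test problem with analytic gradient and Hessian.
def f(x):
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0]**2)])

def hess(x):
    return np.array([[1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
                     [-400.0 * x[0], 200.0]])

x_star = newton_periodic_hessian(f, grad, hess, np.array([-1.2, 1.0]))
```

With m = 1 this reduces to a globalized Newton method; larger m trades convergence speed for fewer Hessian evaluations, which is the saving the paper quantifies for applications where second derivatives are expensive.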





Cite this article

Lampariello, F., Sciandrone, M. Global Convergence Technique for the Newton Method with Periodic Hessian Evaluation. Journal of Optimization Theory and Applications 111, 341–358 (2001). https://doi.org/10.1023/A:1011934418390
