
Gradient Projection Method on Matrix Manifolds

Computational Mathematics and Mathematical Physics

Abstract

The minimization of a function with a Lipschitz continuous gradient over a proximally smooth subset of a finite-dimensional Euclidean space is considered. It is proved that, under the restricted secant inequality, the gradient projection method applied to this problem converges linearly. In certain cases, linear convergence of the gradient projection method is also proved on the real Stiefel and Grassmann manifolds.
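The method in question is the classical gradient projection iteration x_{k+1} = P_S(x_k - t ∇f(x_k)), where P_S is the metric projection onto the constraint set S. Below is a minimal illustrative sketch on the Stiefel manifold St(n, k) of matrices with orthonormal columns; it is not the paper's algorithm verbatim. The objective f, the step size, and the problem sizes are arbitrary choices for demonstration. The one standard fact used is that the metric projection of a full-rank matrix onto St(n, k) is its polar factor U Vᵀ, obtained from a thin SVD.

    # Sketch of the gradient projection method on the Stiefel manifold
    # St(n, k) = { X in R^{n x k} : X^T X = I }.
    # The objective and step size are illustrative assumptions, not the
    # specific setting analyzed in the paper.
    import numpy as np

    def project_stiefel(X):
        """Metric projection onto St(n, k): the nearest matrix with
        orthonormal columns is U @ Vt, where X = U S Vt (thin SVD)."""
        U, _, Vt = np.linalg.svd(X, full_matrices=False)
        return U @ Vt

    def gradient_projection(grad_f, X0, step, n_iter=500):
        """Euclidean gradient step followed by projection onto the set."""
        X = project_stiefel(X0)
        for _ in range(n_iter):
            X = project_stiefel(X - step * grad_f(X))
        return X

    # Example: minimize f(X) = -trace(X^T A X) over St(n, k) for a
    # symmetric A; minimizers span the top-k eigenvectors of A.
    rng = np.random.default_rng(0)
    n, k = 10, 3
    A = rng.standard_normal((n, n))
    A = (A + A.T) / 2
    grad_f = lambda X: -2 * A @ X        # Euclidean gradient of f
    X = gradient_projection(grad_f, rng.standard_normal((n, k)), step=0.05)
    print(np.trace(X.T @ A @ X))         # should approach the sum of
                                         # the top-k eigenvalues of A

Note that the step size must be taken small relative to the Lipschitz constant of the gradient (here 2‖A‖); the linear convergence rate established in the paper depends on this constant together with the restricted secant inequality parameter and the proximal smoothness constant of the set.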



Funding

This work was supported by the Russian Science Foundation, project no. 16-11-10015.

Author information

Correspondence to M. V. Balashov.

Additional information

Translated by I. Ruzanova


About this article


Cite this article

Balashov, M.V. Gradient Projection Method on Matrix Manifolds. Comput. Math. and Math. Phys. 60, 1403–1411 (2020). https://doi.org/10.1134/S0965542520090079

