Abstract
The minimization of a function with a Lipschitz continuous gradient over a proximally smooth subset of a finite-dimensional Euclidean space is considered. It is shown that, under the restricted secant inequality, the gradient projection method applied to this problem converges linearly. In certain cases, linear convergence of the gradient projection method is proved for the real Stiefel and Grassmann manifolds.
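The method in question iterates a Euclidean gradient step followed by a metric projection back onto the constraint set. As an illustration only (not the paper's setting or proofs), the sketch below runs this scheme on the Stiefel manifold, using the standard fact that the nearest matrix with orthonormal columns to a full-rank matrix is its polar factor, computable from a thin SVD. The test objective trace(XᵀMX) and all names here are hypothetical choices for the example.

```python
import numpy as np

def proj_stiefel(Y):
    # Metric projection onto the Stiefel manifold St(n, p): for a
    # full-rank Y, the nearest matrix with orthonormal columns is
    # the polar factor U @ Vt from the thin SVD of Y.
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def gradient_projection(M, p, step=0.05, iters=3000, seed=0):
    # Minimize trace(X^T M X) over St(n, p) by projected gradient
    # descent; for symmetric M its minimum is the sum of the p
    # smallest eigenvalues of M. (A toy objective for illustration.)
    n = M.shape[0]
    rng = np.random.default_rng(seed)
    X = proj_stiefel(rng.standard_normal((n, p)))
    for _ in range(iters):
        grad = 2.0 * M @ X            # Euclidean gradient of trace(X^T M X)
        X = proj_stiefel(X - step * grad)
    return X
```

The fixed step size must be small relative to the gradient's Lipschitz constant (here 2‖M‖) for the iteration to converge; on this smooth quadratic objective the iterates settle into the invariant subspace of the p smallest eigenvalues.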
Funding
This work was supported by the Russian Science Foundation, project no. 16-11-10015.
Additional information
Translated by I. Ruzanova
Cite this article
Balashov, M.V. Gradient Projection Method on Matrix Manifolds. Comput. Math. and Math. Phys. 60, 1403–1411 (2020). https://doi.org/10.1134/S0965542520090079