
Algorithms of Inertial Mirror Descent in Convex Problems of Stochastic Optimization


Abstract

A minimization problem for the mathematical expectation of a convex loss function over a given convex compact set X ⊂ R^N is treated. It is assumed that an oracle sequentially returns stochastic subgradients of the loss function at the current points, with uniformly bounded second moments. The aim is to modify the well-known Mirror Descent method, proposed by A.S. Nemirovsky and D.B. Yudin in 1979 as an extension of the standard gradient method. First, the idea of the new, so-called Inertial Mirror Descent (IMD) method is demonstrated on a deterministic optimization problem in R^N with continuous time; in particular, in the Euclidean case the heavy-ball method is recovered, and it is noted that the new method requires no additional averaging of the iterates. A discrete IMD algorithm is then described, and an upper bound on the error in the objective function (i.e., on the difference between the current mean losses and their minimum) is proved.
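
For intuition, the continuous-time Euclidean case mentioned above corresponds to Polyak's classical heavy-ball dynamics x''(t) + a x'(t) + b ∇f(x(t)) = 0 [8, 9]. The following minimal sketch, not taken from the paper, combines a mirror step (here, the entropy mirror map over the probability simplex, a standard setting for mirror descent) with a heavy-ball-style inertial term; the simplex geometry, the momentum coefficient, and the step schedule are illustrative assumptions rather than the paper's IMD recursion.

    import numpy as np

    def entropic_md_with_inertia(stoch_subgrad, n, T, step=0.1, beta=0.9, seed=0):
        # Hedged sketch: mirror descent on the probability simplex with the
        # entropy mirror map, plus a heavy-ball-style momentum term. The
        # simplex geometry, the fixed coefficient beta, and the step/sqrt(t)
        # schedule are illustrative choices, not the paper's IMD recursion.
        #
        # stoch_subgrad(x, rng) must return a stochastic subgradient at x with
        # uniformly bounded second moment, matching the oracle model above.
        rng = np.random.default_rng(seed)
        x = np.full(n, 1.0 / n)          # start at the barycenter of the simplex
        z = np.zeros(n)                  # accumulated (dual) subgradient record
        momentum = np.zeros(n)
        for t in range(1, T + 1):
            g = stoch_subgrad(x, rng)
            momentum = beta * momentum + g     # inertial accumulation
            z += step / np.sqrt(t) * momentum
            w = np.exp(-(z - z.min()))         # stabilized entropic mirror step
            x = w / w.sum()                    # iterate stays on the simplex
        return x                          # last iterate; no extra averaging

    # Toy usage: minimize E[<c + noise, x>] over the simplex; the mass should
    # concentrate on the smallest coordinate of c.
    c = np.array([0.3, 0.1, 0.5, 0.2])
    oracle = lambda x, rng: c + 0.1 * rng.standard_normal(c.size)
    print(entropic_md_with_inertia(oracle, n=4, T=2000))

Note that the sketch returns the last iterate rather than a running average, echoing the remark above that IMD requires no additional averaging of the iterates.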


References

  1. Boyd, S. and Vandenberghe, L., Convex Optimization, Cambridge: Cambridge Univ. Press, 2004.

  2. Ljung, L., System Identification: Theory for the User, Upper Saddle River: Prentice Hall, 1999, 2nd ed.

  3. Hastie, T., Tibshirani, R., and Friedman, J., The Elements of Statistical Learning: Data Mining, Inference, and Prediction, New York: Springer, 2001.

  4. Nesterov, Yu., Introductory Lectures on Convex Optimization, Boston: Kluwer, 2004.

  5. Hazan, E., Introduction to Online Convex Optimization, Found. Trends Optim., 2016, vol. 2, nos. 3–4, pp. 157–325.

  6. Nemirovskii, A.S. and Yudin, D.B., Slozhnost' zadach i effektivnost' metodov optimizatsii, Moscow: Nauka, 1979. Translated under the title Problem Complexity and Method Efficiency in Optimization, Chichester: Wiley, 1983.

  7. Juditsky, A.B., Nazin, A.V., Tsybakov, A.B., and Vayatis, N., Recursive Aggregation of Estimators by the Mirror Descent Algorithm with Averaging, Probl. Inf. Transm., 2005, vol. 41, no. 4, pp. 368–384.

  8. Polyak, B.T., Some Methods of Speeding up the Convergence of Iteration Methods, Zh. Vychisl. Mat. Mat. Fiz., 1964, vol. 4, no. 5, pp. 791–803.

  9. Polyak, B.T., Introduction to Optimization, New York: Optimization Software, 1987.

  10. Rockafellar, R., Convex Analysis, Princeton: Princeton Univ. Press, 1970.

  11. Magaril-Il'yaev, G.G. and Tikhomirov, V.M., Vypuklyi analiz i ego prilozheniya (Convex Analysis and Its Applications), Moscow: Knizhnyi Dom "LIBROKOM," 2011.

  12. Beck, A. and Teboulle, M., Mirror Descent and Nonlinear Projected Subgradient Methods for Convex Optimization, Oper. Res. Lett., 2003, vol. 31, no. 3, pp. 167–175.

  13. Ben-Tal, A., Margalit, T., and Nemirovski, A., The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography, SIAM J. Optim., 2001, vol. 12, no. 1, pp. 79–108.

  14. Rockafellar, R.T. and Wets, R.J.B., Variational Analysis, New York: Springer, 1998.

  15. Ben-Tal, A. and Nemirovski, A.S., The Conjugate Barrier Mirror Descent Method for Non-Smooth Convex Optimization, MINERVA Optimization Center Report, Haifa: Faculty of Industrial Engineering and Management, Technion–Israel Institute of Technology, 1999.

  16. Su, W., Boyd, S., and Candès, E., A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights, J. Mach. Learn. Res., 2016, vol. 17, no. 153, pp. 1–43.

  17. Nesterov, Yu.E., One Class of Methods of Unconditional Minimization of a Convex Function Having a High Rate of Convergence, USSR Comput. Math. Math. Phys., 1984, vol. 24, no. 4, pp. 80–82.

  18. Nesterov, Yu. and Shikhman, V., Quasi-monotone Subgradient Methods for Nonsmooth Convex Minimization, J. Optim. Theory Appl., 2015, vol. 165, pp. 917–940.


Author information


Corresponding author

Correspondence to A. V. Nazin.

Additional information

Original Russian Text © A.V. Nazin, 2018, published in Avtomatika i Telemekhanika, 2018, No. 1, pp. 100–112.

This paper was recommended for publication by A.I. Kibzun, a member of the Editorial Board.


About this article


Cite this article

Nazin, A.V. Algorithms of Inertial Mirror Descent in Convex Problems of Stochastic Optimization. Autom Remote Control 79, 78–88 (2018). https://doi.org/10.1134/S0005117918010071

