REFERENCES
A. Arapostathis, V. S. Borkar, E. Fernandez-Gaucherand, M. K. Ghosh, and S. I. Marcus, "Discrete-time controlled Markov processes with average cost criterion: A survey," SIAM J. Contr. Optim., 31, 282-344 (1993).
D. P. Bertsekas, Dynamic Programming: Deterministic and Stochastic Models, Prentice-Hall, Englewood Cliffs, New Jersey (1987).
E. V. Denardo and B. Fox, "Multichain Markov renewal programs," SIAM J. Appl. Math., 16, 468-487 (1968).
E. B. Dynkin and A. A. Yushkevich, Controlled Markov Processes, Springer, New York (1979).
E. A. Feinberg, "The existence of a stationary ε-optimal policy for a finite Markov chain," Teor. Veroyatn. Primen., 23, 297-313 (1978).
B. Fox, "Existence of stationary optimal policies for some Markov renewal programs," SIAM Rev., 9, 573-576 (1967).
A. Y. Golubin, "A note on the convergence of policy iteration in Markov decision processes with compact action spaces," Math. Oper. Res., 28 (2003) (to appear).
A. Hordijk, Dynamic Programming and Markov Potential Theory, Math. Centre, Amsterdam (1974).
R. A. Howard, Dynamic Programming and Markov Processes, Wiley, New York-London (1960).
S. S. Lavenberg and M. Reiser, "Mean value analysis of closed multichain queueing networks," J. ACM, 27, 313-322 (1980).
S. Stidham Jr. and R. R. Weber, "Control of service rates in networks of queues," Adv. Appl. Probab., 19, 202-218 (1987).
D. H. Wagner, "Survey of measurable selection theorems," SIAM J. Contr. Optim., 15, 859-903 (1977).
H. Zijm, "The optimality equations in multichain denumerable Markov decision processes with average cost criterion: The bounded cost case," Statist. Decisions, 3, 143-165 (1985).
Golubin, A.Y. Nonstationary Policies and Average Optimality in Multichain Markov Decision Processes with a General Action Space. Journal of Mathematical Sciences 123, 3733–3740 (2004). https://doi.org/10.1023/B:JOTH.0000036314.29733.3d