The maximum principle for discrete-time control systems and applications to dynamic games

https://doi.org/10.1016/j.jmaa.2019.02.038

Abstract

We study deterministic nonstationary discrete-time optimal control problems over both finite and infinite horizons. Using Gâteaux differentials as a natural setting for first-order conditions, we prove a discrete-time maximum principle analogous to the well-known continuous-time maximum principle. We show that this maximum principle, together with a transversality condition, is a necessary condition for optimality, and that it is also sufficient under additional hypotheses. We then apply the discrete-time maximum principle to derive the discrete-time Euler equation and to characterize Nash equilibria of discrete-time dynamic games. A schematic version of such a result is sketched below.
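For orientation only, here is a schematic statement of a discrete-time maximum principle of this general type; the notation ($f_t$, $g_t$, $H_t$, $p_t$) is illustrative and need not match the formulation or hypotheses used in the paper. For the finite-horizon problem of maximizing $\sum_{t=0}^{N-1} f_t(x_t, u_t)$ subject to the dynamics $x_{t+1} = g_t(x_t, u_t)$, define the Hamiltonian
\[
H_t(x, u, p) \;=\; f_t(x, u) + p \cdot g_t(x, u).
\]
A standard discrete-time maximum principle then asserts that an optimal pair $(x_t^{*}, u_t^{*})$ admits adjoint vectors $p_t$ satisfying
\[
p_t \;=\; \nabla_x H_t\bigl(x_t^{*}, u_t^{*}, p_{t+1}\bigr),
\qquad
u_t^{*} \in \arg\max_{u} \, H_t\bigl(x_t^{*}, u, p_{t+1}\bigr),
\]
together with a transversality condition such as $p_N = 0$ when there is no terminal payoff. The infinite-horizon case typically replaces this terminal condition with an asymptotic transversality condition on $p_t$.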

Keywords

Maximum principle
Pontryagin principle
Discrete-time
Control system
Optimal control
