The Modified MSA, a Gradient Flow and Convergence

Published 12 Dec 2022 in math.OC and math.PR (arXiv:2212.05784v2)

Abstract: The modified Method of Successive Approximations (MSA) is an iterative scheme for approximating solutions to stochastic control problems in continuous time, based on the Pontryagin Optimality Principle. Starting from an initial open-loop control, each iteration solves the forward equation, then the backward adjoint equation, and finally performs a static minimization step. We observe that this is an implicit Euler scheme for a gradient flow system. We prove that appropriate interpolations of the iterates of the modified MSA converge to a gradient flow with rate $\tau$. We then study the convergence of this gradient flow as time goes to infinity. In the general (non-convex) case we prove that the gradient term itself converges to zero; this is a consequence of an energy identity which shows that the optimization objective decreases along the gradient flow. Moreover, in the convex case, when the Pontryagin Optimality Principle provides a sufficient condition for optimality, we prove that the optimization objective converges to its optimal value at rate $\tfrac{1}{S}$, and at an exponential rate under strong convexity. The main technical difficulties lie in obtaining appropriate properties of the Hamiltonian (growth, continuity). These are obtained by utilising the theory of Bounded Mean Oscillation (BMO) martingales, which is required for estimates on the adjoint Backward Stochastic Differential Equation (BSDE).
