The Modified MSA, a Gradient Flow and Convergence (2212.05784v2)

Published 12 Dec 2022 in math.OC and math.PR

Abstract: The modified Method of Successive Approximations (MSA) is an iterative scheme, based on the Pontryagin Optimality Principle, for approximating solutions to stochastic control problems in continuous time: starting from an initial open-loop control, each iteration solves the forward equation and the backward adjoint equation, and then performs a static minimization step. We observe that this is an implicit Euler scheme for a gradient flow system. We prove that appropriate interpolations of the iterates of the modified MSA converge to a gradient flow with rate $\tau$. We then study the convergence of this gradient flow as time goes to infinity. In the general (non-convex) case we prove that the gradient term itself converges to zero; this is a consequence of an energy identity showing that the optimization objective decreases along the gradient flow. Moreover, in the convex case, when the Pontryagin Optimality Principle provides a sufficient condition for optimality, we prove that the optimization objective converges to its optimal value at rate $\tfrac{1}{S}$, and at an exponential rate under strong convexity. The main technical difficulties lie in obtaining appropriate properties of the Hamiltonian (growth, continuity). These are obtained by utilising the theory of Bounded Mean Oscillation (BMO) martingales, which is required for estimates on the adjoint Backward Stochastic Differential Equation (BSDE).
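
The iteration described in the abstract can be made concrete. Schematically, the energy identity mentioned above has the familiar gradient-flow shape $\frac{d}{ds} J(u_s) = -\|\nabla J(u_s)\|^2 \le 0$, which is what forces the gradient term to vanish in the limit (the paper's precise statement is in terms of the Hamiltonian; this display only shows the generic form). Below is a minimal Python/NumPy sketch of one plausible reading of the modified MSA loop, applied to a hypothetical scalar linear-quadratic test problem that is not taken from the paper: the dynamics, cost, penalisation weight rho, iteration count, and the crude pathwise backward Euler step for the adjoint BSDE are all illustrative assumptions.

```python
import numpy as np

# Hypothetical scalar LQ test problem (illustrative, not from the paper):
#   dX = (a X + b u) dt + sigma dW,   J(u) = E[ int_0^T (X^2 + u^2) dt + X_T^2 ]
a, b, sigma = -0.5, 1.0, 0.3
T, N, M = 1.0, 50, 2000        # horizon, time steps, Monte Carlo paths
dt = T / N
rho = 5.0                      # penalisation weight in the modified minimization step
rng = np.random.default_rng(0)
dW = rng.normal(0.0, np.sqrt(dt), size=(M, N))

u = np.zeros((M, N))           # initial open-loop control

for it in range(20):
    # 1) forward equation: Euler-Maruyama for the controlled state
    X = np.zeros((M, N + 1))
    for k in range(N):
        X[:, k + 1] = X[:, k] + (a * X[:, k] + b * u[:, k]) * dt + sigma * dW[:, k]

    # 2) backward adjoint equation: dp = -(a p + 2 X) dt + Z dW, p_T = 2 X_T.
    #    A pathwise backward Euler step (skipping the conditional expectation
    #    a proper BSDE solver would take) is a crude but simple stand-in.
    p = np.zeros((M, N + 1))
    p[:, N] = 2.0 * X[:, N]
    for k in range(N - 1, -1, -1):
        p[:, k] = p[:, k + 1] + (a * p[:, k + 1] + 2.0 * X[:, k]) * dt

    # objective estimate for the current control (before the update)
    J = np.mean(np.sum((X[:, :N] ** 2 + u ** 2) * dt, axis=1) + X[:, N] ** 2)
    print(f"iteration {it:2d}: J = {J:.5f}")

    # 3) static minimization, augmented in the "modified MSA" style:
    #    argmin_v [ (a X + b v) p + X^2 + v^2 + rho (v - u)^2 ]
    #    is explicit because the augmented Hamiltonian is quadratic in v.
    u = (2.0 * rho * u - b * p[:, :N]) / (2.0 + 2.0 * rho)
```

With these choices the augmented Hamiltonian is quadratic in the control, so step 3 has a closed form; in general it is a pointwise minimization over the control set, and the penalisation weight plays roughly the role of an inverse step size for the implicit Euler scheme mentioned in the abstract.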

Citations (1)
