Convergence Rate Bounds for the Mirror Descent Method: IQCs, Popov Criterion and Bregman Divergence (2304.03886v4)

Published 8 Apr 2023 in math.OC

Abstract: This paper presents a comprehensive convergence analysis for the mirror descent (MD) method, a widely used algorithm in convex optimization. The key feature of this algorithm is that it provides a generalization of classical gradient-based methods via the use of generalized distance-like functions, which are formulated using the Bregman divergence. Establishing convergence rate bounds for this algorithm is in general a non-trivial problem due to the lack of monotonicity properties in the composite nonlinearities involved. In this paper, we show that the Bregman divergence from the optimal solution, which is commonly used as a Lyapunov function for this algorithm, is a special case of Lyapunov functions that follow when the Popov criterion is applied to an appropriate reformulation of the MD dynamics. This is then used as a basis to construct an integral quadratic constraint (IQC) framework through which convergence rate bounds with reduced conservatism can be deduced. We also illustrate via examples that the convergence rate bounds derived can be tight.
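
For concreteness, the update rule and divergence at the heart of the abstract can be written out. The following is a standard textbook formulation, not taken verbatim from the paper; the symbols $\psi$ (mirror map), $\eta$ (step size), $f$ (objective), $\mathcal{X}$ (feasible set), and $x^\star$ (optimizer) are notational assumptions:

```latex
% Bregman divergence induced by a differentiable, strictly convex mirror map \psi
\[
  D_\psi(x, y) \;=\; \psi(x) - \psi(y) - \langle \nabla \psi(y),\, x - y \rangle
\]

% Mirror descent update for minimizing a convex f over \mathcal{X},
% with step size \eta > 0
\[
  x_{k+1} \;=\; \operatorname*{arg\,min}_{x \in \mathcal{X}}
    \left\{ \eta \, \langle \nabla f(x_k),\, x \rangle + D_\psi(x, x_k) \right\}
\]

% Candidate Lyapunov function: the Bregman divergence from the optimizer,
% as referenced in the abstract
\[
  V_k \;=\; D_\psi(x^\star, x_k)
\]
```

Taking $\psi(x) = \tfrac{1}{2}\|x\|_2^2$ makes $D_\psi$ the squared Euclidean distance and recovers projected gradient descent, while the negative entropy on the probability simplex yields the exponentiated gradient method; this is the sense in which MD generalizes classical gradient-based schemes.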
