Exact convergence rate of the subgradient method by using Polyak step size (2407.15195v1)

Published 21 Jul 2024 in math.OC

Abstract: This paper studies the last iterate of the subgradient method with the Polyak step size when applied to the minimization of a nonsmooth convex function with bounded subgradients. We show that the subgradient method with the Polyak step size achieves a convergence rate $\mathcal{O}\left(\tfrac{1}{\sqrt[4]{N}}\right)$ in terms of the final iterate. An example is provided to show that this rate is exact and cannot be improved. We introduce an adaptive Polyak step size for which the subgradient method enjoys a convergence rate $\mathcal{O}\left(\tfrac{1}{\sqrt{N}}\right)$ for the last iterate. This rate exactly matches the lower bound on the performance of any black-box method on the considered problem class. Additionally, we propose an adaptive Polyak method with a momentum term, where the step sizes are independent of the number of iterations, and we establish that this algorithm also attains the optimal convergence rate. Finally, we investigate the alternating projection method and derive a convergence rate of $\left( \tfrac{2N}{2N+1} \right)^N \tfrac{R}{\sqrt{2N+1}}$ for the last iterate, where $R$ is a bound on the distance between the initial iterate and a solution. An example is also provided to illustrate the exactness of this rate.
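
The abstract's central object, the subgradient method with the Polyak step size, is simple to state: at each iteration, step against a subgradient $g_k$ with length $t_k = \tfrac{f(x_k) - f^\star}{\|g_k\|^2}$, which requires knowing the optimal value $f^\star$. The sketch below illustrates this classical rule on a toy nonsmooth problem; it is not the paper's adaptive or momentum variant, and the names (`polyak_subgradient`, `subgrad`) are illustrative choices, not taken from the paper.

```python
import numpy as np

def polyak_subgradient(f, subgrad, x0, f_star, num_iters=100):
    """Subgradient method with the classical Polyak step size.

    Minimal sketch: assumes the optimal value f_star is known,
    which the Polyak rule requires. Returns the last iterate,
    the quantity whose convergence rate the paper analyzes.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        g = subgrad(x)
        gnorm2 = float(np.dot(g, g))
        if gnorm2 == 0.0:
            break  # zero subgradient: x is already optimal
        # Polyak step: t_k = (f(x_k) - f*) / ||g_k||^2
        t = (f(x) - f_star) / gnorm2
        x = x - t * g
    return x

# Toy example: minimize the nonsmooth convex f(x) = ||x||_1, with f* = 0.
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)  # a valid subgradient of the l1 norm
last_iterate = polyak_subgradient(f, subgrad,
                                  x0=np.array([3.0, -2.0]), f_star=0.0)
```

Per the abstract, the last iterate of this plain scheme converges at rate $\mathcal{O}\left(\tfrac{1}{\sqrt[4]{N}}\right)$ and no faster in the worst case, while the paper's adaptive step sizes restore the optimal $\mathcal{O}\left(\tfrac{1}{\sqrt{N}}\right)$ rate.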
