Convergence analysis of approximate primal solutions in dual first-order methods (1502.06368v1)

Published 23 Feb 2015 in math.OC

Abstract: Dual first-order methods are powerful techniques for large-scale convex optimization. Although an extensive research effort has been devoted to studying their convergence properties, explicit convergence rates for the primal iterates have only been established under global Lipschitz continuity of the dual gradient. This is a rather restrictive assumption that does not hold for several important classes of problems. In this paper, we demonstrate that primal convergence rate guarantees can also be obtained when the dual gradient is only locally Lipschitz. The class of problems that we analyze admits general convex constraints including nonlinear inequality, linear equality, and set constraints. As an approximate primal solution, we take the minimizer of the Lagrangian, computed when evaluating the dual gradient. We derive error bounds for this approximate primal solution in terms of the errors of the dual variables, and establish convergence rates of the dual variables when the dual problem is solved using a projected gradient or fast gradient method. By combining these results, we show that the suboptimality and infeasibility of the approximate primal solution at iteration $k$ are no worse than $O(1/\sqrt{k})$ when the dual problem is solved using a projected gradient method, and $O(1/k)$ when a fast dual gradient method is used.
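
The abstract describes recovering an approximate primal solution as the Lagrangian minimizer computed while evaluating the dual gradient, with the dual problem solved by a projected (or fast) gradient method. Below is a minimal sketch of that scheme on a toy problem, minimizing a strongly convex quadratic subject to linear inequality constraints. The problem data, step size, and iteration count are illustrative assumptions, not taken from the paper; note that for this toy quadratic the dual gradient is globally Lipschitz, whereas the paper's analysis covers the harder case where it is only locally Lipschitz.

```python
import numpy as np

# Toy problem (assumed for illustration):
#     minimize   0.5 * ||x - c||^2
#     subject to A x <= b
# Lagrangian: L(x, lam) = 0.5*||x - c||^2 + lam^T (A x - b), lam >= 0.

rng = np.random.default_rng(0)
m, n = 5, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
c = rng.standard_normal(n)

def primal_minimizer(lam):
    """x(lam) = argmin_x L(x, lam); closed form for this quadratic objective."""
    return c - A.T @ lam

# Step size 1/L with L = ||A A^T||_2, valid for this toy problem only.
step = 1.0 / np.linalg.norm(A @ A.T, 2)
lam = np.zeros(m)

for k in range(1000):
    x = primal_minimizer(lam)                      # approximate primal iterate
    dual_grad = A @ x - b                          # gradient of the dual function
    lam = np.maximum(0.0, lam + step * dual_grad)  # projected dual gradient step

x = primal_minimizer(lam)
print("max constraint violation:", max(0.0, float(np.max(A @ x - b))))
print("objective value:", 0.5 * float(np.linalg.norm(x - c) ** 2))
```

In this setup the suboptimality and infeasibility of the recovered primal iterate would be the quantities bounded by the O(1/√k) rate for the projected dual gradient method; replacing the dual update with an accelerated (fast) gradient step would correspond to the O(1/k) rate stated in the abstract.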
