
Convergence analysis of approximate primal solutions in dual first-order methods

Published 23 Feb 2015 in math.OC (arXiv:1502.06368v1)

Abstract: Dual first-order methods are powerful techniques for large-scale convex optimization. Although an extensive research effort has been devoted to studying their convergence properties, explicit convergence rates for the primal iterates have only been established under global Lipschitz continuity of the dual gradient. This is a rather restrictive assumption that does not hold for several important classes of problems. In this paper, we demonstrate that primal convergence rate guarantees can also be obtained when the dual gradient is only locally Lipschitz. The class of problems that we analyze admits general convex constraints including nonlinear inequality, linear equality, and set constraints. As an approximate primal solution, we take the minimizer of the Lagrangian, computed when evaluating the dual gradient. We derive error bounds for this approximate primal solution in terms of the errors of the dual variables, and establish convergence rates of the dual variables when the dual problem is solved using a projected gradient or fast gradient method. By combining these results, we show that the suboptimality and infeasibility of the approximate primal solution at iteration $k$ are no worse than $O(1/\sqrt{k})$ when the dual problem is solved using a projected gradient method, and $O(1/k)$ when a fast dual gradient method is used.
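The core loop described in the abstract — evaluate the dual gradient by minimizing the Lagrangian, take a projected dual gradient step, and use the Lagrangian minimizer as the approximate primal solution — can be sketched on a toy problem. The instance below (a strongly convex quadratic with linear inequality constraints) is a hypothetical illustration, not an example from the paper; here the dual gradient happens to be globally Lipschitz, so it shows the mechanics rather than the paper's weaker local-Lipschitz setting.

```python
import numpy as np

# Toy instance (illustrative only): minimize 0.5*||x||^2 subject to A x <= b.
# The Lagrangian minimizer x*(lam) = argmin_x 0.5*||x||^2 + lam @ (A x - b)
# has the closed form x*(lam) = -A.T @ lam, and the dual gradient is
# A x*(lam) - b, evaluated as a by-product of computing x*(lam).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
b = -np.ones(3)  # x = 0 is infeasible, so the constraints are active

def primal_from_dual(lam):
    # Lagrangian minimizer: the approximate primal solution used in the paper.
    return -A.T @ lam

# Projected dual gradient method with step size 1/L, where L = ||A||_2^2 is a
# Lipschitz constant of the dual gradient (valid because the primal objective
# is 1-strongly convex).
L = np.linalg.norm(A, 2) ** 2
lam = np.zeros(3)
for k in range(2000):
    x = primal_from_dual(lam)
    lam = np.maximum(lam + (A @ x - b) / L, 0.0)  # project onto lam >= 0

x = primal_from_dual(lam)
print("max constraint violation:", max((A @ x - b).max(), 0.0))
```

The abstract's guarantee for this scheme is that both the suboptimality and the infeasibility of `x` decay like $O(1/\sqrt{k})$; replacing the plain projected step with a fast (Nesterov-type) dual gradient step improves this to $O(1/k)$.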


Authors (2)
