Dual Averaging With Non-Strongly-Convex Prox-Functions: New Analysis and Algorithm (2504.03613v2)
Abstract: We present a new analysis and algorithm for dual-averaging-type (DA-type) methods applied to the composite convex optimization problem ${\min}_{x\in\mathbb{R}^n} \, f(\mathsf{A} x) + h(x)$, where $f$ is a convex and globally Lipschitz function, $\mathsf{A}$ is a linear operator, and $h$ is a ``simple'' convex function that serves as the prox-function in the DA-type methods. We open new avenues for analyzing and developing DA-type methods by going beyond the canonical setting, in which the prox-function $h$ is assumed to be strongly convex (on its domain). To that end, we identify two new sets of assumptions on $h$ (and also on $f$ and $\mathsf{A}$) and show that they hold broadly for many important classes of non-strongly-convex functions. Under the first set of assumptions, we show that the original DA method still has an $O(1/k)$ primal-dual convergence rate. Moreover, we analyze the affine invariance of this method and of its convergence rate. Under the second set of assumptions, we develop a new DA-type method with dual monotonicity and show that it also has an $O(1/k)$ primal-dual convergence rate. Finally, we consider the case where $f$ is only convex and Lipschitz on $\mathcal{C}:=\mathsf{A}(\mathsf{dom}\, h)$, and construct its globally convex and Lipschitz extension based on the Pasch-Hausdorff envelope. Furthermore, we characterize the sub-differential and Fenchel conjugate of this extension using the convex analytic objects associated with $f$ and $\mathcal{C}$.
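To make the problem setup concrete, here is a minimal sketch of the classical dual-averaging iteration for $\min_x f(\mathsf{A}x) + h(x)$ in the canonical setting that the paper generalizes, i.e., with a strongly convex prox-function. The specific choices below ($f(y) = \|y - b\|_1$, $h(x) = \tfrac{1}{2}\|x\|^2$, and the step-weight schedule) are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Sketch of the classical dual-averaging (DA) iteration for
#   min_x  f(A x) + h(x),
# in the canonical setting the paper goes beyond: h strongly convex.
# Illustrative choices (NOT from the paper):
#   f(y) = ||y - b||_1   (convex, globally Lipschitz),
#   h(x) = (1/2)||x||^2  (strongly convex prox-function),
# so the DA subproblem  argmin_x { <s_k, x> + (k+1) h(x) }
# has the closed form   x_{k+1} = -s_k / (k+1).

def dual_averaging(A, b, iters):
    """Run the DA iteration; return the averaged primal iterate."""
    n = A.shape[1]
    x = np.zeros(n)
    s = np.zeros(n)          # running sum of subgradients of f(A .)
    x_sum = np.zeros(n)
    for k in range(iters):
        g = A.T @ np.sign(A @ x - b)   # a subgradient of ||A x - b||_1 at x
        s += g
        x = -s / (k + 1)               # DA prox step with h = (1/2)||x||^2
        x_sum += x
    return x_sum / iters

# Tiny instance: A = I, b = (1, 1); the minimizer is x* = (1, 1)
# with optimal value ||A x* - b||_1 + (1/2)||x*||^2 = 1.0.
A = np.eye(2)
b = np.array([1.0, 1.0])
x_avg = dual_averaging(A, b, 500)
obj = np.abs(A @ x_avg - b).sum() + 0.5 * x_avg @ x_avg
```

The averaged iterate `x_avg` approaches $x^\star = (1,1)$, and the objective value approaches the optimum $1.0$, consistent with the $O(1/k)$-type primal-dual guarantees discussed in the abstract (which the paper extends to non-strongly-convex $h$).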