A Proximal Method for Composite Optimization with Smooth and Convex Components
Abstract: We introduce prox-convex, a method for minimizing $F(x)=g(x)+h(C(x))+s(R(x))$, where $g$ and $h$ are convex, $C$ and $s$ are smooth, and each component of $R$ is convex (possibly nonsmooth). Here $g$ captures general convex objectives and indicator functions of convex constraints, while the composite template simultaneously models convex penalties on smooth features ($h \circ C$) and smooth couplings of convex, possibly nonsmooth, features ($s \circ R$). Each prox-convex step forms a convex subproblem by linearizing only the smooth maps while preserving the existing convex structure. The subproblem is made strongly convex by the proximal metric $Q_k=\mu_k I+H_k \succ 0$, where $\mu_k$ is adapted via an implicit trust-region strategy and $H_k \succeq 0$ is an optional curvature term for local acceleration. Under mild Lipschitz/smoothness assumptions and a per-coordinate monotone-or-smooth condition, we prove subdifferential regularity, derive two-sided quadratic model-error bounds with explicit constants, and obtain sufficient decrease with $O(\varepsilon^{-2})$ complexity for driving the norm of the metric prox-gradient below $\varepsilon$. Furthermore, a local error-bound condition on $F$ yields a metric step-size error bound and hence local $Q$-linear convergence of the function values. Using the Taylor-like model framework of Drusvyatskiy, Ioffe, and Lewis, we show that every cluster point of the iterates is limiting-stationary; under our regularity conditions, this further implies Fréchet stationarity. The same framework establishes robustness to inexact subproblem solves and justifies a model-decrease termination rule.
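To illustrate the kind of step the abstract describes, here is a minimal sketch (not the paper's implementation) of one linearize-then-solve iteration for the special case $F(x)=|C(x)|$, i.e. $g=0$, no $s\circ R$ term, $h=|\cdot|$, and proximal metric $Q_k=\mu I$ with fixed $\mu$ (no trust-region adaptation or curvature term $H_k$). In this case the strongly convex subproblem $\min_d |C(x)+\nabla C(x)^\top d| + \tfrac{\mu}{2}\|d\|^2$ has a closed-form solution via soft-thresholding; the function names and the toy map $C$ are illustrative choices, not from the paper.

```python
import numpy as np

def soft_threshold(a, tau):
    """Prox of tau*|.| evaluated at a."""
    return np.sign(a) * max(abs(a) - tau, 0.0)

def prox_convex_step(x, C, J, mu):
    """One step for F(x) = |C(x)| with scalar smooth C.

    Solves the convex subproblem
        min_d |C(x) + J(x)^T d| + (mu/2)||d||^2
    exactly: writing s = C(x) + J(x)^T d, the optimal s is a
    soft-thresholding of C(x), and the minimal-norm d realizing it
    lies along J(x).
    """
    a, b = C(x), J(x)
    bb = b @ b
    s = soft_threshold(a, bb / mu)   # optimal linearized residual
    d = (s - a) * b / bb             # minimal-norm step achieving it
    return x + d

# Toy instance: drive |C(x)| to zero for C(x) = x0^2 + x1 - 1.
C = lambda x: x[0] ** 2 + x[1] - 1.0
J = lambda x: np.array([2.0 * x[0], 1.0])

x = np.array([2.0, 2.0])
for _ in range(20):
    x = prox_convex_step(x, C, J, mu=1.0)
print(abs(C(x)))  # residual after 20 steps; decreases monotonically here
```

Because the smooth map $C$ is linearized while the convex outer function $h=|\cdot|$ is kept intact, each subproblem is convex even though $F$ itself is nonconvex; the $\tfrac{\mu}{2}\|d\|^2$ term is what the abstract's metric $Q_k$ reduces to in this stripped-down setting.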