
Stochastic smoothing accelerated gradient method for nonsmooth convex composite optimization (2308.01252v2)

Published 2 Aug 2023 in math.OC

Abstract: We propose a novel stochastic smoothing accelerated gradient (SSAG) method for general constrained nonsmooth convex composite optimization and analyze its convergence rates. The SSAG method accommodates various smoothing techniques and can handle nonsmooth terms whose proximal mapping is not easy to compute or that lack a linear max structure. To the best of our knowledge, it is the first stochastic approximation-type method with solid convergence guarantees for convex composite optimization problems whose nonsmooth term is the maximum of numerous nonlinear convex functions. We prove that the SSAG method achieves the best-known complexity bounds in terms of the stochastic first-order oracle ($\mathcal{SFO}$), using either diminishing smoothing parameters or a fixed smoothing parameter. We give two applications of our results to distributionally robust optimization problems, and numerical results on both applications demonstrate the effectiveness and efficiency of the proposed SSAG method.
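
To make the idea in the abstract concrete, here is a minimal illustrative sketch (not the paper's actual SSAG algorithm or parameter choices): it smooths a max of nonlinear convex pieces with log-sum-exp smoothing, uses a diminishing smoothing parameter, and takes Nesterov-style accelerated steps driven by a minibatch stochastic first-order oracle for the smooth part. The toy problem data, the quadratic pieces g_i, and the step-size and smoothing schedules are all assumptions made for illustration.

```python
# Illustrative sketch of "smoothing + accelerated stochastic gradient" for
# min_x f(x) + h(x), with h(x) = max_i g_i(x) a max of nonlinear convex pieces.
# This is NOT the paper's SSAG method; schedules and constants are assumed.
import numpy as np

rng = np.random.default_rng(0)
d, n, m = 20, 500, 30                  # dimension, data size, number of pieces in the max
A = rng.standard_normal((n, d))        # data for the smooth part f
b = rng.standard_normal(n)
C = rng.standard_normal((m, d))        # data for the nonsmooth max term h
e = rng.standard_normal(m)

def stoch_grad_f(x, batch=32):
    """Minibatch stochastic gradient of f(x) = (1/2n) * ||A x - b||^2 (the SFO)."""
    idx = rng.integers(0, n, size=batch)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / batch

def grad_smoothed_max(x, mu):
    """Gradient of the log-sum-exp smoothing of h(x) = max_i 0.5*(c_i^T x + e_i)^2.
    Each piece is a nonlinear (quadratic) convex function, so h has no linear max structure."""
    vals = 0.5 * (C @ x + e) ** 2                  # g_i(x)
    w = np.exp((vals - vals.max()) / mu)
    w /= w.sum()                                   # softmax weights of the smoothed max
    grads = (C @ x + e)[:, None] * C               # row i: grad g_i(x)
    return grads.T @ w

L_f = np.linalg.eigvalsh(A.T @ A / n).max()        # Lipschitz constant of grad f

x_prev = x = np.zeros(d)
for k in range(1, 2001):
    mu_k = 1.0 / np.sqrt(k + 1)                    # diminishing smoothing parameter
    y = x + (k - 1) / (k + 2) * (x - x_prev)       # Nesterov extrapolation point
    g = stoch_grad_f(y) + grad_smoothed_max(y, mu_k)
    step = 1.0 / (L_f + 1.0 / mu_k)                # step matched to the smoothed curvature (1/mu proxy)
    x_prev, x = x, y - step * g
```

In this sketch the effective smoothness of the surrogate grows like 1/mu_k, so the step size shrinks as the smoothing parameter decreases; the paper's analysis covers both this diminishing-parameter regime and a fixed smoothing parameter, with complexity measured in SFO calls.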

Citations (1)
