
Gradient-Free Methods for Non-Smooth Convex Stochastic Optimization with Heavy-Tailed Noise on Convex Compact (2304.02442v3)

Published 5 Apr 2023 in math.OC

Abstract: We present two easy-to-implement gradient-free/zeroth-order methods for optimizing a stochastic non-smooth function accessible only through a black-box oracle. The methods build upon efficient first-order methods for the heavy-tailed case, i.e., when the gradient noise has infinite variance but a bounded $(1+\kappa)$-th moment for some $\kappa \in (0,1]$. The first algorithm is based on stochastic mirror descent with a particular class of uniformly convex mirror maps that is robust to heavy-tailed noise. The second algorithm is based on stochastic mirror descent combined with the gradient clipping technique. Additionally, for objective functions satisfying the $r$-growth condition, faster algorithms are obtained from these methods via the restart technique.
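To make the setting concrete, below is a minimal sketch of the building blocks behind the second approach: a two-point randomized zeroth-order gradient estimator combined with gradient clipping and a projection step onto a Euclidean ball (stochastic mirror descent with the Euclidean mirror map reduces to projected gradient descent). This is not the paper's exact algorithm; the function name `zo_clipped_md`, the step size, smoothing radius, and clipping level are illustrative assumptions, and the paper's uniformly convex mirror maps, iterate averaging, and restart schedule are omitted.

```python
# Minimal sketch (not the paper's exact algorithm): two-point zeroth-order
# gradient estimation + clipping + projection onto a Euclidean ball.
# All constants below are illustrative assumptions.
import numpy as np

def zo_clipped_md(f, x0, radius, n_steps=1000, step=0.01,
                  smoothing=1e-3, clip_level=10.0, rng=None):
    """f: stochastic black-box oracle returning a noisy function value at x."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    d = x.size
    for _ in range(n_steps):
        # Two-point estimate of the gradient of a randomized-smoothing surrogate:
        # g = d/(2*tau) * (f(x + tau*e) - f(x - tau*e)) * e, e uniform on the sphere.
        e = rng.standard_normal(d)
        e /= np.linalg.norm(e)
        g = (d / (2.0 * smoothing)) * (f(x + smoothing * e) - f(x - smoothing * e)) * e
        # Clipping: shrink heavy-tailed gradient estimates to a fixed norm level.
        g_norm = np.linalg.norm(g)
        if g_norm > clip_level:
            g *= clip_level / g_norm
        # Euclidean mirror-descent step, then project back onto the convex
        # compact feasible set (a ball of the given radius here).
        x = x - step * g
        x_norm = np.linalg.norm(x)
        if x_norm > radius:
            x *= radius / x_norm
    return x
```

In the heavy-tailed regime the abstract describes, the clipping step is what keeps the update bounded even though the raw gradient estimates may have infinite variance; the clipping level and step size would in practice be tuned to the $(1+\kappa)$-th moment bound.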
