Gradient-Free Methods for Non-Smooth Convex Stochastic Optimization with Heavy-Tailed Noise on Convex Compact

Published 5 Apr 2023 in math.OC (arXiv:2304.02442v3)

Abstract: We present two easy-to-implement gradient-free/zeroth-order methods for optimizing a stochastic non-smooth function accessible only via a black-box. The methods are built upon efficient first-order methods for the heavy-tailed case, i.e., when the gradient noise has infinite variance but bounded $(1+\kappa)$-th moment for some $\kappa \in (0,1]$. The first algorithm is based on stochastic mirror descent with a particular class of uniformly convex mirror maps that is robust to heavy-tailed noise. The second algorithm is based on stochastic mirror descent combined with the gradient clipping technique. Additionally, for objective functions satisfying the $r$-growth condition, faster algorithms are proposed by combining these methods with the restart technique.
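To make the second approach concrete, below is a minimal illustrative sketch, not the paper's exact algorithm: a two-point randomized-smoothing gradient estimator paired with clipping and projected (Euclidean mirror) descent on a ball. The step size, clipping level, smoothing radius, ball radius, and the toy objective with heavy-tailed (Cauchy) value noise are all placeholder choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def zo_grad(f, x, tau=1e-2):
    """Two-point zeroth-order gradient estimate along a random unit
    direction (a standard randomized-smoothing construction; not
    necessarily the estimator used in the paper)."""
    e = rng.normal(size=x.shape)
    e /= np.linalg.norm(e)
    d = x.size
    return d * (f(x + tau * e) - f(x - tau * e)) / (2 * tau) * e

def clip(g, lam):
    """Shrink the estimate to norm at most lam; clipping is what gives
    robustness to heavy-tailed noise in the second method."""
    n = np.linalg.norm(g)
    return g if n <= lam else (lam / n) * g

def project_ball(x, R=1.0):
    """Euclidean projection onto the ball of radius R, standing in for
    the convex compact feasible set."""
    n = np.linalg.norm(x)
    return x if n <= R else (R / n) * x

def clipped_zo_mirror_descent(f, x0, steps=500, eta=0.05, lam=10.0):
    """Projected descent with clipped zeroth-order gradients -- the
    Euclidean instance of clipped stochastic mirror descent (eta and
    lam are illustrative constants, not the paper's tuned values)."""
    x = x0.copy()
    for _ in range(steps):
        g = clip(zo_grad(f, x), lam)
        x = project_ball(x - eta * g)
    return x

# Toy non-smooth convex objective whose evaluations carry heavy-tailed
# (Cauchy, hence infinite-variance) noise.
def noisy_abs(x):
    return np.sum(np.abs(x - 0.3)) + 0.01 * rng.standard_cauchy()

x_hat = clipped_zo_mirror_descent(noisy_abs, x0=np.zeros(3))
```

Without the `clip` step, a single Cauchy-sized noise realization can produce an arbitrarily large gradient estimate and destroy the iterate, which is the failure mode the paper's clipping-based method is designed to avoid.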
