
Perturbed Iterate SGD for Lipschitz Continuous Loss Functions

Published 17 Mar 2020 in math.OC (arXiv:2003.07606v5)

Abstract: This paper presents an extension of stochastic gradient descent for the minimization of Lipschitz continuous loss functions. Our motivation is for use in non-smooth non-convex stochastic optimization problems, which are frequently encountered in applications such as machine learning. Using the Clarke $\epsilon$-subdifferential, we prove non-asymptotic convergence to an approximate stationary point in expectation for the proposed method. From this result, we develop a method with non-asymptotic convergence with high probability, as well as a method with asymptotic convergence to a Clarke stationary point almost surely. Our results hold under the assumption that the stochastic loss function is a Carathéodory function which is almost everywhere Lipschitz continuous in the decision variables. To the best of our knowledge, this is the first non-asymptotic convergence analysis under these minimal assumptions.
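
The abstract does not spell out the update rule, so the following is a minimal sketch of a perturbed-iterate SGD loop under one plausible reading: the stochastic gradient is evaluated at a point drawn uniformly from a small ball around the current iterate, which is consistent with the Clarke $\epsilon$-subdifferential framing. The names `perturbed_iterate_sgd` and `stoch_grad`, and the specific sampling scheme, are illustrative assumptions, not the paper's stated algorithm.

```python
import numpy as np

def perturbed_iterate_sgd(stoch_grad, x0, step_sizes, perturb_radii, rng=None):
    """Sketch of perturbed-iterate SGD (assumed form, not the paper's exact method).

    At each iteration a stochastic gradient is evaluated at a point sampled
    uniformly from a ball of radius eps around the current iterate, so the
    update direction is associated with a nearby point rather than the
    (possibly non-differentiable) iterate itself.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for alpha, eps in zip(step_sizes, perturb_radii):
        # Sample uniformly from the ball of radius eps: Gaussian direction,
        # radius scaled by U^(1/d) for dimension d.
        u = rng.standard_normal(x.shape)
        u *= eps * rng.uniform() ** (1.0 / x.size) / np.linalg.norm(u)
        g = stoch_grad(x + u)   # stochastic gradient at the perturbed point
        x = x - alpha * g       # SGD step taken from the unperturbed iterate
    return x

# Example usage on a non-smooth stochastic objective f(x, xi) = ||x - xi||_1
# with xi ~ N(0, I); np.sign serves as an (almost everywhere) gradient.
rng = np.random.default_rng(0)
stoch_grad = lambda x: np.sign(x - rng.standard_normal(x.shape))
x_final = perturbed_iterate_sgd(
    stoch_grad,
    x0=np.ones(5),
    step_sizes=[1e-2] * 5000,
    perturb_radii=[1e-3] * 5000,
    rng=rng,
)
```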
