
On the fast convergence of random perturbations of the gradient flow

Published 2 Jun 2017 in math.PR (arXiv:1706.00837v3)

Abstract: We consider in this work small random perturbations (of multiplicative noise type) of the gradient flow. We prove that under mild conditions, when the potential function is a Morse function satisfying an additional strong saddle condition, the perturbed gradient flow converges to a neighborhood of the local minimizers in $O(\ln(\varepsilon^{-1}))$ time on average, where $\varepsilon$ is the scale of the random perturbation. Under a change of time scale, this indicates that for the diffusion process that approximates the stochastic gradient method, it takes (up to a logarithmic factor) only linear time in the inverse stepsize to escape from all saddle points. This can be regarded as a manifestation of the fast convergence of the discrete-time stochastic gradient method, which is used heavily in modern statistical machine learning.
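
As a rough illustration of the regime the abstract describes (not taken from the paper), the following Python sketch simulates an additive-noise simplification $dX_t = -\nabla f(X_t)\,dt + \sqrt{\varepsilon}\,dW_t$ of the perturbed gradient flow via the Euler-Maruyama scheme, whereas the paper treats multiplicative noise $\sigma(X_t)$. The test potential $f(x,y) = x^4/4 - x^2/2 + y^2/2$ is a Morse function with a strict saddle at the origin and minimizers at $(\pm 1, 0)$; this potential, the function names, and all parameters are illustrative choices, not the paper's setup. The printed escape times should grow only logarithmically as $\varepsilon$ shrinks.

    # Minimal sketch: Euler-Maruyama simulation of the perturbed gradient flow
    # dX_t = -grad f(X_t) dt + sqrt(eps) dW_t (additive-noise simplification)
    # on f(x, y) = x^4/4 - x^2/2 + y^2/2, a Morse function with a strict
    # saddle at the origin and minimizers at (+-1, 0).
    import numpy as np

    def grad_f(p):
        x, y = p
        return np.array([x**3 - x, y])

    def escape_time(eps, dt=1e-3, tol=0.1, t_max=200.0, rng=None):
        """Time for the perturbed flow, started at the saddle, to reach a
        tol-neighborhood of either minimizer (+-1, 0)."""
        rng = rng or np.random.default_rng()
        p = np.zeros(2)  # start exactly at the saddle point
        t = 0.0
        while t < t_max:
            noise = np.sqrt(eps * dt) * rng.standard_normal(2)
            p = p - dt * grad_f(p) + noise  # Euler-Maruyama step
            t += dt
            if min(np.linalg.norm(p - [1, 0]), np.linalg.norm(p + [1, 0])) < tol:
                return t
        return t_max

    # Mean escape times should grow roughly like ln(1/eps) as eps shrinks.
    for eps in [1e-2, 1e-4, 1e-6]:
        times = [escape_time(eps, rng=np.random.default_rng(s)) for s in range(20)]
        print(f"eps = {eps:.0e}:  mean escape time ~ {np.mean(times):.2f}")

Starting exactly at the saddle is the worst case: the drift vanishes there, so the escape is driven entirely by the small noise pushing the process along the unstable direction, which is precisely the mechanism behind the $O(\ln(\varepsilon^{-1}))$ bound.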

Citations (15)
