
Gradient descent in a generalised Bregman distance framework

Published 8 Dec 2016 in math.OC (arXiv:1612.02506v2)

Abstract: We discuss a special form of gradient descent that has become known in the literature as the linearised Bregman iteration. The idea is to replace the classical (squared) two-norm metric in the gradient descent setting with a generalised Bregman distance based on a more general proper, convex and lower semi-continuous functional. Gradient descent and the entropic mirror descent of Nemirovsky and Yudin are special cases, as is a specific form of non-linear Landweber iteration introduced by Bachmayr and Burger. We analyse the linearised Bregman iteration in a setting where the functional we want to minimise is neither necessarily Lipschitz-continuous (in the classical sense) nor necessarily convex, and establish a global convergence result under the additional assumption that this functional satisfies the Kurdyka-Łojasiewicz property.
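The iteration described in the abstract can be sketched in a few lines: keep a dual variable p in the subdifferential of the Bregman functional J, take a gradient step on the objective E in the dual variable, then map back to the primal via the gradient of the convex conjugate J*. The sketch below is illustrative only and is not taken from the paper; the function names, the toy objective, and the step size are assumptions for the example. Choosing J as the squared two norm recovers classical gradient descent, and choosing J as the negative entropy on the probability simplex recovers entropic mirror descent, the two special cases the abstract mentions.

```python
import numpy as np

def linearised_bregman(grad_E, x0, tau, num_iters, grad_J, grad_J_conj):
    """Hedged sketch of a linearised Bregman iteration (not the paper's code):
    dual update p <- p - tau * grad E(x), primal update x = grad J*(p)."""
    x = x0
    p = grad_J(x0)                 # initial subgradient p^0 in dJ(x^0)
    for _ in range(num_iters):
        p = p - tau * grad_E(x)    # gradient step in the dual variable
        x = grad_J_conj(p)         # map back to the primal space
    return x

# Special case 1: J(x) = 0.5 * ||x||^2, so grad J and grad J* are the identity,
# and the iteration reduces to classical gradient descent.
identity = lambda x: x

# Special case 2: J(x) = sum_i x_i log x_i (negative entropy on the simplex),
# grad J(x) = 1 + log x, and grad J*(p) is a softmax, giving entropic mirror descent.
def grad_entropy(x):
    return 1.0 + np.log(x)

def grad_entropy_conj(p):
    w = np.exp(p - p.max())        # stabilised exponentiation
    return w / w.sum()             # normalise back onto the simplex

# Toy smooth objective E(x) = 0.5 * ||x - b||^2 with gradient x - b (an assumption).
b = np.array([0.2, 0.5, 0.3])
grad_E = lambda x: x - b

x_gd = linearised_bregman(grad_E, np.zeros(3), 0.5, 100, identity, identity)
x_md = linearised_bregman(grad_E, np.full(3, 1.0 / 3.0), 0.5, 200,
                          grad_entropy, grad_entropy_conj)
```

Since b lies in the interior of the simplex, both special cases drive the iterate towards b; the entropic variant additionally keeps every iterate on the simplex by construction.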
