
Descent modulus and applications

Published 21 Nov 2022 in math.CA, math.OC, and math.PR | arXiv:2211.11819v1

Abstract: The norm of the gradient $\|\nabla f(x)\|$ measures the maximum descent of a real-valued smooth function $f$ at $x$. For (nonsmooth) convex functions, this is expressed by the distance $\mathrm{dist}(0, \partial f(x))$ of the subdifferential to the origin, while for general real-valued functions defined on metric spaces it is captured by the metric slope $|\nabla f|(x)$. In this work we propose an axiomatic definition of the descent modulus $Tf$ of a real-valued function $f$ at every point $x$, defined on a general (not necessarily metric) space. The definition encompasses all of the above instances, as well as average descents for functions defined on probability spaces. We show that a large class of functions is completely determined by its descent modulus and the corresponding critical values. This result is already surprising in the smooth case: a one-dimensional piece of information (the norm of the gradient) turns out to be almost as powerful as knowledge of the full gradient mapping. In the nonsmooth case, the key element of this determination result is the breaking of symmetry induced by a downhill orientation, in the spirit of the definition of the metric slope. The particular case of functions defined on finite spaces is studied in the last section, where we obtain an explicit classification of the descent operators that are, in some sense, typical.
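For context, the metric slope invoked in the abstract is the standard notion from metric analysis; the following sketch recalls its definition and the two specializations the abstract alludes to (these are well-known identities, not statements taken from the paper itself):

```latex
% Metric slope of f : X -> R at a non-isolated point x of a metric space (X, d):
|\nabla f|(x) \;=\; \limsup_{y \to x} \frac{\max\{\, f(x) - f(y),\, 0 \,\}}{d(x, y)}.

% For f smooth on R^n, the slope recovers the gradient norm:
|\nabla f|(x) \;=\; \|\nabla f(x)\|.

% For f convex and lower semicontinuous on a Hilbert space, it equals the
% distance from the origin to the subdifferential:
|\nabla f|(x) \;=\; \operatorname{dist}\bigl(0,\, \partial f(x)\bigr).
```

The numerator $\max\{f(x)-f(y),0\}$ only records decreases of $f$ near $x$; this built-in downhill orientation is the "break of symmetry" the abstract refers to.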

Citations (3)
