Descent modulus and applications
Abstract
The norm of the gradient ‖∇f(x)‖ measures the maximum descent of a real-valued smooth function f at x. For (nonsmooth) convex functions, this is expressed by the distance dist(0, ∂f(x)) of the subdifferential to the origin, while for general real-valued functions defined on metric spaces by the notion of metric slope |∇f|(x). In this work we propose an axiomatic definition of descent modulus T[f](x) of a real-valued function f at every point x, defined on a general (not necessarily metric) space. The definition encompasses all above instances as well as average descents for functions defined on probability spaces. We show that a large class of functions are completely determined by their descent modulus and corresponding critical values. This result is already surprising in the smooth case: a one-dimensional information (norm of the gradient) turns out to be almost as powerful as the knowledge of the full gradient mapping. In the nonsmooth case, the key element for this determination result is the break of symmetry induced by a downhill orientation, in the spirit of the definition of the metric slope. The particular case of functions defined on finite spaces is studied in the last section. In this case, we obtain an explicit classification of descent operators that are, in some sense, typical. (c) 2024 The Authors. Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
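For reference, the three descent measures the abstract alludes to admit the following standard definitions; this is a sketch using conventional notation, not quoted from the paper itself.

```latex
% Descent measures referenced in the abstract (standard definitions; notation assumed).

% Smooth case: the maximal local descent rate of f at x is the gradient norm.
\[
  \|\nabla f(x)\|
\]

% Convex (possibly nonsmooth) case: distance of the subdifferential to the origin.
\[
  \operatorname{dist}\bigl(0,\partial f(x)\bigr)
    = \inf_{x^{*}\in\partial f(x)} \|x^{*}\|
\]

% General metric-space case: De Giorgi's metric slope of f at a non-isolated point x.
\[
  |\nabla f|(x)
    = \limsup_{y\to x}\,\frac{\bigl(f(x)-f(y)\bigr)^{+}}{d(x,y)}
\]
```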
More information
Title according to WOS: Descent modulus and applications
Journal: JOURNAL OF FUNCTIONAL ANALYSIS
Volume: 287
Issue: 11
Publisher: ACADEMIC PRESS INC ELSEVIER SCIENCE
Publication date: 2024
DOI: 10.1016/j.jfa.2024.110626
Notes: ISI