Optimal lower Lipschitz bounds for ReLU layers, saturation, and phase retrieval
Published 14 Feb 2025 in cs.LG, cs.NA, math.FA, and math.NA | (2502.09898v1)
Abstract: The injectivity of ReLU layers in neural networks, the recovery of vectors from clipped or saturated measurements, and (real) phase retrieval in $\mathbb{R}^n$ allow for a similar problem formulation and characterization using frame theory. In this paper, we revisit all three problems with a unified perspective and derive lower Lipschitz bounds for ReLU layers and clipping which are analogous to the previously known result for phase retrieval and are optimal up to a constant factor.
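The shared formulation mentioned in the abstract can be sketched as follows: each problem applies a different scalar nonlinearity to the frame coefficients $\langle a_i, x \rangle$ of a vector $x$. This is a minimal illustrative sketch, not the paper's construction; the matrix, dimensions, and clipping threshold below are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random frame for R^n: the m rows of A are the measurement
# vectors a_i, with m > n (an arbitrary illustrative choice).
n, m = 4, 12
A = rng.standard_normal((m, n))
x = rng.standard_normal(n)

coeffs = A @ x  # frame coefficients <a_i, x>

# The three measurement maps share the form f(Ax) for a scalar f:
relu = np.maximum(coeffs, 0.0)       # ReLU layer (bias omitted)
clipped = np.clip(coeffs, -1.0, 1.0) # clipped / saturated measurements
phase = np.abs(coeffs)               # real phase retrieval: |<a_i, x>|
```

The lower Lipschitz bounds studied in the paper quantify how well each of these nonlinear maps separates distinct inputs, which in turn governs injectivity and stable recovery.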