Nonlinearly preconditioned gradient flows (2511.20370v1)
Abstract: We study a continuous-time dynamical system which arises as the limit of a broad class of nonlinearly preconditioned gradient methods. Under mild assumptions, we establish existence of global solutions and derive Lyapunov-based convergence guarantees. For convex costs, we prove sublinear decay in a geometry induced by a reference function, and under a generalized gradient-dominance condition we obtain exponential convergence. We further uncover a duality connection with mirror descent, and use it to establish that the flow of interest solves an infinite-horizon optimal-control problem whose value function is the Bregman divergence generated by the cost. These results clarify the structure and optimization behavior of nonlinearly preconditioned gradient flows and connect them to known continuous-time models in non-Euclidean optimization.
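To make the object of study concrete, the following is a minimal numerical sketch of a nonlinearly preconditioned gradient flow, assuming the form x'(t) = -∇φ*(∇f(x(t))), where φ* is the convex conjugate of a reference function φ. The specific preconditioner below (a bounded, normalized gradient map) and the quadratic cost are hypothetical illustrative choices, not the paper's exact system; the flow is integrated with forward Euler.

```python
import numpy as np

def grad_f(x):
    # Gradient of an illustrative convex cost f(x) = 0.5 * ||x||^2.
    return x

def precond(g):
    # Hypothetical nonlinear preconditioner grad phi*(g) = g / sqrt(1 + ||g||^2):
    # near the optimum it behaves like the identity (plain gradient flow),
    # while for large gradients it caps the velocity at unit norm.
    return g / np.sqrt(1.0 + np.dot(g, g))

def flow(x0, dt=0.01, steps=5000):
    # Forward-Euler discretization of x'(t) = -precond(grad_f(x(t))).
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x - dt * precond(grad_f(x))
    return x

x_final = flow([3.0, -4.0])
print(np.linalg.norm(x_final))  # approaches 0 as the flow converges
```

With this choice the trajectory first moves at roughly unit speed while the gradient is large, then decays exponentially once the gradient is small, mirroring the two convergence regimes (sublinear for convex costs, exponential under gradient dominance) discussed in the abstract.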