Fast Frank–Wolfe Algorithms with Adaptive Bregman Step-Size for Weakly Convex Functions (2504.04330v3)
Abstract: We propose a Frank–Wolfe (FW) algorithm with an adaptive Bregman step-size strategy for smooth adaptable (also known as relatively smooth) functions, which may be weakly convex. In this setting the gradient of the objective is not necessarily Lipschitz continuous; we only require the smooth adaptable property, so our assumptions are less restrictive than those of existing FW algorithms. We establish convergence guarantees ranging from sublinear to linear rates, depending on the assumptions, for both convex and nonconvex objective functions. When the objective is weakly convex and satisfies a local quadratic growth condition, we prove both local sublinear and local linear convergence in terms of the primal gap. We also propose a variant of the away-step FW algorithm that uses Bregman distances over polytopes. For convex optimization under the Hölder error bound condition, this variant attains faster global convergence (up to linear rates); for nonconvex optimization under the local quadratic growth condition, it converges locally at a linear rate. Numerical experiments demonstrate that our proposed FW algorithms outperform existing methods.
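The abstract only sketches the method, so the following is a minimal, hypothetical Python illustration of a Frank–Wolfe loop whose step-size minimizes a Bregman upper model, not the paper's actual algorithm. It assumes a fixed relative-smoothness constant `L` and a coarse grid search over the step-size, whereas the paper's strategy is adaptive; all names (`frank_wolfe_bregman`, `lmo`, `bregman_distance`) are invented for this sketch.

```python
import numpy as np

def bregman_distance(h, grad_h, y, x):
    """Bregman distance D_h(y, x) = h(y) - h(x) - <grad h(x), y - x>."""
    return h(y) - h(x) - grad_h(x) @ (y - x)

def frank_wolfe_bregman(grad_f, lmo, h, grad_h, x0, L0=1.0, max_iter=200, tol=1e-8):
    """Hypothetical FW loop with a Bregman-model step-size (a sketch, not the paper's method).

    grad_f : gradient of the objective f
    lmo    : linear minimization oracle, lmo(g) = argmin_{s in C} <g, s>
    h      : reference (kernel) function defining the Bregman distance
    grad_h : gradient of h
    """
    x, L = x0, L0
    for _ in range(max_iter):
        g = grad_f(x)
        s = lmo(g)                      # FW vertex
        d = s - x                       # FW direction
        gap = -g @ d                    # FW (dual) gap; stop when small
        if gap <= tol:
            break
        # Step-size from the smooth-adaptable surrogate
        #   m(t) = t * <g, d> + L * D_h(x + t d, x)  over t in [0, 1],
        # minimized here by a coarse grid search; a real implementation
        # would use a line search and adapt L by backtracking.
        ts = np.linspace(0.0, 1.0, 101)
        vals = [t * (g @ d) + L * bregman_distance(h, grad_h, x + t * d, x) for t in ts]
        t = ts[int(np.argmin(vals))]
        x = x + t * d
    return x

# Example: minimize f(x) = 0.5 ||A x - b||^2 over the probability simplex,
# with the Euclidean kernel h(x) = 0.5 ||x||^2 (so D_h(y, x) = 0.5 ||y - x||^2
# and relative smoothness reduces to ordinary L-smoothness).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]   # simplex vertex e_i, i = argmin_i g_i
h = lambda x: 0.5 * x @ x
grad_h = lambda x: x
x_star = frank_wolfe_bregman(grad_f, lmo, h, grad_h,
                             x0=np.ones(5) / 5,
                             L0=np.linalg.norm(A, 2) ** 2)
```

The point of the Bregman model is that when the kernel `h` is chosen to match the objective's geometry (rather than the Euclidean one used above for simplicity), the surrogate upper-bounds f even when its gradient is not Lipschitz, which is the smooth adaptable setting the paper targets.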