Achieving optimal complexity guarantees for a class of bilevel convex optimization problems
Abstract: We design and analyze a novel accelerated gradient-based algorithm for a class of bilevel optimization problems. These problems arise in various applications in machine learning and image processing in which the optimal solutions of the two levels are interdependent: the optimal solution of the upper-level problem is constrained to lie in the solution set of a lower-level optimization problem. We significantly improve the existing iteration complexity to $\mathcal{O}(\epsilon^{-0.5})$ for both the suboptimality and infeasibility error metrics, where $\epsilon>0$ denotes the target accuracy. In addition, unlike existing methods that proceed sequentially (first running one algorithm to approximate a solution of the lower-level problem, then running a second algorithm on the upper level), our algorithm solves both levels concurrently. To the best of our knowledge, the proposed algorithm has the fastest known iteration complexity, which matches the optimal complexity for single-level convex optimization. We conduct numerical experiments on sparse linear regression problems to demonstrate the efficacy of our approach.
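To make the setting concrete, the problem class in question is the standard simple bilevel problem $\min_{x} f(x)$ subject to $x \in \arg\min_{y} g(y)$, where $f$ is the upper-level objective and $g$ the lower-level objective. Below is a minimal, self-contained sketch of a sparse-linear-regression instance of this class; the data, parameters, and solver are illustrative assumptions, and the method shown is classical Tikhonov-style regularization solved by proximal gradient (ISTA), not the paper's concurrent accelerated algorithm.

```python
import numpy as np

# Hypothetical instance of the simple bilevel problem class from the abstract:
#   lower level:  g(x) = 0.5 * ||A x - b||^2   (underdetermined least squares)
#   upper level:  f(x) = ||x||_1               (select a sparse lower-level solution)
# Minimal sketch, NOT the paper's algorithm: Tikhonov-style regularization,
# solving min_x g(x) + sigma * f(x) for a small sigma via proximal gradient
# (ISTA). As sigma -> 0, the iterates approach a sparse solution of the
# lower-level problem.

rng = np.random.default_rng(0)
m, n = 30, 100                       # m < n: the lower level has many solutions
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:5] = rng.standard_normal(5)  # sparse ground truth
b = A @ x_true                       # consistent system, so min g = 0

sigma = 1e-3                         # small weight on the upper-level objective
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad g
step = 1.0 / L

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
for _ in range(5000):
    grad = A.T @ (A @ x - b)         # gradient of the lower-level objective
    x = soft_threshold(x - step * grad, step * sigma)

print("infeasibility ||Ax - b||:", np.linalg.norm(A @ x - b))
print("upper-level   ||x||_1   :", np.linalg.norm(x, 1))
```

Running this sketch drives the lower-level infeasibility $\|Ax-b\|$ toward zero while keeping $\|x\|_1$ small; shrinking $\sigma$ trades upper-level suboptimality against lower-level infeasibility, which is precisely the tension the paper's $\mathcal{O}(\epsilon^{-0.5})$ guarantee controls for both error metrics simultaneously.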