
Methodology and first-order algorithms for solving nonsmooth and non-strongly convex bilevel optimization problems (2212.09843v1)

Published 19 Dec 2022 in math.OC

Abstract: Simple bilevel problems are optimization problems in which we want to find an optimal solution to an inner problem that minimizes an outer objective function. Such problems appear in many machine learning and signal processing applications as a way to eliminate undesirable solutions. However, since these problems do not satisfy regularity conditions, they are often hard to solve exactly and are usually solved via iterative regularization. In the past few years, several algorithms have been proposed to solve these bilevel problems directly and provide a rate for obtaining feasibility, assuming that the outer function is strongly convex. In our work, we suggest a new approach designed for bilevel problems with simple outer functions, such as the $l_1$ norm, which are not required to be either smooth or strongly convex. In our new ITerative Approximation and Level-set EXpansion (ITALEX) approach, we alternate between expanding the level-set of the outer function and approximately optimizing the inner problem over this level-set. We show that optimizing the inner function through first-order methods such as proximal gradient and generalized conditional gradient results in a feasibility convergence rate of $O(1/k)$, which up to now was a rate only achieved by bilevel algorithms for smooth and strongly convex outer functions. Moreover, we prove an $O(1/\sqrt{k})$ rate of convergence for the outer function, in contrast to existing methods, which only provide asymptotic guarantees. We demonstrate this performance through numerical experiments.
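For concreteness, the simple bilevel problem described in the abstract can be written as follows (the notation here is ours, not taken verbatim from the paper): $\min_{x \in \mathbb{R}^n} \; \omega(x) \quad \text{subject to} \quad x \in \arg\min_{y \in \mathbb{R}^n} f(y),$ where $f$ is the inner objective and $\omega$ is the outer objective, which may be nonsmooth and non-strongly convex, e.g. $\omega = \|\cdot\|_1$.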
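The abstract only sketches the ITALEX idea at a high level, so the following Python snippet is a minimal, hypothetical illustration of that alternation pattern, not the authors' algorithm: it takes $\omega = \|\cdot\|_1$ and a least-squares inner objective, runs generalized conditional-gradient (Frank-Wolfe) steps over the current level-set $\{x : \|x\|_1 \le \tau\}$, and then expands the level parameter. The function names, expansion rule, step sizes, and iteration counts are placeholders we chose for illustration and are not the schedule analyzed in the paper.

import numpy as np

def frank_wolfe_l1_ball(grad_f, x, tau, n_steps):
    """Approximately minimize the inner objective over the level set
    {x : ||x||_1 <= tau} with conditional-gradient (Frank-Wolfe) steps."""
    for t in range(n_steps):
        g = grad_f(x)
        i = np.argmax(np.abs(g))          # coordinate selected by the linear minimization oracle
        s = np.zeros_like(x)
        s[i] = -tau * np.sign(g[i])       # vertex of the l1 ball minimizing <g, s>
        gamma = 2.0 / (t + 2.0)           # standard open-loop step size
        x = (1 - gamma) * x + gamma * s
    return x

def italex_style_sketch(A, b, n_outer=50, n_inner=100, tau0=0.1, expansion=1.2):
    """Schematic alternation between level-set expansion and inner optimization.
    Inner objective: f(x) = 0.5 * ||Ax - b||^2; outer objective: ||x||_1.
    The expansion factor and iteration counts are illustrative placeholders."""
    grad_f = lambda x: A.T @ (A @ x - b)
    x = np.zeros(A.shape[1])
    tau = tau0
    for _ in range(n_outer):
        x = frank_wolfe_l1_ball(grad_f, x, tau, n_inner)
        tau *= expansion                  # placeholder level-set expansion rule
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    b = A @ x_true
    x_hat = italex_style_sketch(A, b)
    print("inner objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2)
    print("outer objective (l1 norm):", np.linalg.norm(x_hat, 1))

The point of the sketch is only to make the alternation concrete: each outer round optimizes the inner problem restricted to the current level-set of the outer function, and the level-set is then enlarged before the next round.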

Citations (8)
