
An iterative regularized mirror descent method for ill-posed nondifferentiable stochastic optimization (1901.09506v2)

Published 28 Jan 2019 in math.OC

Abstract: A wide range of applications arising in machine learning and signal processing can be cast as convex optimization problems. These problems are often ill-posed, i.e., the optimal solution lacks a desired property such as uniqueness or sparsity. In the literature, to address ill-posedness, a bilevel optimization problem is considered where the goal is to find, among the optimal solutions of the inner level problem, a solution that minimizes a secondary metric, i.e., the outer level objective function. In addressing the resulting bilevel model, the convergence analysis of most existing methods is limited to the case where both inner and outer level objectives are differentiable deterministic functions. While these assumptions may not hold in big data applications, to the best of our knowledge, no solution method equipped with complexity analysis exists to address the presence of uncertainty and nondifferentiability in both levels of this class of problems. Motivated by this gap, we develop a first-order method called Iterative Regularized Stochastic Mirror Descent (IR-SMD). We establish the global convergence of the iterates generated by the algorithm to the optimal solution of the bilevel problem in an almost sure and a mean sense. We derive a convergence rate of ${\cal O}\left(1/N^{0.5-\delta}\right)$ for the inner level problem, where $\delta>0$ is an arbitrarily small scalar. Numerical experiments for solving two classes of bilevel problems, including a large-scale binary text classification application, are presented.
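The abstract does not spell out the exact update rule, but the general pattern of iterative regularization for bilevel problems is to take stochastic (sub)gradient steps on the inner objective plus a vanishing multiple of the outer objective. The sketch below is only illustrative: it assumes a Euclidean mirror map (so the mirror step reduces to a plain subgradient step), and the function names (`ir_smd`, `inner_subgrad`, `outer_subgrad`) and the step-size and regularization schedules are hypothetical choices, not the paper's specification.

```python
import numpy as np


def ir_smd(inner_subgrad, outer_subgrad, x0, n_iters=20000,
           gamma0=1.0, lambda0=0.1, a=0.5, b=0.25, seed=0):
    """Illustrative iteratively regularized stochastic subgradient sketch.

    At iteration k, step along a stochastic subgradient of
    f(x) + lambda_k * g(x), where f is the inner-level objective and
    g is the outer-level (selection) objective. The step size gamma_k
    decays like 1/k**a while the regularization weight lambda_k decays
    more slowly, like 1/k**b. With a Euclidean mirror map the mirror
    step reduces to this plain subgradient step.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iters + 1):
        gamma_k = gamma0 / k**a          # diminishing step size
        lam_k = lambda0 / k**b           # diminishing regularization weight
        d = inner_subgrad(x, rng) + lam_k * outer_subgrad(x, rng)
        x = x - gamma_k * d              # Euclidean (identity) mirror step
    return x


# Toy ill-posed instance: minimize 0.5*||A x - b||^2 where A = [1, 1] has a
# nontrivial null space, so the inner problem has infinitely many solutions.
# The outer objective 0.5*||x||^2 selects the minimum-norm solution (1, 1).
A = np.array([[1.0, 1.0]])
b_vec = np.array([2.0])

def inner_subgrad(x, rng):
    # Noisy gradient of the inner objective 0.5*||A x - b||^2.
    return A.T @ (A @ x - b_vec) + 0.01 * rng.standard_normal(x.shape)

def outer_subgrad(x, rng):
    # Gradient of the outer objective 0.5*||x||^2.
    return x

x_hat = ir_smd(inner_subgrad, outer_subgrad, x0=np.zeros(2))
print(x_hat)  # expected to be close to [1.0, 1.0]
```

The key design choice this sketch illustrates is the two-timescale decay: the regularization weight must vanish more slowly than the step size, so that the iterates first approach the inner solution set and the outer objective then steers them toward the preferred solution within that set.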
