
A Randomized Block Coordinate Iterative Regularized Gradient Method for High-dimensional Ill-posed Convex Optimization (1809.10035v1)

Published 26 Sep 2018 in math.OC

Abstract: Motivated by high-dimensional nonlinear optimization problems, as well as ill-posed optimization problems arising in image processing, we consider a bilevel optimization model in which we seek, among the optimal solutions of the inner-level problem, a solution that minimizes a secondary metric. Our goal is to address the high dimensionality of the bilevel problem and the nondifferentiability of the objective function. Minimal norm gradient, sequential averaging, and iterative regularization are among the recent schemes developed for addressing the bilevel problem, but none of them handles both the high-dimensional structure and the nondifferentiability. To address this gap in the literature, we develop a randomized block coordinate iterative regularized gradient descent scheme (RB-IRG). We establish the convergence of the sequence generated by RB-IRG to the unique solution of the bilevel problem of interest. Furthermore, we derive a rate of convergence $\mathcal{O}\left(\frac{1}{k^{0.5-\delta}}\right)$ with respect to the inner-level objective function. We demonstrate the performance of RB-IRG in solving ill-posed problems arising in image processing.
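
To make the high-level description concrete, below is a minimal NumPy sketch of the iterative-regularization idea the abstract describes: at each iteration a single randomly sampled coordinate block is updated with a gradient step on the inner objective plus a vanishing regularizer, steering the iterates toward a minimal-norm solution. The names (rb_irg, grad_f_block), the choice of secondary metric $\omega(x) = \tfrac{1}{2}\|x\|^2$, and the step-size and regularization decay rates are illustrative assumptions, not the paper's exact scheme or tuning.

```python
import numpy as np

def rb_irg(grad_f_block, x0, n_blocks, block_size, n_iters=20_000,
           gamma0=1.0, lambda0=1.0, a=0.5, b=0.25, seed=0):
    """Sketch of an RB-IRG loop: sample one coordinate block, then take a
    gradient step on f + lambda_k * omega restricted to that block, with
    both the step size and the regularization parameter decaying to zero."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float)
    for k in range(1, n_iters + 1):
        gamma_k = gamma0 / k**a          # step size -> 0
        lam_k = lambda0 / k**b           # regularization -> 0 (more slowly)
        i = rng.integers(n_blocks)       # block sampled uniformly at random
        sl = slice(i * block_size, (i + 1) * block_size)
        # block gradient of the regularized objective f(x) + lam_k * ||x||^2 / 2
        g = grad_f_block(x, sl) + lam_k * x[sl]
        x[sl] -= gamma_k * g
    return x

# Toy ill-posed inner problem: min_x ||A x - b||^2 with more unknowns than
# equations, so the minimizer is not unique; the vanishing regularization
# biases the iterates toward the minimal-norm solution.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 200))
b = A @ rng.standard_normal(200)
grad_f_block = lambda x, sl: 2.0 * A[:, sl].T @ (A @ x - b)
x_hat = rb_irg(grad_f_block, np.zeros(200), n_blocks=10, block_size=20)
```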
