Variational Regularization Theory Based on Image Space Approximation Rates
Abstract: We present a new approach to convergence rate results for variational regularization. Avoiding Bregman distances and using image space approximation rates as source conditions, we prove a nearly minimax theorem showing that the modulus of continuity is, up to a constant, an upper bound on the reconstruction error. Applied to Besov space regularization, we obtain convergence rate results for $0,2,q$- and $0,p,p$-penalties without restrictions on $p,q\in (1,\infty)$. Finally, we prove the equivalence of H\"older-type variational source conditions, bounds on the defect of the Tikhonov functional, and image space approximation rates.