Variational Convergence Analysis With Smoothed-TV Interpretation (1511.05040v1)
Abstract: We consider the problem of minimizing the least squares functional with a Fr\'echet differentiable, lower semi-continuous, convex penalizer $J$. The penalizer maps the functions of a Banach space $\mathcal{V}$ into $\mathbb{R}_{+}$, $J : \mathcal{V} \rightarrow \mathbb{R}_{+}$. It is assumed that the given data $f^{\delta}$ is defined on a compact domain $\mathcal{G} \subset \mathbb{R}_{+}$ and belongs to the Hilbert space $\mathcal{L}^{2}(\mathcal{G})$, $f^{\delta} \in \mathcal{L}^{2}(\mathcal{G})$. The general Tikhonov functional associated with a given linear, compact and injective forward operator $\mathcal{T} : \mathcal{V} \rightarrow \mathcal{L}^{2}(\mathcal{G})$ is then formulated as \begin{eqnarray} F_{\alpha}(\varphi, f^{\delta}) : & \mathcal{V} \times \mathcal{L}^{2}(\mathcal{G}) & \rightarrow \mathbb{R}_{+} \nonumber \\ & (\varphi, f^{\delta}) & \mapsto F_{\alpha}(\varphi, f^{\delta}) := \frac{1}{2}\Vert\mathcal{T}\varphi - f^{\delta}\Vert_{\mathcal{L}^{2}(\mathcal{G})}^{2} + \alpha J(\varphi). \nonumber \end{eqnarray} Convergence of the regularized solution $\varphi_{\alpha(\delta)} \in \mathrm{argmin}_{\varphi \in \mathcal{V}} F_{\alpha}(\varphi, f^{\delta})$ to the true solution $\varphi^{\dagger}$ is analysed by means of the Bregman divergence. The first part of this work provides a general convergence analysis for a strongly convex functional $J$ in the cost functional $F_{\alpha}$; the key observation there is that strong convexity of the penalty term $J$, via its convexity modulus, implies norm convergence in the Bregman metric sense. In the second part, this general analysis is interpreted for the smoothed-TV functional. The result of this work is applicable to any strongly convex penalty functional.
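For orientation, the Bregman divergence used in such convergence analyses is typically defined, for a Fr\'echet differentiable convex $J$, as $D_{J}(\varphi, \varphi^{\dagger}) := J(\varphi) - J(\varphi^{\dagger}) - \langle J'(\varphi^{\dagger}), \varphi - \varphi^{\dagger} \rangle$, and the smoothed-TV penalty is commonly parameterized as $J_{\beta}(\varphi) := \int_{\mathcal{G}} \sqrt{\vert\nabla\varphi\vert^{2} + \beta}\,\mathrm{d}x$ with smoothing parameter $\beta > 0$; the paper's exact parameterization may differ. The sketch below (not taken from the paper) illustrates minimizing a discrete one-dimensional analogue of $F_{\alpha}$ with this smoothed-TV penalty by plain gradient descent; the operator $\mathcal{T}$, the data $f^{\delta}$, and all parameter values are illustrative assumptions.

```python
# Minimal sketch, not the paper's algorithm: gradient descent on a discrete
# 1-D analogue of F_alpha(phi) = 0.5*||T phi - f_delta||^2 + alpha*J_beta(phi),
# with smoothed-TV penalty J_beta(phi) = sum_i sqrt((D phi)_i^2 + beta).
import numpy as np

n = 200
x = np.linspace(0.0, 1.0, n)
phi_true = (x > 0.3).astype(float) - 0.5 * (x > 0.7)  # piecewise-constant truth

# Illustrative forward operator T: a normalized Gaussian blur (linear, smoothing)
idx = np.arange(n)
K = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 5.0) ** 2)
T = K / K.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
f_delta = T @ phi_true + 0.01 * rng.standard_normal(n)  # noisy data f^delta

D = np.diff(np.eye(n), axis=0)       # forward-difference matrix, shape (n-1, n)
alpha, beta, step = 1e-3, 1e-4, 0.5  # illustrative regularization/smoothing/step

phi = np.zeros(n)
for _ in range(5000):
    residual = T @ phi - f_delta
    g = D @ phi
    grad_J = D.T @ (g / np.sqrt(g ** 2 + beta))  # gradient of smoothed TV
    phi -= step * (T.T @ residual + alpha * grad_J)

print("relative error:", np.linalg.norm(phi - phi_true) / np.linalg.norm(phi_true))
```

The smoothing parameter $\beta > 0$ is what makes plain gradient descent applicable here; for the exact, non-smooth TV functional one would instead use a proximal or primal-dual scheme.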