Combining Strong Convergence, Values Fast Convergence and Vanishing of Gradients for a Proximal Point Algorithm Using Tikhonov Regularization in a Hilbert Space
Abstract: Let $\mathcal{H}$ be a real Hilbert space and let $f$ be a convex differentiable function whose solution set $\argmin_{\mathcal{H}} f$ is nonempty. We consider the proximal algorithm $x_{k+1}=\text{prox}_{\beta_k f}(d\, x_k)$, where $0<d<1$ and $(\beta_k)$ is a nondecreasing sequence. Under suitable assumptions on $(\beta_k)$, we show that the objective function values along the generated sequence converge to the global minimum of $f$ at the rate $\mathcal{O}\left(\frac{1}{\beta_k}\right)$, and that the sequence converges strongly to the minimum-norm element of $\argmin_{\mathcal{H}} f$; we also obtain a convergence rate for the gradient toward zero. Afterward, we extend these results to nonsmooth convex functions with extended real values.
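To make the iteration concrete, here is a minimal numerical sketch, not the authors' implementation: it assumes the specific quadratic $f(x)=\frac{1}{2}\|Ax-c\|^2$ with a rank-deficient $A$ (so that $\argmin_{\mathcal{H}} f$ is an affine subspace with a distinguished minimum-norm element $A^{+}c$) and the illustrative nondecreasing choice $\beta_k = k+1$. For this $f$ the proximal map has the closed form $\text{prox}_{\beta f}(y)=(I+\beta A^\top A)^{-1}(y+\beta A^\top c)$, so the iterate can be compared directly against the pseudoinverse solution.

```python
import numpy as np

# Illustrative sketch (an assumption, not the paper's code) of the iteration
#   x_{k+1} = prox_{beta_k f}(d * x_k),  0 < d < 1,  (beta_k) nondecreasing,
# for f(x) = 0.5 * ||A x - c||^2 with a rank-deficient A. Here argmin f is an
# affine subspace whose minimum-norm element is pinv(A) @ c, and the proximal
# map has the closed form (I + beta A^T A)^{-1} (y + beta A^T c).

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))          # 3 x 5: rank-deficient least squares
c = rng.standard_normal(3)
AtA, Atc = A.T @ A, A.T @ c
I = np.eye(5)

def prox_f(y, beta):
    """Proximal map of f(x) = 0.5 * ||A x - c||^2 with parameter beta."""
    return np.linalg.solve(I + beta * AtA, y + beta * Atc)

x = np.zeros(5)
d = 0.9                                  # Tikhonov-type shrinkage factor, 0 < d < 1
for k in range(500):
    beta_k = 1.0 + k                     # a simple nondecreasing choice, beta_k -> infinity
    x = prox_f(d * x, beta_k)

x_min_norm = np.linalg.pinv(A) @ c       # minimum-norm element of argmin f
print(np.linalg.norm(x - x_min_norm))    # small: iterates approach the min-norm minimizer
```

Heuristically, as $\beta_k \to \infty$ the proximal step approaches the projection onto $\argmin f$, while the factor $d<1$ geometrically shrinks the component of the iterate lying in the null space of $A$, which is why the limit singled out in this sketch is the minimum-norm solution, matching the strong convergence claim of the abstract.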