
Combining Strong Convergence, Values Fast Convergence and Vanishing of Gradients for a Proximal Point Algorithm Using Tikhonov Regularization in a Hilbert Space

Published 22 Sep 2023 in math.OC | (2309.13200v1)

Abstract: Let $\mathcal{H}$ be a real Hilbert space and let $f$ be a convex differentiable function whose solution set $\argmin_{\mathcal{H}} f$ is nonempty. We consider the proximal algorithm $x_{k+1}=\mathrm{prox}_{\beta_k f}(d\, x_k)$, where $0<d<1$ and $(\beta_k)$ is a nondecreasing sequence. Under suitable assumptions on $(\beta_k)$, we show that the objective values along the generated sequence converge to the global minimum of $f$ at the rate $\mathcal{O}\!\left(\frac{1}{\beta_k}\right)$, that the sequence converges strongly to the minimum-norm element of $\argmin_{\mathcal{H}} f$, and that the gradient converges to zero at a quantified rate. We then extend these results to nonsmooth convex functions with extended real values.
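The iteration in the abstract can be sketched numerically. The example below is an illustration only, not the paper's own experiment: it takes the hypothetical objective $f(x)=\tfrac12(x_1-x_2)^2$ on $\mathbb{R}^2$, whose minimizer set is the whole line $\{x_1=x_2\}$ with minimum-norm element $(0,0)$, so strong convergence to the minimum-norm minimizer is visible. The closed-form prox and the choice $\beta_k = 1+k$ are assumptions made for the sketch; the paper only requires $(\beta_k)$ nondecreasing with suitable growth.

```python
# Sketch of x_{k+1} = prox_{beta_k f}(d * x_k) with 0 < d < 1 and
# nondecreasing beta_k, for the illustrative objective
# f(x) = 0.5*(x1 - x2)^2 on R^2 (argmin f = {x1 = x2}, min-norm element (0,0)).

def prox(beta, v):
    """Closed-form prox_{beta f}(v) for f(x) = 0.5*(x1 - x2)^2.

    From the optimality conditions: the sum v1+v2 is unchanged,
    the difference v1-v2 is shrunk by 1/(1 + 2*beta).
    """
    s = v[0] + v[1]
    t = (v[0] - v[1]) / (1.0 + 2.0 * beta)
    return ((s + t) / 2.0, (s - t) / 2.0)

def proximal_tikhonov(x0, d=0.9, iters=200):
    """Run the regularized proximal iteration with an assumed beta_k = 1 + k."""
    x = x0
    for k in range(iters):
        beta_k = 1.0 + k                      # nondecreasing, as the paper assumes
        x = prox(beta_k, (d * x[0], d * x[1]))  # the factor d < 1 drives min-norm selection
    return x

x = proximal_tikhonov((3.0, -1.0))
print(x)  # both coordinates shrink toward 0, the minimum-norm minimizer
```

The contraction $d<1$ is what plays the Tikhonov role here: without it the iterates would only approach the line $\{x_1=x_2\}$, while with it they are steered to the minimum-norm point of that set.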


Authors (3)
