
A dynamic approach to a proximal-Newton method for monotone inclusions in Hilbert spaces, with complexity O(1/n^2) (1502.04286v2)

Published 15 Feb 2015 in math.OC

Abstract: In a Hilbert setting, we introduce a new dynamical system and associated algorithms for solving monotone inclusions by rapid methods. Given a maximal monotone operator $A$, the evolution is governed by the time-dependent operator $I - (I + \lambda(t) A)^{-1}$, where the positive control parameter $\lambda(t)$ tends to infinity as $t \to +\infty$. The tuning of $\lambda(\cdot)$ is done in a closed-loop way, by resolution of the algebraic equation $\lambda \|(I + \lambda A)^{-1}x - x\| = \theta$, where $\theta$ is a given positive constant. The existence and uniqueness of a strong global solution of the Cauchy problem follows from the Cauchy-Lipschitz theorem. We prove the weak convergence of the trajectories to equilibria, and superlinear convergence under an error bound condition. When $A = \partial f$ is the subdifferential of a closed convex function $f$, we show an $O(1/t^2)$ convergence property of $f(x(t))$ to the infimal value of the problem. Then, we introduce proximal-like algorithms which can be obtained by time discretization of the continuous dynamic, and which share the same fast convergence properties. As distinctive features, we allow a relative error tolerance for the solution of the proximal subproblem, similar to the ones proposed in \cite{So-Sv1, So-Sv2}, and a large-step condition, as proposed in \cite{MS1, MS2}. For general convex minimization problems, the complexity is $O(1/n^2)$. In the regular case, we show the global quadratic convergence of an associated proximal-Newton method.
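
To make the closed-loop rule concrete, here is a minimal numerical sketch (not the authors' code) of the discretized iteration $x_{n+1} = (I + \lambda_n A)^{-1} x_n$ for the particular choice $A = \partial f$ with $f(x) = \|x\|_1$ on $\mathbb{R}^d$, whose resolvent is componentwise soft-thresholding. Each step size $\lambda_n$ is obtained by bisection on the algebraic equation $\lambda \|(I + \lambda A)^{-1}x - x\| = \theta$ from the abstract, using the fact that the left-hand side is nondecreasing in $\lambda$. The function names (prox_l1, solve_lambda), the bisection tolerances, and the test data are illustrative assumptions, not taken from the paper.

# Sketch only: closed-loop proximal iteration for A = the subdifferential of the l1 norm.
import numpy as np

def prox_l1(x, lam):
    # Resolvent (I + lam*A)^{-1} for A = subdifferential of ||.||_1:
    # componentwise soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def residual(x, lam):
    # phi(lam) = lam * ||(I + lam*A)^{-1} x - x||, nondecreasing in lam.
    return lam * np.linalg.norm(prox_l1(x, lam) - x)

def solve_lambda(x, theta, lam_lo=1e-8, lam_hi=1.0, iters=60):
    # Bisection so that residual(x, lam) is approximately theta (assumes x != 0).
    while residual(x, lam_hi) < theta:   # grow the bracket until it contains a root
        lam_hi *= 2.0
    for _ in range(iters):
        mid = 0.5 * (lam_lo + lam_hi)
        if residual(x, mid) < theta:
            lam_lo = mid
        else:
            lam_hi = mid
    return 0.5 * (lam_lo + lam_hi)

# Closed-loop proximal iteration x_{n+1} = (I + lam_n*A)^{-1} x_n.
x = np.array([3.0, -2.0, 0.5])   # illustrative starting point
theta = 0.5                      # the positive constant in the closed-loop equation
for n in range(20):
    if np.linalg.norm(x) < 1e-12:  # x is already a zero of A, stop
        break
    lam = solve_lambda(x, theta)
    x = prox_l1(x, lam)
    print(n, lam, x)

In this toy run the computed $\lambda_n$ grows as the iterate approaches the solution set, which is the qualitative behavior of the large-step condition described in the abstract; the paper itself works with general maximal monotone operators and with inexact resolvent evaluations, which this sketch does not cover.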
