
A Nesterov's Accelerated Projected Gradient Method for Monotone Variational Inequalities (2212.08346v2)

Published 16 Dec 2022 in math.OC and math.DS

Abstract: In this technical note, we are concerned with solving variational inequalities with improved convergence rates. Motivated by Nesterov's accelerated gradient method for convex optimization, we propose a Nesterov-accelerated projected gradient algorithm for variational inequality problems. We prove convergence of the proposed algorithm at an at least linear rate under the common assumptions of Lipschitz continuity and strong monotonicity. To the best of our knowledge, this is the first time that convergence of Nesterov's acceleration scheme has been proved for variational inequalities, beyond convex optimization and monotone inclusion problems. Simulation results demonstrate that the proposed algorithm outperforms the well-known projected gradient approach, the reflected projected gradient approach, and the golden ratio method. It is shown that the number of iterations required to reach the solution is greatly reduced by our proposed algorithm.
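The abstract does not state the update rule itself, but the classical Nesterov-accelerated projected gradient iteration it builds on has a standard form: a projected operator step at an extrapolated point, x_{k+1} = P_C(y_k - alpha * F(y_k)), followed by a momentum step, y_{k+1} = x_{k+1} + beta * (x_{k+1} - x_k). Below is a minimal Python sketch assuming that textbook form; the function names (accelerated_projected_gradient, F, project), the constant step size alpha, and the fixed momentum weight beta are all illustrative assumptions, not the paper's actual parameter schedule or convergence conditions.

```python
import numpy as np

def accelerated_projected_gradient(F, project, x0, alpha=0.1, beta=0.5,
                                   tol=1e-8, max_iter=10_000):
    """Sketch of a Nesterov-style accelerated projected gradient method
    for the variational inequality: find x* in C such that
    <F(x*), x - x*> >= 0 for all x in C.

    F       : monotone operator, maps R^n -> R^n
    project : Euclidean projection onto the feasible set C
    alpha   : step size (should scale like 1/L for L-Lipschitz F)
    beta    : momentum weight (illustrative constant; the paper's
              parameter choice may differ)
    """
    x_prev = x0
    y = x0
    for _ in range(max_iter):
        # Projected operator step at the extrapolated point y_k.
        x = project(y - alpha * F(y))
        # Nesterov extrapolation (momentum) step.
        y = x + beta * (x - x_prev)
        if np.linalg.norm(x - x_prev) < tol:
            return x
        x_prev = x
    return x_prev


# Example: strongly monotone affine VI with F(x) = A x + b on the box [0, 1]^n.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)   # positive definite => F is strongly monotone
    b = rng.standard_normal(n)

    F = lambda x: A @ x + b
    project = lambda x: np.clip(x, 0.0, 1.0)
    L = np.linalg.norm(A, 2)      # Lipschitz constant of the affine operator

    sol = accelerated_projected_gradient(F, project, np.zeros(n), alpha=1.0 / L)
    print("approximate VI solution:", sol)
```

Unlike the convex-optimization setting, a fixed momentum weight is not guaranteed to converge for general monotone operators; the paper's contribution is precisely a linear-rate guarantee under Lipschitz continuity and strong monotonicity, which the specific parameter choices above do not reproduce.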
