Mirror frameworks for relatively Lipschitz and monotone-like variational inequalities
Abstract: Nonconvex-nonconcave saddle-point optimization in machine learning has spurred extensive research on non-monotone variational inequalities (VIs). In this work, we introduce two mirror frameworks, the mirror extragradient method and the mirror extrapolation method, for approximating solutions to relatively Lipschitz and monotone-like VIs. The former covers Nemirovski's well-known mirror prox method, Nesterov's dual extrapolation method, and the recently proposed Bregman extragradient method; all of these can be reformulated into a scheme closely resembling the original extragradient method. The latter includes the operator extrapolation method and the Bregman extrapolation method as special cases. The proposed mirror frameworks allow us to present a unified and improved convergence analysis for all of these existing methods under relative Lipschitzness and monotone-like conditions, which may currently be the weakest assumptions available.
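For orientation, the two update templates that these frameworks generalize can be sketched as follows. This is the standard mirror-prox-style extragradient step and the standard operator-extrapolation step as they appear in the literature, not the paper's own schemes; the notation ($F$ for the VI operator, $X$ for the feasible set, $D_h$ for the Bregman divergence of a distance-generating function $h$, step size $\gamma > 0$, extrapolation weight $\lambda \ge 0$) is chosen here purely for illustration.

```latex
% Bregman (mirror-prox style) extragradient step: two prox evaluations
% per iteration, the second using the operator evaluated at the
% intermediate point y_k.
\begin{align*}
  y_k     &= \operatorname*{arg\,min}_{y \in X}
             \big\{ \gamma \langle F(x_k), y \rangle + D_h(y, x_k) \big\}, \\
  x_{k+1} &= \operatorname*{arg\,min}_{x \in X}
             \big\{ \gamma \langle F(y_k), x \rangle + D_h(x, x_k) \big\}.
\end{align*}

% Operator-extrapolation style step: a single prox evaluation per
% iteration, with the operator extrapolated using the previous iterate.
\begin{align*}
  x_{k+1} &= \operatorname*{arg\,min}_{x \in X}
             \big\{ \gamma \big\langle F(x_k) + \lambda\,\big(F(x_k) - F(x_{k-1})\big),\, x \big\rangle
                    + D_h(x, x_k) \big\}.
\end{align*}
```

The first template corresponds to the extragradient-like scheme that the mirror extragradient framework is said to unify, while the second corresponds to the single-call pattern behind the operator and Bregman extrapolation methods covered by the mirror extrapolation framework.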