Variance-based stochastic extragradient methods with line search for stochastic variational inequalities (1703.00262v3)
Abstract: A dynamic sampled stochastic approximation (DS-SA) extragradient method for stochastic variational inequalities (SVI) is proposed that is \emph{robust} with respect to an unknown Lipschitz constant $L$. To the best of our knowledge, it is the first provably convergent \emph{robust} SA \emph{method with variance reduction}, either for SVIs or for stochastic optimization, assuming just an unbiased stochastic oracle in a large-sample regime. This widens the applicability and improves, up to constants, the desired efficient acceleration of previous variance-reduction methods, all of which still assume knowledge of $L$ (and, hence, are not robust to errors in its estimate). Precisely, compared to the iteration and oracle complexities of $\mathcal{O}(\epsilon^{-2})$ of previous robust methods with a small-stepsize policy, our robust method obtains the faster iteration complexity of $\mathcal{O}(\epsilon^{-1})$ with an oracle complexity of $(\ln L)\mathcal{O}(d\epsilon^{-2})$ (up to logs). This matches, up to constants, the sample complexity of the sample average approximation estimator, which does not assume additional problem information (such as $L$). Differently from previous robust methods for ill-conditioned problems, we allow an unbounded feasible set and an oracle with multiplicative noise (MN) whose variance is not necessarily uniformly bounded. These properties are reflected in our complexity estimates, which depend only on $L$ and on local second or fourth moments at solutions. The robustness and variance-reduction properties of our DS-SA line search scheme come at the expense of nonmartingale-like dependencies (NMD), due to the inner statistical estimation of a lower bound for $L$ that the line search requires. To handle NMD and MN, our proofs rely on a novel localization argument based on empirical process theory. We also propose another robust method for SVIs over the wider class of H\"older continuous operators.
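The abstract describes an extragradient scheme whose line search adapts the stepsize to an unknown $L$ and whose variance reduction comes from averaging oracle calls over batches. The sketch below illustrates the general shape of one such step; it is a minimal illustration, not the paper's DS-SA method (which, among other things, grows the batch size dynamically to reach the stated $\mathcal{O}(\epsilon^{-1})$ iteration complexity). The names `oracle`, `project`, and `batch_size`, and the backtracking constants `alpha0`, `theta`, `delta`, are assumptions made for this example.

```python
import numpy as np

def sample_average(oracle, x, batch_size, rng):
    """Variance-reduced operator estimate: average of i.i.d. oracle calls."""
    return np.mean([oracle(x, rng) for _ in range(batch_size)], axis=0)

def extragradient_step(x, oracle, project, batch_size, rng,
                       alpha0=1.0, theta=0.5, delta=0.9, max_backtracks=50):
    """One stochastic extragradient step with a backtracking line search.

    The stepsize alpha is shrunk until the empirical Lipschitz-type test
    alpha * ||F(z) - F(x)|| <= delta * ||z - x|| holds, so no a priori
    knowledge of L is needed (constants here are hypothetical choices).
    """
    Fx = sample_average(oracle, x, batch_size, rng)
    alpha = alpha0
    for _ in range(max_backtracks):
        z = project(x - alpha * Fx)              # extrapolation point
        Fz = sample_average(oracle, z, batch_size, rng)
        if alpha * np.linalg.norm(Fz - Fx) <= delta * np.linalg.norm(z - x):
            break
        alpha *= theta                           # backtrack: shrink stepsize
    return project(x - alpha * Fz)               # extragradient update

# Example usage on a monotone affine VI F(x) = A @ x with additive noise,
# over the box [-1, 1]^d (the solution is x* = 0 since F(0) = 0):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 5
    M = rng.standard_normal((d, d))
    A = M @ M.T + np.eye(d)                      # positive definite => monotone
    oracle = lambda x, rng: A @ x + 0.1 * rng.standard_normal(d)
    project = lambda y: np.clip(y, -1.0, 1.0)    # Euclidean projection onto box
    x = np.ones(d)
    for _ in range(200):
        x = extragradient_step(x, oracle, project, batch_size=32, rng=rng)
    print(x)                                     # approaches the solution x* = 0
```

Because the accepted stepsize depends on oracle samples drawn inside the line search, the iterates lose the martingale structure that standard SA analyses exploit; this is the NMD issue the abstract says is handled via a localization argument from empirical process theory.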