
Accelerating Stochastic Recursive and Semi-stochastic Gradient Methods with Adaptive Barzilai-Borwein Step Sizes (2307.13930v2)

Published 26 Jul 2023 in math.OC

Abstract: The mini-batch versions of the StochAstic Recursive grAdient algoritHm (SARAH) and the Semi-Stochastic Gradient Descent (S2GD) method equipped with random Barzilai-Borwein step sizes (abbreviated MB-SARAH-RBB and mS2GD-RBB) have gained prominence through their timely step size sequences. Inspired by modern adaptive methods and variance reduction techniques, we propose two new step size rules in this paper, referred to as RHBB and RHBB+, leading to four algorithms: MB-SARAH-RHBB, MB-SARAH-RHBB+, mS2GD-RHBB and mS2GD-RHBB+. RHBB+ is an enhanced version that additionally incorporates the importance sampling technique. These methods are aggressive in their updates, robust in performance, and self-adaptive across iterations. We analyze the flexible convergence structures and the corresponding complexity bounds in the strongly convex case. Comprehensive tuning guidance is provided theoretically for reference in practical implementations. Experiments show that the proposed methods consistently outperform the original and various state-of-the-art methods on frequently tested data sets. In particular, tests on RHBB+ verify the efficacy of applying the importance sampling technique at the step size level. Numerous further experiments demonstrate the promising scalability of our adaptive step size rules.
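For orientation, the sketch below shows how a Barzilai-Borwein (BB) step size can drive a mini-batch SARAH loop. It uses the classic BB1 rule computed from full gradients at the start of each outer loop, scaled by the inner loop length, on a synthetic least-squares problem; the problem data, the scaling choice, and the use of full gradients are illustrative assumptions, and the paper's RHBB and RHBB+ rules (and their randomized/importance-sampled variants) are refinements that are not reproduced here.

```python
# Minimal sketch: mini-batch SARAH with a BB1 step size on least squares.
# Assumptions (not from the paper): synthetic data, full-gradient BB rule,
# step size scaled by the inner loop length m as in SVRG-BB-style methods.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def full_grad(w):
    # Gradient of f(w) = (1/2n) * ||A w - b||^2
    return A.T @ (A @ w - b) / n

def stoch_grad(w, idx):
    # Mini-batch gradient over the sampled indices idx
    return A[idx].T @ (A[idx] @ w - b[idx]) / len(idx)

w = np.zeros(d)
w_prev, g_prev = None, None
eta, m, batch = 0.1, 50, 8  # initial step size, inner loop length, batch size

for s in range(30):  # outer loops
    g = full_grad(w)
    if w_prev is not None:
        # BB1 step size from successive outer iterates and full gradients:
        # eta = ||s_k||^2 / (s_k^T y_k), with a guard against tiny curvature
        sk, yk = w - w_prev, g - g_prev
        denom = sk @ yk
        if denom > 1e-12:
            eta = (sk @ sk) / denom / m  # scaling by m is a common choice
    w_prev, g_prev = w.copy(), g.copy()

    # SARAH inner loop: recursive variance-reduced gradient estimator
    v, w_old = g, w.copy()
    w = w - eta * v
    for t in range(m - 1):
        idx = rng.choice(n, size=batch, replace=False)
        v = stoch_grad(w, idx) - stoch_grad(w_old, idx) + v
        w_old = w.copy()
        w = w - eta * v

print("final gradient norm:", np.linalg.norm(full_grad(w)))
```

The key design point this illustrates is that the step size is not tuned by hand: each outer loop re-estimates local curvature from the displacement and gradient difference, which is the mechanism the RHBB/RHBB+ rules adapt and randomize.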
