Accelerating Stochastic Recursive and Semi-stochastic Gradient Methods with Adaptive Barzilai-Borwein Step Sizes (2307.13930v2)
Abstract: The mini-batch versions of the StochAstic Recursive grAdient algoritHm and the Semi-Stochastic Gradient Descent method that employ random Barzilai-Borwein step sizes (abbreviated MB-SARAH-RBB and mS2GD-RBB) have gained prominence through their adaptive step size sequences. Inspired by modern adaptors and variance reduction techniques, we propose two new step size rules, referred to as RHBB and RHBB+, yielding four algorithms: MB-SARAH-RHBB, MB-SARAH-RHBB+, mS2GD-RHBB, and mS2GD-RHBB+. RHBB+ is an enhanced version that additionally incorporates the importance sampling technique. The resulting methods are aggressive in their updates, robust in performance, and self-adaptive across iterations. We analyze the flexible convergence structures and the corresponding complexity bounds in the strongly convex case. Comprehensive, theoretically grounded tuning guidance is provided for practical implementations. Experiments show that the proposed methods consistently outperform the original and various state-of-the-art methods on commonly used benchmark data sets. In particular, experiments with RHBB+ verify the efficacy of applying importance sampling at the step size level. Extensive explorations demonstrate the promising scalability of our iterative adaptors.
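To make the ingredients concrete, below is a minimal sketch of a mini-batch SARAH outer loop with a random Barzilai-Borwein (BB) step size, in the spirit of MB-SARAH-RBB. This is not the paper's RHBB/RHBB+ rule; the objective (l2-regularized logistic regression), all function names, and all parameter values are illustrative assumptions.

```python
import numpy as np

def batch_grad(w, X, y, lam, idx):
    # Mini-batch gradient of the assumed l2-regularized logistic loss.
    Xb, yb = X[idx], y[idx]
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    return Xb.T @ (p - yb) / len(idx) + lam * w

def mb_sarah_bb(X, y, lam=1e-2, outer=15, inner=100, batch=16,
                eta0=0.05, seed=0):
    # Hypothetical driver: outer snapshots with full gradients, SARAH
    # recursive inner updates, and a BB step size recomputed per epoch.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    eta, w_old = eta0, None
    for s in range(outer):
        g_full = batch_grad(w, X, y, lam, np.arange(n))  # full gradient snapshot
        if w_old is not None:
            # "Random BB" flavor: the curvature pair (s_k, y_k) uses gradients
            # evaluated on the same randomly sampled mini-batch; the 1/inner
            # scaling follows the SVRG-BB convention (an assumption here).
            idx = rng.choice(n, batch, replace=False)
            s_k = w - w_old
            y_k = batch_grad(w, X, y, lam, idx) - batch_grad(w_old, X, y, lam, idx)
            eta = (s_k @ s_k) / (inner * abs(s_k @ y_k) + 1e-12)
        w_old, v, u_prev = w.copy(), g_full.copy(), w.copy()
        for _ in range(inner):
            # SARAH recursion: step with the current estimator, then refresh it
            # from the gradient difference at consecutive iterates.
            u = u_prev - eta * v
            idx = rng.choice(n, batch, replace=False)
            v = batch_grad(u, X, y, lam, idx) - batch_grad(u_prev, X, y, lam, idx) + v
            u_prev = u
        w = u_prev  # one common choice: take the last inner iterate
    return w
```

The RHBB and RHBB+ rules of the paper refine how this per-epoch step size is formed (RHBB+ additionally biasing the batch sampling via importance sampling), but the overall snapshot/inner-loop structure matches the sketch above.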