Cutting Some Slack for SGD with Adaptive Polyak Stepsizes (2202.12328v2)

Published 24 Feb 2022 in cs.LG and math.OC

Abstract: Tuning the step size of stochastic gradient descent is tedious and error-prone, which has motivated the development of methods that automatically adapt the step size using readily available information. In this paper, we consider the family of SPS (Stochastic gradient with a Polyak Stepsize) adaptive methods: methods that use the gradient and loss value at the sampled points to adaptively adjust the step size. We first show that SPS and its recent variants can all be seen as extensions of the Passive-Aggressive methods applied to nonlinear problems, and we use this insight to develop new variants of the SPS method that are better suited to nonlinear models. Our new variants are based on introducing a slack variable into the interpolation equations. This single slack variable tracks the loss function across iterations and is used to set a stable step size. We provide a convergence theory and extensive numerical results supporting our new methods.
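
To make the Passive-Aggressive connection concrete (a sketch of the standard derivation, not taken verbatim from the paper): linearizing the sampled loss \(f_i\) around the current iterate \(w_t\) and projecting \(w_t\) onto the set where the linearization vanishes,
\[
w_{t+1} = \arg\min_{w} \|w - w_t\|^2 \quad \text{s.t.} \quad f_i(w_t) + \langle \nabla f_i(w_t),\, w - w_t \rangle = 0,
\]
has the closed-form solution
\[
w_{t+1} = w_t - \frac{f_i(w_t)}{\|\nabla f_i(w_t)\|^2}\, \nabla f_i(w_t),
\]
i.e. SGD with the classical Polyak stepsize applied to the sampled loss.

Below is a minimal Python sketch of the vanilla SPS_max update (Loizou et al., 2021) that the paper's slack variants build on. The function name sps_sgd and the hyperparameters c and gamma_max are illustrative choices, not the paper's notation, and the slack mechanism itself is deliberately omitted here since its exact update rules are given in the paper.

```python
import numpy as np

def sps_sgd(X, y, steps=1000, c=0.5, gamma_max=1.0, seed=0):
    """SGD with the stochastic Polyak stepsize (SPS_max) on a least-squares loss.

    Per-sample loss: f_i(w) = 0.5 * (x_i @ w - y_i)**2, whose infimum is 0
    under interpolation, so the SPS_max rule reduces to
        gamma_t = min( f_i(w) / (c * ||grad f_i(w)||^2), gamma_max ).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(n)                  # sample one data point
        residual = X[i] @ w - y[i]
        loss = 0.5 * residual ** 2
        grad = residual * X[i]
        grad_sq = grad @ grad + 1e-12        # guard against a zero gradient
        gamma = min(loss / (c * grad_sq), gamma_max)
        w -= gamma * grad
    return w

# Toy interpolation problem: noiseless linear data, so min_w f_i(w) = 0 for all i.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
w_star = rng.standard_normal(10)
y = X @ w_star
w_hat = sps_sgd(X, y)
print("distance to w*:", np.linalg.norm(w_hat - w_star))
```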

Authors (4)
  1. Robert M. Gower (41 papers)
  2. Mathieu Blondel (43 papers)
  3. Nidham Gazagnadou (8 papers)
  4. Fabian Pedregosa (48 papers)
Citations (20)
