
Gradient-only line searches: An Alternative to Probabilistic Line Searches (1903.09383v2)

Published 22 Mar 2019 in stat.ML and cs.LG

Abstract: Step sizes in neural network training are largely determined using predetermined rules such as fixed learning rates and learning rate schedules. These require user input or expensive global optimization strategies to determine their functional form and associated hyperparameters. Line searches are capable of adaptively resolving learning rate schedules. However, due to discontinuities induced by mini-batch sub-sampling, they have largely fallen out of favour. Notwithstanding, probabilistic line searches, which use statistical surrogates over a limited spatial domain, have recently demonstrated viability in resolving learning rates for stochastic loss functions. This paper introduces an alternative paradigm, Gradient-Only Line Searches that are Inexact (GOLS-I), a strategy that automatically determines learning rates for stochastic loss functions over a range of 15 orders of magnitude without the use of surrogates. We show that GOLS-I reliably determines step sizes, is competitive in performance, and is easy to implement.
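The core idea behind a gradient-only line search is to ignore (possibly discontinuous) function values and instead adjust the step size until the directional derivative along the descent direction changes sign from negative to positive, which brackets a stochastic non-negative gradient projection point. The sketch below is a hypothetical toy illustration of this sign-change search, not the authors' implementation; the function name `gols_i`, the growth factor `eta`, and the step bounds are all illustrative assumptions.

```python
def gols_i(dirderiv, a0=1e-8, eta=2.0, a_min=1e-8, a_max=1e7, max_iter=60):
    """Toy inexact gradient-only line search (illustrative sketch only).

    dirderiv(a): directional derivative of the loss along the search
    direction, evaluated at step size a (may be a stochastic estimate).
    The step size is grown or shrunk by the factor eta until the
    directional derivative changes sign from negative to positive.
    """
    a = a0
    if dirderiv(a) < 0:
        # Still descending along the direction: grow the step.
        for _ in range(max_iter):
            a_next = min(a * eta, a_max)
            if dirderiv(a_next) >= 0 or a_next == a_max:
                return a_next
            a = a_next
    else:
        # Overshot the sign change: shrink the step.
        for _ in range(max_iter):
            a_next = max(a / eta, a_min)
            if dirderiv(a_next) < 0 or a_next == a_min:
                return a  # smallest tested step with non-negative derivative
            a = a_next
    return a
```

For example, minimizing f(w) = 0.5 w² from w = 5 along d = -5 gives the directional derivative -25 + 25a, which changes sign at a = 1; the search above doubles (or halves) the step until it lands just past that sign change.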

Authors (2)
  1. Dominic Kafka (5 papers)
  2. Daniel Wilke (2 papers)
Citations (14)
