
Catastrophic Fisher Explosion: Early Phase Fisher Matrix Impacts Generalization (2012.14193v3)

Published 28 Dec 2020 in cs.LG and stat.ML

Abstract: The early phase of training a deep neural network has a dramatic effect on the local curvature of the loss function. For instance, using a small learning rate does not guarantee stable optimization because the optimization trajectory has a tendency to steer towards regions of the loss surface with increasing local curvature. We ask whether this tendency is connected to the widely observed phenomenon that the choice of the learning rate strongly influences generalization. We first show that stochastic gradient descent (SGD) implicitly penalizes the trace of the Fisher Information Matrix (FIM), a measure of the local curvature, from the start of training. We argue it is an implicit regularizer in SGD by showing that explicitly penalizing the trace of the FIM can significantly improve generalization. We highlight that poor final generalization coincides with the trace of the FIM attaining a large value early in training, which we refer to as catastrophic Fisher explosion. Finally, to gain insight into the regularization effect of penalizing the trace of the FIM, we show that it limits memorization by reducing the learning speed of examples with noisy labels more than that of examples with clean labels.
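
For a classification model with predictive distribution p_theta(y|x), the trace of the FIM equals the expected squared norm of the gradient of log p_theta(y|x), with y drawn from the model's own predictions rather than the ground-truth labels. The sketch below shows one way to add an explicit penalty on a mini-batch estimate of this quantity to an ordinary PyTorch training step. It is only an illustration of the idea described in the abstract: the particular estimator, the `penalty_coef` hyperparameter, and the `training_step` wrapper are assumptions for this sketch, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def fisher_trace_penalty(model, inputs):
    # Mini-batch estimate of Tr(FIM): squared gradient norm of the
    # negative log-likelihood, with labels sampled from the model's own
    # predictive distribution instead of the ground-truth labels.
    logits = model(inputs)
    with torch.no_grad():
        sampled_labels = torch.distributions.Categorical(logits=logits).sample()
    nll = F.cross_entropy(logits, sampled_labels)
    params = [p for p in model.parameters() if p.requires_grad]
    # create_graph=True keeps the graph so the penalty itself can be backpropagated.
    grads = torch.autograd.grad(nll, params, create_graph=True)
    return sum((g ** 2).sum() for g in grads)

def training_step(model, optimizer, inputs, targets, penalty_coef=0.1):
    # Standard cross-entropy loss plus an explicit penalty on the estimated
    # FIM trace; penalty_coef is a hypothetical hyperparameter for this sketch.
    optimizer.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss = loss + penalty_coef * fisher_trace_penalty(model, inputs)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the penalty depends on gradients of the model, each update requires double backpropagation (hence `create_graph=True`), roughly doubling the cost of a training step.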

Authors (9)
  1. Devansh Arpit (31 papers)
  2. Oliver Astrand (2 papers)
  3. Giancarlo Kerg (7 papers)
  4. Huan Wang (211 papers)
  5. Caiming Xiong (337 papers)
  6. Richard Socher (115 papers)
  7. Kyunghyun Cho (292 papers)
  8. Krzysztof Geras (4 papers)
  9. Stanislaw Jastrzebski (7 papers)
Citations (60)
