A geometric interpretation of stochastic gradient descent using diffusion metrics (1910.12194v1)
Published 27 Oct 2019 in cs.LG, gr-qc, math.DG, and stat.ML
Abstract: Stochastic gradient descent (SGD) is a key ingredient in the training of deep neural networks, and yet its geometrical significance appears elusive. We study a deterministic model in which the trajectories of the dynamical system are described by geodesics of a family of metrics arising from the diffusion matrix. These metrics encode information about the highly non-isotropic gradient noise in SGD. We establish a parallel with General Relativity models, where the role of the electromagnetic field is played by the gradient of the loss function. We compute an example of a two-layer network.
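To make the "diffusion matrix" concrete, one standard continuous-time formalization (a sketch of the usual SDE approximation of SGD, not necessarily the exact construction in the paper) models SGD with learning rate \(\eta\) as a diffusion process whose noise covariance \(\Sigma\) is the gradient-noise covariance; a Riemannian metric can then be associated to \(\Sigma\), so that its geodesics reflect the anisotropy of the noise:

```latex
% Continuous-time SGD approximation (standard SDE model; the metric
% construction shown here is an illustrative assumption):
\[
  d\theta_t \;=\; -\nabla L(\theta_t)\,dt \;+\; \sqrt{\eta}\,\Sigma(\theta_t)^{1/2}\,dW_t,
  \qquad
  g_{ij}(\theta) \;\propto\; \big(\Sigma(\theta)^{-1}\big)_{ij},
\]
% where L is the loss, \Sigma is the (generally non-isotropic) covariance of
% the minibatch gradient noise, i.e. the diffusion matrix, and g is a metric
% built from it; geodesics of g then encode the preferred directions of the
% SGD dynamics.
```

Under this reading, the parallel with General Relativity is that the deterministic drift term \(-\nabla L\) plays a role analogous to an external (electromagnetic) field acting on geodesic motion in the metric \(g\).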