
Analysis of the Gradient Descent Algorithm for a Deep Neural Network Model with Skip-connections (1904.05263v3)

Published 10 Apr 2019 in cs.LG, math.OC, and stat.ML

Abstract: The behavior of the gradient descent (GD) algorithm is analyzed for a deep neural network model with skip-connections. It is proved that in the over-parametrized regime, for a suitable initialization, GD finds a global minimum exponentially fast with high probability. Generalization error estimates along the GD path are also established. As a consequence, it is shown that when the target function lies in the reproducing kernel Hilbert space (RKHS) with a kernel defined by the initialization, there exist generalizable early-stopping solutions along the GD path. In addition, it is shown that the GD path is uniformly close to the functions given by the related random feature model. Consequently, in this "implicit regularization" setting, the deep neural network model deteriorates to a random feature model. Our results hold for neural networks of any width larger than the input dimension.
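
The setting of the abstract, full-batch gradient descent on a residual (skip-connection) network for regression, can be sketched concretely. The snippet below is a minimal illustration, not the paper's exact model: the architecture, initialization, loss, and hyperparameters (width, depth, learning rate) are assumptions chosen for readability.

```python
import torch

torch.manual_seed(0)
n, d, width, depth, lr, steps = 64, 8, 32, 4, 0.1, 500

# Synthetic regression data standing in for samples of a target function.
X = torch.randn(n, d)
y = torch.sin(X.sum(dim=1, keepdim=True))

class SkipNet(torch.nn.Module):
    """ResNet-style network: h_{l+1} = h_l + relu(W_l h_l + b_l)."""
    def __init__(self):
        super().__init__()
        self.inp = torch.nn.Linear(d, width)
        self.blocks = torch.nn.ModuleList(
            [torch.nn.Linear(width, width) for _ in range(depth)])
        self.out = torch.nn.Linear(width, 1)

    def forward(self, x):
        h = self.inp(x)
        for block in self.blocks:
            h = h + torch.relu(block(h))  # the skip-connection
        return self.out(h)

model = SkipNet()
opt = torch.optim.SGD(model.parameters(), lr=lr)  # full-batch GD, no momentum

for step in range(steps):
    opt.zero_grad()
    loss = ((model(X) - y) ** 2).mean()
    loss.backward()
    opt.step()
    # The paper's generalization result concerns early stopping along this
    # GD path; a held-out set would be monitored here to pick the stop time.

print(f"final training loss: {loss.item():.4f}")
```

In the paper's regime (width larger than the input dimension, suitable initialization), the abstract states that such a GD trajectory stays uniformly close to the corresponding random feature model, i.e., a linear model on the features produced at initialization.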

Authors (4)
  1. Weinan E (127 papers)
  2. Chao Ma (187 papers)
  3. Qingcan Wang (6 papers)
  4. Lei Wu (319 papers)
Citations (22)
