Generalization Performance of Empirical Risk Minimization on Over-parameterized Deep ReLU Nets (2111.14039v3)

Published 28 Nov 2021 in cs.LG

Abstract: In this paper, we study the generalization performance of the global minima of empirical risk minimization (ERM) on over-parameterized deep ReLU nets. Using a novel deepening scheme for deep ReLU nets, we rigorously prove that there exist perfect global minima achieving almost optimal generalization error bounds for numerous types of data under mild conditions. Since over-parameterization is crucial to guarantee that the global minima of ERM on deep ReLU nets can be realized by the widely used stochastic gradient descent (SGD) algorithm, our results fill a gap between optimization and generalization.
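The setting described in the abstract can be illustrated with a toy sketch: a one-hidden-layer ReLU net whose width far exceeds the sample count (over-parameterization) is trained to drive the empirical risk toward a global minimum, which in this regime typically means interpolating the training data. All dimensions, the learning rate, the synthetic data, and the use of full-batch gradient descent as a stand-in for SGD are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: n samples, d features (illustrative values).
n, d, width = 20, 3, 256          # width >> n: the net is over-parameterized
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# One-hidden-layer ReLU net: f(x) = a^T relu(W x)
W = rng.normal(size=(width, d)) / np.sqrt(d)
a = rng.normal(size=width) / np.sqrt(width)

def forward(X):
    H = np.maximum(X @ W.T, 0.0)  # ReLU features, shape (n, width)
    return H @ a, H

def erm_loss(pred):
    return 0.5 * np.mean((pred - y) ** 2)  # empirical risk (mean squared error)

init_loss = erm_loss(forward(X)[0])

# Full-batch gradient descent on the empirical risk (SGD stand-in).
lr = 0.01
for step in range(3000):
    pred, H = forward(X)
    grad_out = (pred - y) / n                          # d(loss)/d(pred)
    grad_a = H.T @ grad_out                            # gradient w.r.t. output weights
    grad_W = ((grad_out[:, None] * a) * (H > 0)).T @ X # gradient w.r.t. hidden weights
    a -= lr * grad_a
    W -= lr * grad_W

final_loss = erm_loss(forward(X)[0])
# In the over-parameterized regime the empirical risk typically drops
# far below its initial value as the net approaches interpolation.
print(init_loss, final_loss)
```

This sketch only demonstrates the optimization side (empirical risk going to a near-global minimum); the paper's contribution is the generalization side, i.e. proving that among such global minima there exist ones with almost optimal generalization error.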

Authors (3)
  1. Shao-Bo Lin
  2. Yao Wang
  3. Ding-Xuan Zhou
Citations (5)
