
Clarify width growth requirements in neural tangent kernel analyses

Determine whether the number of neurons required to satisfy the "sufficiently large" condition in neural tangent kernel (NTK) analyses of over-parameterized neural networks must grow exponentially with the sample size or at a slower rate, and specify the minimal width-growth regime under which NTK-derived guarantees hold.
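
One way to make the question concrete (the notation below is ours, not the paper's): write n for the sample size, m for the network width, and m_0(n) for the smallest width at which a given NTK guarantee holds. The open problem is then to decide between regimes such as

\[
  m_0(n) = e^{\Omega(n)} \quad (\text{exponential width necessary})
  \qquad \text{versus} \qquad
  m_0(n) = \mathrm{poly}(n) \quad (\text{polynomial width sufficient}),
\]

and, in the latter case, to pin down the exponent.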


Background

In surveying NTK-based results, the paper points out that the equivalence between trained over-parameterized networks and their NTK limits is often established only pointwise, not in global L2 error. Moreover, these analyses typically assume that the number of neurons is "sufficiently large" without giving a precise quantitative requirement.

The authors explicitly note the lack of clarity on how the required width should scale with sample size, highlighting the need for a definitive characterization of the minimal growth rate necessary for NTK conclusions to transfer to practical deep networks.
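
As a sketch of the distinction the background draws (again in our own notation, assuming f_m denotes the trained width-m network and f_NTK the corresponding kernel predictor), a pointwise guarantee bounds the gap at each fixed input, whereas a global L2 guarantee bounds the integrated error with respect to the input distribution P_X:

\[
  \bigl| f_m(x) - f_{\mathrm{NTK}}(x) \bigr| \le \varepsilon(m)
  \quad \text{for each fixed } x
  \qquad \text{versus} \qquad
  \int \bigl| f_m(x) - f_{\mathrm{NTK}}(x) \bigr|^2 \, \mathbf{P}_X(dx) \le \varepsilon(m)^2 .
\]

Only the second form is the quantity controlled in L2 risk bounds, which is why pointwise equivalence alone need not transfer NTK conclusions into global L2 estimation guarantees.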

References

"In addition, it is required that the number of neurons be sufficiently large, but it was not specified what this exactly means, i.e., it is not clear whether the number of neurons must grow e.g. exponentially in the sample size or not."

Statistically guided deep learning (arXiv:2504.08489, Kohler et al., 11 Apr 2025), Section 1.9 (Discussion of related results), Neural tangent kernel setting.