Asymptotic theory for SKINNs with growing network architectures
Establish consistency and asymptotic normality for the Structured-Knowledge-Informed Neural Network (SKINN) M-estimator in the regime where the network architecture (width or depth) grows with the sample size N. The task is to derive explicit regularity conditions on the rate of architectural growth under which asymptotic results analogous to the fixed-architecture case continue to hold.
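A minimal sketch of the kind of conditions such a result would likely involve, following standard sieve M-estimation theory (the notation below is illustrative and not taken from the source paper: Θ_N denotes the network class at sample size N, p_N its parameter count, θ_0 the target):

```latex
% Sieve M-estimator over growing network classes \Theta_N
% (illustrative notation; not from the source paper)
\hat{\theta}_N = \arg\max_{\theta \in \Theta_N} \frac{1}{N} \sum_{i=1}^{N} m(Z_i, \theta),
\qquad \Theta_1 \subseteq \Theta_2 \subseteq \cdots \subseteq \Theta .

% Typical regularity conditions for consistency as the architecture grows:
% (i) vanishing sieve approximation error,
\inf_{\theta \in \Theta_N} d(\theta, \theta_0) \;\longrightarrow\; 0,
% (ii) complexity (e.g., the width/depth-dependent parameter count p_N)
%      growing slowly relative to the sample size,
\frac{p_N \log N}{N} \;\longrightarrow\; 0 .
```

In this style of argument, condition (i) controls the bias from restricting estimation to Θ_N, while condition (ii) bounds the stochastic error via an entropy-type bound on the growing class; asymptotic normality of smooth functionals would additionally require a rate condition balancing the two terms.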
References
Extending these results to the growing-width or growing-depth regime, in which the network architecture scales with N as in sieve estimation theory, would require additional regularity conditions on the rate of growth and is left for future work.
— Bridging Structured Knowledge and Data: A Unified Framework with Finance Applications
(2604.00987 - Cao et al., 1 Apr 2026) in Theoretical Foundations of SKINNs, Section 2.3.2 (Asymptotic Normality and Convergence Rates)