Asymptotic theory for SKINNs with growing network architectures

Establish consistency and asymptotic normality for the Structured-Knowledge-Informed Neural Networks (SKINNs) M-estimator in the regime where the neural network architecture (width or depth) grows with the sample size N, deriving explicit regularity conditions on the rate of architectural growth under which asymptotic results analogous to the fixed-architecture case continue to hold.
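
In schematic M-estimation notation (assumed here for concreteness; the paper's own notation may differ), write the joint parameter as theta = (beta, gamma_N), with beta the structured-knowledge parameters and gamma_N the network weights, whose dimension d_N now grows with N. The target result is, roughly,

\[
\hat{\theta}_N \;=\; \arg\min_{\theta \in \Theta_N} \frac{1}{N} \sum_{i=1}^{N} \ell(\theta; Z_i),
\qquad
\sqrt{N}\,\big(\hat{\beta}_N - \beta_0\big) \;\xrightarrow{d}\; \mathcal{N}(0,\, V_\beta),
\]

where \(\Theta_N\) is a sieve of parameter spaces indexed by the growing architecture and \(V_\beta\) is the asymptotic variance of the structured component.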

Background

The paper proves that SKINNs—joint estimators of neural network parameters and structured-knowledge parameters—are consistent and asymptotically normal at the parametric rate under standard regularity conditions when the neural network architecture is held fixed as the sample size increases.
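
For orientation, the fixed-architecture conclusion has the familiar sandwich form from M-estimation theory (generic notation, not quoted from the paper): for loss \(\ell\) and data \(Z_i\),

\[
\sqrt{N}\,\big(\hat{\theta}_N - \theta_0\big) \;\xrightarrow{d}\; \mathcal{N}\!\big(0,\; A_0^{-1} B_0\, A_0^{-1}\big),
\qquad
A_0 = \mathbb{E}\big[\nabla_\theta^2 \ell(\theta_0; Z)\big],
\quad
B_0 = \mathbb{E}\big[\nabla_\theta \ell(\theta_0; Z)\, \nabla_\theta \ell(\theta_0; Z)^{\top}\big].
\]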

The authors note that these results align with standard M-estimation theory and with the common empirical practice of fixing the architecture during training. However, many modern theoretical and applied settings consider architectures that scale with the sample size (growing width or depth, akin to sieve estimation). Extending SKINNs’ asymptotic guarantees to such growing-architecture regimes would broaden the theoretical foundation and align with contemporary practice, but it requires additional conditions on growth rates, which the authors explicitly leave for future work; a sketch of the standard ingredients follows.
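
As an illustration of the kind of growth-rate conditions such an extension would likely impose (standard sieve-estimation ingredients, not results from the paper), consistency typically requires balancing estimation error against approximation error, e.g.

\[
\frac{d_N \log d_N}{N} \;\longrightarrow\; 0
\qquad \text{and} \qquad
\inf_{\theta \in \Theta_N} \big\lVert m_\theta - m_0 \big\rVert \;\longrightarrow\; 0,
\]

with \(d_N\) the number of network parameters and \(m_\theta\) the fitted function. Root-N normality of the structured component would additionally require the network (nuisance) part to converge faster than \(N^{-1/4}\), as in standard semiparametric theory.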

References

"Extending these results to the growing-width or growing-depth regime, in which the network architecture scales with N as in sieve estimation theory, would require additional regularity conditions on the rate of growth and is left for future work."

Bridging Structured Knowledge and Data: A Unified Framework with Finance Applications (arXiv:2604.00987, Cao et al., 1 Apr 2026), in Theoretical Foundations of SKINNs, Section 2.3.2 (Asymptotic Normality and Convergence Rates).