Extend simultaneous approximation lower bounds to Lp Sobolev spaces on manifolds

Extend the lower-bound analysis for simultaneous approximation by constant-depth ReLU^{k−1} neural networks from the L∞ Sobolev setting to the Lp Sobolev spaces W_p^k(𝓜^d), 1 ≤ p < ∞, on compact, connected, complete Riemannian manifolds with bounded geometry. Specifically, establish rigorous lower bounds on the number of nonzero parameters required to approximate an arbitrary function in the unit ball of W_p^k(𝓜^d) to accuracy ε in the W_p^s(𝓜^d) norm (for integers s < k), analogous to the L∞ case proven in the paper, and make the dependence on the intrinsic dimension d explicit.
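
A plausible formal target, stated here only by analogy with the paper's L∞ result and with known Euclidean Lp lower bounds (the exact exponent and any logarithmic corrections would need proof): writing 𝓕_N for the class of constant-depth ReLU^{k−1} networks with at most N nonzero parameters, one would aim to show

\[
\sup_{\|f\|_{W_p^k(\mathcal{M}^d)} \le 1} \; \inf_{\hat f \in \mathcal{F}_N} \; \|f - \hat f\|_{W_p^s(\mathcal{M}^d)} \;\gtrsim\; N^{-(k-s)/d},
\]

equivalently, that accuracy ε requires N ≳ ε^{−d/(k−s)} nonzero parameters, possibly up to logarithmic factors, with constants depending only on the intrinsic dimension d and not on any ambient embedding dimension.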

Background

The paper proves nearly optimal simultaneous approximation rates, with matching lower bounds, for constant-depth ReLU^{k−1} networks in L∞-based Sobolev norms on compact manifolds with bounded geometry. The lower bound leverages new VC-dimension and pseudo-dimension estimates for classes of high-order derivatives of such networks and depends only on the manifold’s intrinsic dimension d.
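
To indicate the mechanism (a schematic sketch only; the exponents, logarithmic factors, and depth dependence in the paper's actual argument may differ): if the class 𝓕_N of constant-depth ReLU^{k−1} networks with at most N nonzero parameters achieves error ε in W_p^s simultaneously over the unit ball of W_p^k, a packing/shattering argument should force

\[
\operatorname{Pdim}\!\left(\left\{ D^{\alpha} f : f \in \mathcal{F}_N,\ |\alpha| \le s \right\}\right) \;\gtrsim\; \varepsilon^{-d/(k-s)},
\]

while pseudo-dimension bounds for networks with piecewise-polynomial activations give an upper bound on the order of N log N (up to depth-dependent constants), which combines to yield N ≳ ε^{−d/(k−s)} up to logarithmic factors.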

The authors note that while Lp (1 ≤ p < ∞) lower bounds exist in Euclidean settings, the known proofs rely on geometric constructions (e.g., cube decompositions) that do not directly generalize to manifolds. As a result, transferring their manifold-based lower-bound framework from L∞ to Lp norms remains unresolved and is highlighted as an open problem; a schematic of the Euclidean construction is sketched below.
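
For orientation, here is a schematic of the Euclidean cube-based construction alluded to above (an illustrative sketch under standard assumptions, not the argument of any specific reference): partition [0,1]^d into m^d congruent subcubes, let φ_i be a smooth bump supported in the i-th subcube, and form the sign combinations

\[
f_\sigma \;=\; \sum_{i=1}^{m^d} \sigma_i \, \varphi_i, \qquad \sigma \in \{\pm 1\}^{m^d},
\]

normalized so that \|f_\sigma\|_{W_p^k([0,1]^d)} \le 1 for every sign pattern. The pairwise W_p^s-separation of the f_σ is then of order m^{-(k-s)}, so any class approximating to accuracy ε ≍ m^{-(k-s)} must distinguish the 2^{m^d} sign patterns, which feeds the shattering bound. On a manifold there is no global cube decomposition, and reproducing this packing through overlapping charts while retaining uniform two-sided control of the Sobolev norms is precisely the obstruction.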

References

“Consequently, extending our analysis to L_p Sobolev spaces on manifolds remains an important open problem and a promising direction for future research.”

Expressive Power of Deep Networks on Manifolds: Simultaneous Approximation (arXiv:2509.09362, Zhou et al., 11 Sep 2025), remark following the lower-bound theorem in Section 3 (Main Results)