Training cylindrical approximation on nonsmooth or divergent functions

Develop a training methodology for the cylindrical approximation (basis expansion of input functions) that enables physics-informed neural networks to reliably handle nonsmooth, highly oscillatory, or divergent input functions. Accurately representing such functions requires very large expansion degrees, at which the numerical computation of the expansion coefficients currently becomes unstable.

Background

The cylindrical approximation represents input functions via orthonormal basis expansions truncated at degree m. For smooth inputs, moderate m suffices, but nonsmooth, highly oscillatory, or divergent functions demand very large degrees, where numerical integration for expansion coefficients becomes unstable.
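The effect described above can be sketched numerically. The snippet below is an illustrative example (not the paper's implementation): it expands a function on [-1, 1] in the orthonormal Legendre basis, with coefficients computed by Gauss-Legendre quadrature, and compares the truncation error for a smooth input against a nonsmooth one (a kink at the origin). The function names and the choice of basis are assumptions made for the illustration.

```python
import numpy as np
from numpy.polynomial import legendre

def expansion_coeffs(f, m, n_quad=200):
    """Coefficients of f in the orthonormal Legendre basis, degrees 0..m,
    computed by Gauss-Legendre quadrature (illustrative sketch)."""
    x, w = legendre.leggauss(n_quad)  # quadrature nodes and weights
    coeffs = []
    for k in range(m + 1):
        # Orthonormalized Legendre polynomial: P_k * sqrt((2k+1)/2)
        phi_k = legendre.Legendre.basis(k)(x) * np.sqrt((2 * k + 1) / 2)
        coeffs.append(np.sum(w * f(x) * phi_k))
    return np.array(coeffs)

def reconstruct(coeffs, x):
    """Evaluate the truncated expansion sum_k a_k * phi_k(x)."""
    out = np.zeros_like(x, dtype=float)
    for k, a_k in enumerate(coeffs):
        out += a_k * legendre.Legendre.basis(k)(x) * np.sqrt((2 * k + 1) / 2)
    return out

smooth = lambda x: np.exp(x)   # smooth input: coefficients decay rapidly
kink = lambda x: np.abs(x)     # nonsmooth input: only algebraic decay

xs = np.linspace(-1.0, 1.0, 1001)
for name, f in [("smooth", smooth), ("nonsmooth", kink)]:
    for m in (5, 20):
        err = np.max(np.abs(reconstruct(expansion_coeffs(f, m), xs) - f(xs)))
        print(f"{name:9s} m={m:2d}  max error = {err:.2e}")
```

For the smooth input, a moderate degree already drives the error to near machine precision, while the kinked input improves only slowly with m, which is why such functions push the expansion toward very large degrees where coefficient computation becomes ill-conditioned.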

The authors note this practical limitation and explicitly call for a robust training approach that overcomes the instability on such challenging function classes, which would broaden the applicability of the cylindrical approximation to solving functional differential equations.

References

Training on such functions with the cylindrical approximation is an open problem.

Physics-informed Neural Networks for Functional Differential Equations: Cylindrical Approximation and Its Convergence Guarantees (2410.18153 - Miyagawa et al., 23 Oct 2024) in Section 6 (Conclusion and Limitations), Challenges toward even higher degrees