Design improved activation functions and neural network frameworks to enhance deep neural network capacity

Develop activation functions and network architectures that mitigate spectral bias and enhance the representational and optimization capacity of deep neural networks, with particular emphasis on capturing the high-frequency and high-order behaviors encountered when deep learning is used to solve partial differential equations such as the biharmonic equation.

Background

The paper analyzes how activation function design critically affects deep neural network performance, especially in physics-informed settings where higher-order derivatives are required. It discusses the boundedness and regularity of common activation functions (Gaussian, tanh) and their derivatives, and the limitations these properties impose on representing information outside narrow input ranges.
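
As an illustration (a minimal sketch, not code from the paper), the limitation can be probed directly with automatic differentiation: the fourth derivative that enters a biharmonic PINN residual nearly vanishes for both tanh and a Gaussian activation outside a narrow band around the origin. The helper function and sample points below are illustrative choices.

```python
# Minimal sketch (illustrative, not from the paper): probe the fourth
# derivative of tanh and a Gaussian activation, the order appearing in a
# biharmonic residual, at increasing distance from the origin.
import torch

def nth_derivative(f, x, n):
    """Differentiate f at the scalar tensor x a total of n times via autograd."""
    y = f(x)
    for _ in range(n):
        (y,) = torch.autograd.grad(y, x, create_graph=True)
    return y

gaussian = lambda t: torch.exp(-t**2)  # common Gaussian activation

for x0 in (0.5, 2.0, 5.0):
    x = torch.tensor(x0, requires_grad=True)
    d4_tanh = nth_derivative(torch.tanh, x, 4)
    d4_gauss = nth_derivative(gaussian, x, 4)
    print(f"x={x0:4.1f}  d4 tanh={d4_tanh.item():+.2e}  d4 gauss={d4_gauss.item():+.2e}")
# Both fourth derivatives decay rapidly with |x|, so a fourth-order PDE
# residual receives almost no signal for inputs of even moderate magnitude.
```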

The authors highlight spectral bias, the tendency of DNNs to fit low-frequency components first, as a key obstacle to accurately learning the high-frequency components and multi-scale behaviors essential for solving high-order PDEs such as the biharmonic equation. Fourier feature mapping is proposed as a partial remedy, but the broader problem of designing better activation functions and network frameworks remains unresolved.
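
For reference, the standard random Fourier feature construction maps inputs through fixed random sinusoids before the network trunk; the paper's exact variant may differ. In the sketch below, the module name, the scale `sigma`, and the layer sizes are hypothetical choices rather than the paper's configuration.

```python
# Minimal sketch of a standard Fourier feature mapping
# gamma(x) = [cos(2*pi*B x), sin(2*pi*B x)], with B fixed at initialization.
import torch
import torch.nn as nn

class FourierFeatures(nn.Module):
    def __init__(self, in_dim, num_features, sigma=1.0):
        super().__init__()
        # Fixed (non-trainable) random frequency matrix; sigma controls
        # which frequency band the downstream network sees (hypothetical knob).
        self.register_buffer("B", torch.randn(in_dim, num_features) * sigma)

    def forward(self, x):
        proj = 2.0 * torch.pi * x @ self.B
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1)

# Prepend the mapping to an otherwise ordinary PINN trunk; the doubled
# feature width (cos + sin) fixes the first linear layer's input size.
mapping = FourierFeatures(in_dim=2, num_features=64, sigma=5.0)
trunk = nn.Sequential(mapping, nn.Linear(128, 64), nn.Tanh(), nn.Linear(64, 1))
u = trunk(torch.rand(16, 2))  # 16 sample points in 2D -> scalar field values
```

Raising `sigma` biases the network toward higher frequencies, which partially counteracts spectral bias but does not by itself resolve the open design question above.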

References

"Therefore, designing better activation functions and neural network frameworks remains an open question in enhancing DNN capacity."

Fourier heuristic PINNs to solve the biharmonic equations based on its coupled scheme (Huang et al., 18 Sep 2025, arXiv:2509.15004), Section 3.2, "Choice of Activation Function for PINN".