Neuroevolution of Physics-Informed Neural Nets: Benchmark Problems and Comparative Results (2212.07624v3)
Abstract: The potential of learned models for fundamental scientific research and discovery is drawing increasing attention worldwide. Physics-informed neural networks (PINNs), whose loss functions directly embed the governing equations of scientific phenomena, are one of the key techniques at the forefront of recent advances. PINNs are typically trained using stochastic gradient descent methods, akin to their deep learning counterparts. However, analysis in this paper shows that PINNs' unique loss formulations lead to a high degree of complexity and ruggedness that may not be conducive to gradient descent. Unlike in standard deep learning, PINN training requires globally optimal parameter values that satisfy physical laws as closely as possible. Spurious local optima, indicative of erroneous physics, must be avoided. Hence, neuroevolution algorithms, with their superior global search capacity, may be a better choice for PINNs relative to gradient descent methods. Here, we propose a set of five benchmark problems, with open-source codes, spanning diverse physical phenomena for novel neuroevolution algorithm development. Using this, we compare two neuroevolution algorithms against the commonly used stochastic gradient descent, and our baseline results support the claim that neuroevolution can surpass gradient descent, ensuring better physics compliance in the predicted outputs.
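To make the two ideas in the abstract concrete, the following is a minimal, self-contained sketch (not the paper's implementation) of a physics-informed loss and of neuroevolution training. It fits a tiny one-hidden-layer tanh network to the ODE u'(x) = -u(x) with u(0) = 1 by minimizing the physics residual with a simple elitist (1+λ) evolution strategy in place of gradient descent. The network size, collocation grid, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: a tiny PINN for u'(x) = -u(x), u(0) = 1,
# trained by a simple (1+lambda) evolution strategy rather than SGD.
# All architecture and hyperparameter choices are assumptions for demonstration.

rng = np.random.default_rng(0)
H = 8                              # hidden units (assumed)
X = np.linspace(0.0, 1.0, 32)      # collocation points (assumed)

def unpack(theta):
    """Split a flat parameter vector into one-hidden-layer network weights."""
    w1, b1, w2 = theta[:H], theta[H:2 * H], theta[2 * H:3 * H]
    b2 = theta[3 * H]
    return w1, b1, w2, b2

def forward(theta, x):
    """Return u(x) and its analytic derivative du/dx for the tanh network."""
    w1, b1, w2, b2 = unpack(theta)
    z = np.tanh(np.outer(x, w1) + b1)      # (n, H) hidden activations
    u = z @ w2 + b2
    du = (1.0 - z ** 2) @ (w1 * w2)        # chain rule; no autodiff needed here
    return u, du

def pinn_loss(theta):
    """Physics residual of u' + u = 0 plus the initial-condition penalty."""
    u, du = forward(theta, X)
    residual = np.mean((du + u) ** 2)
    u0, _ = forward(theta, np.array([0.0]))
    return residual + (u0[0] - 1.0) ** 2

# Elitist (1+lambda) evolution strategy: keep the parent in the candidate
# pool, so the loss is non-increasing across generations.
theta = 0.5 * rng.standard_normal(3 * H + 1)
sigma, lam = 0.1, 32
initial = pinn_loss(theta)
for _ in range(300):
    offspring = theta + sigma * rng.standard_normal((lam, theta.size))
    cand = np.vstack([theta, offspring])
    losses = np.array([pinn_loss(p) for p in cand])
    theta = cand[np.argmin(losses)]

final = pinn_loss(theta)
print(f"physics loss: {initial:.4f} -> {final:.4f}")
```

Because the parent survives each generation, the physics loss decreases monotonically; a population-based search of this kind can escape the kind of spurious local optima in the rugged PINN loss landscape that the paper argues can trap gradient descent.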
Authors:
- Nicholas Sung Wei Yong
- Jian Cheng Wong
- Pao-Hsiung Chiu
- Abhishek Gupta
- Chinchun Ooi
- Yew-Soon Ong