PINNACLE: PINN Adaptive ColLocation and Experimental points selection (2404.07662v1)
Abstract: Physics-Informed Neural Networks (PINNs), which incorporate PDEs as soft constraints, train with a composite loss function that contains multiple training point types: various types of collocation points, chosen during training to enforce each PDE and the initial/boundary conditions, and experimental points, which are usually costly to obtain via experiments or simulations. Training PINNs with this loss function is challenging, as it typically requires selecting large numbers of points of each type, each with different training dynamics. Unlike past works that focused on selecting either collocation or experimental points, this work introduces PINN Adaptive ColLocation and Experimental points selection (PINNACLE), the first algorithm that jointly optimizes the selection of all training point types while automatically adjusting the proportion of collocation point types as training progresses. PINNACLE exploits information on the interaction among training point types, which had not been considered before, based on an analysis of PINN training dynamics via the Neural Tangent Kernel (NTK). We theoretically show that the criterion used by PINNACLE is related to the PINN generalization error, and empirically demonstrate that PINNACLE outperforms existing point selection methods on forward, inverse, and transfer learning problems.
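To make the composite loss concrete, the following is a minimal, dependency-light sketch (not the paper's implementation) for the 1D Poisson problem u''(x) = -2 with u(0) = u(1) = 0. A single-parameter trial solution u_theta(x) = theta * x * (1 - x) stands in for the neural network, and its second derivative is written analytically; a real PINN would use automatic differentiation. The three loss terms correspond to the three training point types the abstract describes: PDE collocation points, boundary collocation points, and experimental points. All names and weights here are illustrative assumptions.

```python
import numpy as np

def composite_pinn_loss(theta, x_col, x_bnd, x_exp, u_exp, w=(1.0, 1.0, 1.0)):
    """Composite PINN loss for u''(x) = -2 on [0, 1] with u(0) = u(1) = 0.

    theta : scalar parameter of the trial solution (stand-in for NN weights)
    x_col : collocation points where the PDE residual is enforced
    x_bnd : boundary collocation points where u = 0 is enforced
    x_exp : experimental points with observed values u_exp
    w     : per-term weights of the composite loss
    """
    u = lambda x: theta * x * (1.0 - x)   # trial solution
    u_xx = -2.0 * theta                    # analytic second derivative (constant in x)

    # PDE residual u'' - f at each collocation point, with f(x) = -2
    r_pde = np.full_like(np.asarray(x_col, dtype=float), u_xx - (-2.0))
    loss_pde = np.mean(r_pde ** 2)

    # Boundary condition residual: u should vanish at the boundary points
    loss_bnd = np.mean(u(np.asarray(x_bnd, dtype=float)) ** 2)

    # Data-fit term on the (costly) experimental observations
    loss_exp = np.mean((u(np.asarray(x_exp, dtype=float)) - np.asarray(u_exp, dtype=float)) ** 2)

    return w[0] * loss_pde + w[1] * loss_bnd + w[2] * loss_exp

# The exact solution u(x) = x(1 - x) corresponds to theta = 1 and drives all
# three terms to zero; any other theta leaves a nonzero PDE residual.
x_col = np.linspace(0.0, 1.0, 11)
x_bnd = np.array([0.0, 1.0])
x_exp = np.array([0.25, 0.5])
u_exp = x_exp * (1.0 - x_exp)
print(composite_pinn_loss(1.0, x_col, x_bnd, x_exp, u_exp))  # ~0 for the exact solution
```

PINNACLE's contribution is in *which* points populate `x_col`, `x_bnd`, and `x_exp` (and in what proportions) as training progresses, rather than in the form of this loss itself.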