
Adversarial Adaptive Sampling: Unify PINN and Optimal Transport for the Approximation of PDEs (2305.18702v2)

Published 30 May 2023 in stat.ML, cs.LG, cs.NA, and math.NA

Abstract: Solving partial differential equations (PDEs) is a central task in scientific computing. Recently, neural network approximation of PDEs has received increasing attention due to its flexible meshless discretization and its potential for high-dimensional problems. One fundamental numerical difficulty is that random samples in the training set introduce statistical errors into the discretization of the loss functional, which may become the dominant error in the final approximation and thereby overshadow the modeling capability of the neural network. In this work, we propose a new minmax formulation to simultaneously optimize the approximate solution, given by a neural network model, and the random samples in the training set, provided by a deep generative model. The key idea is to use a deep generative model to adjust the random samples in the training set such that the residual induced by the approximate PDE solution maintains a smooth profile while it is being minimized. This is achieved by implicitly embedding the Wasserstein distance between the residual-induced distribution and the uniform distribution into the loss, which is then minimized together with the residual. A nearly uniform residual profile means that its variance is small for any normalized weight function, so that the Monte Carlo approximation error of the loss functional is significantly reduced for a given sample size. The adversarial adaptive sampling (AAS) approach proposed in this work is the first attempt to formulate the two essential components, minimizing the residual and seeking the optimal training set, as one minmax objective functional for the neural network approximation of PDEs.
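To make the minmax idea concrete, the following is a minimal sketch of an adversarial adaptive sampling loop, not the paper's actual algorithm. It uses an assumed toy 1D Poisson problem; a small neural sampler stands in for the paper's flow-based deep generative model, and the implicit Wasserstein regularization that keeps the residual profile nearly uniform is omitted, with the sampler instead simply pushed toward regions of large residual. All names (SolutionNet, Sampler, pde_residual) and the problem setup are assumptions made for illustration.

```python
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

# Manufactured problem (assumption for this demo):
#   -u''(x) = pi^2 * sin(pi * x)  on (0, 1),  u(0) = u(1) = 0,
# whose exact solution is u(x) = sin(pi * x).
def f(x):
    return math.pi ** 2 * torch.sin(math.pi * x)

def exact_u(x):
    return torch.sin(math.pi * x)

class SolutionNet(nn.Module):
    """PINN-style approximation of u; the boundary conditions are hard-coded."""
    def __init__(self, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x):
        return x * (1.0 - x) * self.net(x)  # enforces u(0) = u(1) = 0

class Sampler(nn.Module):
    """Toy trainable sampler mapping uniform noise to collocation points.
    It stands in for the flow-based generative model used in the paper."""
    def __init__(self, width=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, width), nn.Tanh(), nn.Linear(width, 1))

    def forward(self, z):
        # Start near the identity map and keep samples inside (0, 1).
        return torch.sigmoid(self.net(z) + torch.logit(z, eps=1e-4))

def pde_residual(u_net, x):
    """r(x) = -u''(x) - f(x), computed with automatic differentiation."""
    u = u_net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return -d2u - f(x)

u_net, sampler = SolutionNet(), Sampler()
opt_u = torch.optim.Adam(u_net.parameters(), lr=1e-3)
opt_s = torch.optim.Adam(sampler.parameters(), lr=1e-3)

for step in range(2000):
    z = torch.rand(256, 1)

    # Solver step: minimize the squared residual at the current samples.
    x = sampler(z).detach().requires_grad_(True)
    loss_u = pde_residual(u_net, x).pow(2).mean()
    opt_u.zero_grad()
    loss_u.backward()
    opt_u.step()

    # Sampler step (adversarial direction of the minmax objective): move
    # samples toward regions where the residual is large.  The paper
    # additionally keeps the residual profile nearly uniform via an implicit
    # Wasserstein term, which this sketch omits.
    x_s = sampler(z)
    loss_s = -pde_residual(u_net, x_s).pow(2).mean()
    opt_s.zero_grad()
    loss_s.backward()
    opt_s.step()

x_test = torch.linspace(0.0, 1.0, 101).unsqueeze(1)
with torch.no_grad():
    err = (u_net(x_test) - exact_u(x_test)).abs().max().item()
print(f"max abs error on a uniform test grid: {err:.3e}")
```

In the full method, the solver network and the sampler are coupled through a single minmax objective, so updating the sampler is what adapts the training set rather than a separate residual-based resampling heuristic.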
