Operator Learning with Neural Fields: Tackling PDEs on General Geometries (2306.07266v2)

Published 12 Jun 2023 in cs.LG and cs.AI

Abstract: Machine learning approaches for solving partial differential equations require learning mappings between function spaces. While convolutional or graph neural networks are constrained to discretized functions, neural operators represent a promising step toward mapping functions directly. Despite impressive results, they still face challenges with respect to the domain geometry and typically rely on some form of discretization. To alleviate these limitations, we present CORAL, a new method that leverages coordinate-based networks for solving PDEs on general geometries. CORAL is designed to remove constraints on the input mesh, making it applicable to any spatial sampling and geometry. Its applicability extends to diverse problem settings, including PDE solving, spatio-temporal forecasting, and inverse problems such as geometric design. CORAL demonstrates robust performance across multiple resolutions and performs well in both convex and non-convex domains, surpassing or matching state-of-the-art models.
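As a rough illustration of the coordinate-based (neural field) representation the abstract refers to, the sketch below evaluates a small SIREN-style MLP at arbitrary spatial coordinates. The architecture, layer sizes, and initialization here are illustrative assumptions, not CORAL's actual model; the point is only that such a network can be queried on any spatial sampling of the domain, whether a regular grid or an irregular point cloud.

```python
import numpy as np

# Hypothetical minimal neural field: an MLP with sine activations
# mapping 2-D coordinates to a scalar field value. Sizes and the
# frequency factor 30.0 are illustrative choices, not CORAL's.
rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    # Uniform initialization scaled by fan-in (illustrative).
    bound = np.sqrt(6.0 / n_in)
    return rng.uniform(-bound, bound, (n_in, n_out)), np.zeros(n_out)

W1, b1 = init_layer(2, 64)    # (x, y) coordinates -> hidden
W2, b2 = init_layer(64, 64)   # hidden -> hidden
W3, b3 = init_layer(64, 1)    # hidden -> scalar field value

def field(coords):
    """Evaluate the neural field at arbitrary coordinates, shape (N, 2)."""
    h = np.sin(30.0 * (coords @ W1 + b1))
    h = np.sin(h @ W2 + b2)
    return h @ W3 + b3

# The same network is queried on two different samplings of [0, 1]^2:
# a coarse regular grid and an irregular point cloud. No mesh is assumed.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 8),
                            np.linspace(0, 1, 8)), axis=-1).reshape(-1, 2)
cloud = rng.uniform(0, 1, (100, 2))

u_grid = field(grid)    # shape (64, 1)
u_cloud = field(cloud)  # shape (100, 1)
```

In an operator-learning setting such as CORAL's, the weights of such a field would not be fixed at random but conditioned on the input function (for example via meta-learning or modulation), so that one decoder represents many solutions.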

Citations (26)
