
Lagrangian Flow Networks for Conservation Laws (2305.16846v2)

Published 26 May 2023 in cs.LG, physics.data-an, physics.flu-dyn, and stat.ML

Abstract: We introduce Lagrangian Flow Networks (LFlows) for modeling fluid densities and velocities continuously in space and time. By construction, the proposed LFlows satisfy the continuity equation, a PDE describing mass conservation in its differential form. Our model is based on the insight that solutions to the continuity equation can be expressed as time-dependent density transformations via differentiable and invertible maps. This follows from classical theory of the existence and uniqueness of Lagrangian flows for smooth vector fields. Hence, we model fluid densities by transforming a base density with parameterized diffeomorphisms conditioned on time. The key benefit compared to methods relying on numerical ODE solvers or PINNs is that the analytic expression of the velocity is always consistent with changes in density. Furthermore, we require neither expensive numerical solvers, nor additional penalties to enforce the PDE. LFlows show higher predictive accuracy in density modeling tasks compared to competing models in 2D and 3D, while being computationally efficient. As a real-world application, we model bird migration based on sparse weather radar measurements.
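The core construction described in the abstract can be illustrated with a minimal 1D sketch. The toy diffeomorphism, its parameters `A` and `B`, and the Gaussian base density below are all hypothetical stand-ins, not the paper's architecture: the point is only that once the density is defined by a change of variables through a time-conditioned invertible map, the velocity read off from the map's time derivative satisfies the continuity equation by construction, which we check numerically with JAX autodiff.

```python
# Hypothetical toy illustration of the LFlows idea in 1D: push a base
# density through a time-conditioned diffeomorphism phi(z, t) and derive
# the velocity from phi's time derivative. The continuity equation
# d(rho)/dt + d(rho*v)/dx = 0 then holds by construction.
import jax
import jax.numpy as jnp

A, B = 0.5, 1.0  # illustrative parameters of the toy map (not from the paper)

def phi(z, t):            # time-conditioned invertible map, base coords z -> x
    return z * jnp.exp(A * t) + B * t

def phi_inv(x, t):        # analytic inverse, x -> z
    return (x - B * t) * jnp.exp(-A * t)

def log_base(z):          # base density: standard normal
    return -0.5 * (z**2 + jnp.log(2.0 * jnp.pi))

def density(x, t):        # change of variables: rho(x,t) = rho0(z) * |dz/dx|
    z = phi_inv(x, t)
    dz_dx = jax.grad(phi_inv)(x, t)
    return jnp.exp(log_base(z)) * jnp.abs(dz_dx)

def velocity(x, t):       # v(x,t) = d/dt phi(z,t), evaluated at z = phi_inv(x,t)
    z = phi_inv(x, t)
    return jax.grad(phi, argnums=1)(z, t)

def continuity_residual(x, t):
    # d(rho)/dt + d(rho*v)/dx -- zero up to floating-point error
    drho_dt = jax.grad(density, argnums=1)(x, t)
    dflux_dx = jax.grad(lambda y: density(y, t) * velocity(y, t))(x)
    return drho_dt + dflux_dx

print(continuity_residual(0.7, 0.3))  # ~0 up to float error
```

Note the contrast with PINN-style approaches the abstract mentions: here no penalty term is needed to enforce the PDE, because the residual vanishes identically for any invertible, differentiable map.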

