Joint Parameter and Parameterization Inference with Uncertainty Quantification through Differentiable Programming (2403.02215v3)
Abstract: Accurate representation of unknown and subgrid physical processes through parameterizations (or closures) in numerical simulations, with quantified uncertainty, is critical for resolving the coarse-grained partial differential equations that govern problems ranging from weather and climate prediction to turbulence simulation. Recent advances have seen machine learning (ML) increasingly applied to model these subgrid processes, yielding hybrid physics-ML models through integration with numerical solvers. In this work, we introduce a novel framework for the joint estimation of physical parameters and machine-learning parameterizations with uncertainty quantification. Our framework incorporates online training and efficient Bayesian inference within a high-dimensional parameter space, facilitated by differentiable programming. This proof of concept underscores the substantial potential of differentiable programming to synergistically combine machine learning with differential equations, thereby enhancing the capabilities of hybrid physics-ML modeling.
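To make the core idea concrete, the sketch below shows, under assumptions, how differentiable programming can expose joint gradients with respect to a physical parameter and the weights of a neural-network closure through a solver rollout. This is a minimal illustration, not the paper's implementation: the toy dynamics, the explicit-Euler integrator, and the names `forcing`, `closure`, and `rollout` are hypothetical, and JAX is used only as one possible differentiable-programming backend. The resulting gradients could drive an optimizer for online training or, for Bayesian uncertainty quantification, a gradient-based sampler such as stochastic-gradient HMC.

```python
# Hypothetical sketch: differentiable rollout of a toy coarse-grained system in JAX,
# with a scalar physical parameter and a small NN closure estimated jointly.
import jax
import jax.numpy as jnp

def init_params(key, state_dim=8, hidden=16):
    """Physical parameter plus NN closure weights in a single pytree."""
    k1, k2 = jax.random.split(key)
    return {
        "forcing": jnp.array(8.0),  # illustrative physical parameter
        "nn": {
            "W1": 0.1 * jax.random.normal(k1, (state_dim, hidden)),
            "b1": jnp.zeros(hidden),
            "W2": 0.1 * jax.random.normal(k2, (hidden, state_dim)),
            "b2": jnp.zeros(state_dim),
        },
    }

def closure(nn, x):
    """Small MLP standing in for the learned subgrid parameterization."""
    h = jnp.tanh(x @ nn["W1"] + nn["b1"])
    return h @ nn["W2"] + nn["b2"]

def rhs(params, x):
    """Toy coarse-grained tendency: placeholder resolved physics + ML closure term."""
    resolved = -x + params["forcing"]
    return resolved + closure(params["nn"], x)

def rollout(params, x0, dt=0.01, steps=200):
    """Differentiable explicit-Euler integration (online / a posteriori setting)."""
    def step(x, _):
        x_next = x + dt * rhs(params, x)
        return x_next, x_next
    _, traj = jax.lax.scan(step, x0, None, length=steps)
    return traj

def loss(params, x0, obs):
    """Trajectory-matching loss against (synthetic) observations."""
    return jnp.mean((rollout(params, x0) - obs) ** 2)

key = jax.random.PRNGKey(0)
params = init_params(key)
x0 = jax.random.normal(key, (8,))
obs = jnp.zeros((200, 8))  # stand-in observations

# One reverse-mode sweep yields gradients w.r.t. the physical parameter and the
# closure weights together; these can feed an optimizer or a gradient-based sampler.
grads = jax.grad(loss)(params, x0, obs)
print(grads["forcing"].shape, jax.tree_util.tree_map(jnp.shape, grads["nn"]))
```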
Authors: Yongquan Qu, Mohamed Aziz Bhouri, Pierre Gentine