Physics-integrated generative modeling using attentive planar normalizing flow based variational autoencoder (2404.12267v1)
Abstract: Physics-integrated generative modeling is a class of hybrid or grey-box modeling in which we augment a data-driven model with the physics knowledge governing the data distribution. The use of physics knowledge allows the generative model to produce output in a controlled way, so that the output, by construction, complies with the physical laws. It imparts improved generalization ability to extrapolate beyond the training distribution, as well as improved interpretability, because the model is partly grounded in firm domain knowledge. In this work, we aim to improve the fidelity of reconstruction and the robustness to noise of a physics-integrated generative model. To this end, we use a variational autoencoder (VAE) as the generative model. To improve the reconstruction quality of the decoder, we propose to learn the latent posterior distribution of both the physics-based and the trainable data-driven components using a planar normalizing flow. The normalizing-flow-based posterior harnesses the inherent dynamical structure of the data, so the learned model gets closer to the true underlying data distribution. To improve the robustness of the generative model against noise injected into the model, we propose a modification to the encoder of the normalizing-flow-based VAE. We design the encoder to incorporate scaled dot-product attention based contextual information into the noisy latent vector, which mitigates the adverse effect of the noise and makes the model more robust. We empirically evaluate our models on a human locomotion dataset [33], and the results validate the efficacy of the proposed models in terms of both improved reconstruction quality and robustness against noise injected into the model.
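To make the two main ingredients concrete, the sketch below gives a minimal PyTorch illustration of (i) a single planar flow layer of the kind used to enrich the latent posterior (Rezende and Mohamed, 2015) and (ii) a scaled dot-product attention step (Vaswani et al., 2017) in which a noisy latent vector queries contextual encoder features. This is a hedged sketch under assumed shapes and helper names (`PlanarFlow`, `attention_refine`), not the exact encoder/decoder architecture of the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PlanarFlow(nn.Module):
    """One planar transform f(z) = z + u_hat * tanh(w^T z + b) (Rezende & Mohamed, 2015)."""

    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def _u_hat(self):
        # Reparameterize u so that w^T u >= -1, keeping the transform invertible.
        wu = self.w @ self.u
        return self.u + (F.softplus(wu) - 1.0 - wu) * self.w / (self.w @ self.w + 1e-8)

    def forward(self, z):
        # z: (batch, dim) sample from the base posterior produced by the encoder
        u_hat = self._u_hat()
        lin = z @ self.w + self.b                                  # (batch,)
        f_z = z + u_hat * torch.tanh(lin).unsqueeze(-1)            # flowed sample
        psi = (1.0 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w  # tanh'(lin) * w
        log_det = torch.log(torch.abs(1.0 + psi @ u_hat) + 1e-8)   # (batch,)
        return f_z, log_det

def attention_refine(z_noisy, context):
    """Scaled dot-product attention: the noisy latent (query) is re-expressed as a
    weighted combination of contextual encoder features (keys/values)."""
    # z_noisy: (batch, 1, dim), context: (batch, seq, dim)
    d = z_noisy.size(-1)
    scores = z_noisy @ context.transpose(-2, -1) / d ** 0.5        # (batch, 1, seq)
    return F.softmax(scores, dim=-1) @ context                     # (batch, 1, dim)
```

Stacking K such planar layers maps a base sample z_0 to z_K with log q(z_K) = log q_0(z_0) - sum_k log|det J_k|, which is the term that replaces the usual Gaussian posterior density in the ELBO.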
- Combining physical simulators and object-based networks for control. In Proceedings of the 2019 IEEE International Conference on Robotics and Automation, pages 3217–322, 2019.
- Augmenting physical simulators with stochastic neural networks: Case study of planar pushing and bouncing. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 3066–3073, 2018.
- Deep attentive variational inference. In International Conference on Learning Representations, 2021.
- Self-supervised learning with physics-aware neural networks – I. Galaxy model fitting. Monthly Notices of the Royal Astronomical Society, 498(3):3713–3719, 2020.
- A physics-aware learning architecture with input transfer networks for predictive modeling. Applied Soft Computing, 96:106665, 2020.
- Variational inference: A review for statisticians. Journal of the American Statistical Association, 112(518):859–877, 2017.
- E. Challis and D. Barber. Affine independent variational inference. In NIPS, 2012.
- Physics-informed generative adversarial networks for sequence generation with limited data. In NeurIPS Workshop on Interpretable Inductive Biases and Physically Structured Learning, 2020.
- Combining differentiable pde solvers and graph neural networks for fluid flow prediction. In Proceedings of the 37th International Conference on Machine Learning, pages 2402–2411, 2020.
- Deep learning for physical processes: Incorporating prior scientific knowledge. Journal of Statistical Mechanics: Theory and Experiment, 2019(12):124009, 2019.
- Continuous graph flow. 2019. arXiv:1908.02436.
- Nice: Non-linear independent components estimation. 2014. arXiv:1410.8516.
- Bridging dynamical models and deep networks to solve forward and inverse problems. In NeurIPS workshop on Interpretable Inductive Biases and Physically Structured Learning, 2020.
- Nonparametric variational inference. In ICML, 2012.
- Hamiltonian neural networks. In Advances in Neural Information Processing Systems, 2019.
- V. Le Guen and N. Thome. Disentangling physical dynamics from unknown factors for unsupervised video prediction. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition, pages 11471–11481, 2020.
- Moglow: Probabilistic and controllable motion synthesis using normalising flows. 2019. arXiv:1905.06598.
- Graph residual flow for molecular graph generation. 2019. arXiv:1909.13521.
- Integer discrete flows and lossless compression. In Advances in Neural Information Processing Systems, 2019.
- Improving the mean field approximation via the use of mixture distributions. In Learning in graphical models, pages 163–173. Springer, 1998.
- An introduction to variational methods for graphical models. Machine learning, 37(2):183–233, 1999.
- Physics-informed machine learning. Nature Reviews Physics, 2021.
- Theory-guided data science: A new paradigm for scientific discovery from data. IEEE Transactions on Knowledge and Data Engineering, 29(10):2318–2331, 2017.
- Physics-guided neural networks (pgnn): An application in lake temperature modeling. 2017. arXiv:1710.11431.
- Adjointnet: Constraining machine learning models with physics-based codes. 2021. arXiv:2109.03956.
- Ambientflow: Invertible generative models from incomplete, noisy measurements. Transactions on Machine Learning Research, 2024.
- D. P. Kingma and M. Welling. Auto-encoding variational bayes. In International Conference on Learning Representations, 2014.
- Adam: A method for stochastic optimization. In Yoshua Bengio and Yann LeCun, editors, 3rd International Conference on Learning Representations, 2015.
- Glow: Generative flow with invertible 1x1 convolutions. In Advances in Neural Information Processing Systems, pages 10215–10224, 2018.
- Improved variational inference with inverse autoregressive flow. In Advances in Neural Information Processing Systems, pages 4743–4751, 2016.
- Normalizing flows: An introduction and review of current methods. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020.
- Hybrid physical-deep learning model for astronomical inverse problems. 2019. arXiv:1912.03980.
- Human kinematic, kinetic and emg data during different walking and stair ascending and descending tasks. Scientific Data, 6(1):309, 2019.
- Kohn-Sham equations as regularizer: Building prior knowledge into machine-learned physics. Physical Review Letters, 126(3):036401, 2020.
- Generative ode modeling with known unknowns. 2020. arXiv:2003.10775.
- Y. Long and X. She. Hybridnet: Integrating model-based and data-driven learning to predict evolution of dynamical systems. In Proceedings of the 2nd Conference on Robot Learning, pages 551–560, 2018.
- Multiplicative normalizing flows for variational Bayesian neural networks. In Proceedings of the 34th International Conference on Machine Learning, pages 2218–2227, 2017.
- Graph-nvp: An invertible flow model for generating molecular graphs. 2019. arXiv:1905.11600.
- Neural dynamical systems: Balancing structure and flexibility in physical prediction. 2020. arXiv:2006.12682.
- A. Mnih and K. Gregor. Neural variational inference and learning in belief networks. In ICML, 2014.
- Phynet: Physics guided neural networks for particle drag force prediction in assembly. In Proceedings of the 2020 SIAM International Conference on Data Mining, pages 559–567, 2020.
- Data-driven urban energy simulation (due-s): A framework for integrating engineering simulation and machine learning methods in a multi-scale urban energy modeling workflow. Applied Energy, 225:1176–1189, 2018.
- Normalizing flows for probabilistic modeling and inference. Journal of Machine Learning Research, 22(57):1–64, 2019.
- Masked autoregressive flow for density estimation. In Advances in Neural Information Processing Systems, pages 2338–2347, 2017.
- Grey-box models for wave loading prediction. Mechanical Systems and Signal Processing, 159:107741, 2021.
- A hybrid neural network-first principles approach to process modeling. AIChE Journal, 38(10):1499–1511, 1992.
- Integrating expert odes into neural odes: Pharmacology and disease progression. 2021. arXiv:2106.02875.
- Universal differential equations for scientific machine learning. 2020. arXiv:2001.04385.
- Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378:686–707, 2019.
- Hybrid analytical and data-driven modeling for feed-forward robot control. Sensors, 17(2):311, 2017.
- Stochastic backpropagation and approximate inference in deep generative models. In Proceedings of the 31st International Conference on Machine Learning, 2014.
- Variational inference with normalizing flows. In Francis Bach and David Blei, editors, Proceedings of the 32nd International Conference on Machine Learning, volume 37 of Proceedings of Machine Learning Research, pages 1530–1538, Lille, France, 07–09 Jul 2015. PMLR.
- M. Rixner and P.-S. Koutsourelakis. A probabilistic generative model for semi-supervised training of coarse-grained surrogates and enforcing physical constraints through virtual observables. 2020. arXiv:2006.01789.
- Modeling system dynamics with physics-informed neural networks based on Lagrangian mechanics. 2020. arXiv:2005.14617.
- Latent ordinary differential equations for irregularly sampled time series. In Advances in Neural Information Processing Systems, pages 5321–5331, 2019.
- Markov chain Monte Carlo and variational inference: Bridging the gap. In ICML, 2015.
- Ensembling geophysical models with Bayesian neural networks. In Advances in Neural Information Processing Systems, 2020.
- Pi-lstm: Physics-infused long short-term memory network. In Proceedings of the 18th IEEE International Conference on Machine Learning and Applications, pages 34–41, 2019.
- S. Kaltenbach and P.-S. Koutsourelakis. Incorporating physical constraints in a deep probabilistic machine learning framework for coarse-graining dynamical systems. Journal of Computational Physics, 419:109673, 2020.
- R. Stewart and S. Ermon. Label-free supervision of neural networks with physics and domain knowledge. In Proceedings of the 31st AAAI Conference on Artificial Intelligence, pages 2576–2582, 2017.
- Enforcing constraints for interpolation and extrapolation in generative adversarial networks. Journal of Computational Physics, 397:108844, 2019.
- A family of nonparametric density estimation algorithms. Communications on Pure and Applied Mathematics, 66(2):145–164, 2013.
- Physics-integrated variational autoencoders for robust and interpretable generative modeling. In A. Beygelzimer, Y. Dauphin, P. Liang, and J. Wortman Vaughan, editors, Advances in Neural Information Processing Systems, 2021.
- Improving variational auto-encoders using householder flow. In NeurIPS Workshop on Bayesian Deep Learning, 2016.
- Hamiltonian generative networks. In Proceedings of the 8th International Conference on Learning Representations, 2020.
- Discrete flows: Invertible generative models of discrete data. In Advances in Neural Information Processing Systems, 2019.
- R. E. Turner and M. Sahani. Two problems with variational expectation maximisation for time-series models. In D. Barber, T. Cemgil, and S. Chiappa, editors, Bayesian Time series models, chapter 5, pages 109–130. Cambridge University Press, Cambridge, 2011.
- Sylvester normalizing flows for variational inference. In The 34th Conference on Uncertainty in Artificial Intelligence, 2018.
- Attention is all you need. In I. Guyon, U. Von Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, editors, Advances in Neural Information Processing Systems, volume 30, 2017.
- Estimating model inadequacy in ordinary differential equations with physics-informed neural networks. Computers & Structures, 245:106458, 2021.
- Informed machine learning – a taxonomy and survey of integrating knowledge into learning systems. 2020. arXiv:1903.12394v2.
- Combining machine learning and simulation to a hybrid modelling approach: Current and future directions. In Advances in Intelligent Data Analysis XVIII, number 12080 in Lecture Notes in Computer Science, pages 548–560. Springer, 2020.
- Data-assisted reduced-order modeling of extreme events in complex dynamical systems. PLOS ONE, 13(5):e0197704, 2018.
- Integrating model-driven and data-driven methods for power system frequency stability assessment and control. IEEE Transactions on Power Systems, 34(6):4557–4568, 2019.
- Understanding and mitigating gradient flow pathologies in physics-informed neural networks. SIAM Journal on Scientific Computing, 43(5):A3055–A3081, 2021.
- Integrating physics-based modeling with machine learning: A survey. 2020. arXiv:2003.04919.
- Learning likelihoods with conditional normalizing flows. 2019. arXiv:1912.00042.
- Pointflow: 3d point cloud generation with continuous normalizing flows. In Proceedings of the International Conference on Computer Vision, 2019.
- Augmenting physical models with deep networks for complex dynamics forecasting. In Proceedings of the 9th International Conference on Learning Representations, 2021.
- A physics-aware learning architecture with input transfer networks for predictive modeling. Applied Soft Computing, 53:205–216, 2017.
- Ode2vae: Deep generative second order odes with Bayesian neural networks. In Advances in Neural Information Processing Systems, pages 13412–13421, 2019.
- Tossingbot: Learning to throw arbitrary objects with residual physics. In Proceedings of Robotics: Science and Systems, 2019.