Towards Cross Domain Generalization of Hamiltonian Representation via Meta Learning (2212.01168v4)
Abstract: Recent advances in deep learning for physics have focused on discovering shared representations of target systems by incorporating physics priors or inductive biases into neural networks. While effective, these methods are limited to a single system domain, where the type of system remains fixed, and thus cannot ensure adaptation to new or unseen physical systems governed by different laws. For instance, a neural network trained on a mass-spring system cannot guarantee accurate predictions for the behavior of a two-body system or any other system with different physical laws. In this work, we take a significant step forward by targeting cross-domain generalization within the field of Hamiltonian dynamics. We model each system with a graph neural network (GNN) and employ a meta-learning algorithm so that the model gains experience over a distribution of systems and can adapt to new physics. Our approach aims to learn a unified Hamiltonian representation that generalizes across multiple system domains, thereby overcoming the limitations of system-specific models. We demonstrate that the meta-trained model captures a generalized Hamiltonian representation that is consistent across different physical domains. Overall, through the use of meta-learning, we offer a framework that achieves cross-domain generalization, providing a step towards a unified model for understanding a wide array of dynamical systems via deep learning.
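The meta-learning setup described in the abstract — gaining experience over a distribution of systems so that a shared initialization adapts quickly to any one of them — can be illustrated with a minimal, self-contained sketch. This is not the paper's GNN-based method; it is a schematic first-order meta-update (Reptile-style, in the spirit of MAML) on a hypothetical toy family of mass-spring systems that differ only in stiffness `k`, where the "model" is a single scalar estimate of the force law. All names and hyperparameters here are illustrative assumptions.

```python
import random

def task_loss(w, k, qs):
    # Fit the spring force law F(q) = -k*q with a linear model F_hat(q) = -w*q.
    return sum(((-w * q) - (-k * q)) ** 2 for q in qs) / len(qs)

def task_grad(w, k, qs):
    # Analytic gradient of task_loss with respect to w.
    return sum(2.0 * ((-w * q) + k * q) * (-q) for q in qs) / len(qs)

def meta_train(ks, inner_lr=0.1, meta_lr=0.5, meta_steps=200, inner_steps=5):
    """Meta-learn an initialization w over a distribution of spring systems.

    Each meta-step samples one system (one stiffness k), adapts to it with a
    few inner gradient steps, then moves the shared initialization toward the
    adapted weights (first-order / Reptile-style meta-update).
    """
    random.seed(0)
    w = 0.0
    qs = [0.5, 1.0, 1.5]  # shared observation points
    for _ in range(meta_steps):
        k = random.choice(ks)          # sample a system from the distribution
        w_adapt = w
        for _ in range(inner_steps):   # inner loop: adapt to this system
            w_adapt -= inner_lr * task_grad(w_adapt, k, qs)
        w += meta_lr * (w_adapt - w)   # outer loop: update the initialization
    return w

w_meta = meta_train(ks=[0.5, 1.0, 1.5, 2.0])
```

The point of the sketch is the structure, not the model: after meta-training, `w_meta` sits inside the task family, so a handful of gradient steps specializes it to any individual system faster than adapting from scratch — the same intuition the paper scales up to Hamiltonian representations learned by a GNN.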