Discovering Symmetry Breaking in Physical Systems with Relaxed Group Convolution (2310.02299v7)

Published 3 Oct 2023 in cs.LG and cs.AI

Abstract: Modeling symmetry breaking is essential for understanding the fundamental changes in the behaviors and properties of physical systems, from microscopic particle interactions to macroscopic phenomena like fluid dynamics and cosmic structures. Identifying the sources of asymmetry is therefore an important tool for understanding these systems. In this paper, we focus on learning the asymmetries of data using relaxed group convolutions. We provide both theoretical and empirical evidence that this flexible convolution technique allows the model to maintain the highest level of equivariance consistent with the data while discovering subtle symmetry-breaking factors. We employ relaxed group convolution architectures to uncover interpretable, physically meaningful symmetry-breaking factors in a range of physical systems, including phase transitions of crystal structures, the breaking of isotropy and homogeneity in turbulent flow, and time-reversal symmetry breaking in pendulum systems.
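
The central mechanism can be made concrete with a small sketch. A relaxed group convolution shares a bank of base filters across the group elements but attaches a learnable mixing weight to each element, so the layer can start out exactly equivariant and deviate only where the data demands it. The Python sketch below illustrates this idea for the discrete rotation group C4 using PyTorch; the class name, the choice of a lifting layer, and the size of the filter bank are illustrative assumptions, not the paper's exact architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F


class RelaxedC4LiftingConv(nn.Module):
    # Minimal sketch of a relaxed lifting convolution for the C4 rotation group.
    # A bank of base filters is shared across the four rotations; each rotation
    # gets its own learnable mixing weights. Equal weights give an exactly
    # C4-equivariant layer; learned differences expose symmetry breaking.

    def __init__(self, in_channels, out_channels, kernel_size, num_filters=2):
        super().__init__()
        self.group_order = 4  # |C4| = rotations by 0, 90, 180, 270 degrees
        # Shared filter bank: (num_filters, out_channels, in_channels, k, k)
        self.filters = nn.Parameter(
            0.1 * torch.randn(num_filters, out_channels, in_channels,
                              kernel_size, kernel_size)
        )
        # Relaxed weights: one coefficient per (group element, base filter),
        # initialized equal so training starts from exact equivariance.
        self.relaxed_weights = nn.Parameter(torch.ones(self.group_order, num_filters))

    def forward(self, x):
        # x: (batch, in_channels, H, W)
        outputs = []
        for g in range(self.group_order):
            # Act on the filters with the group element g (rotate by g * 90 degrees).
            rotated = torch.rot90(self.filters, k=g, dims=(-2, -1))
            # Mix the rotated filter bank with the element-specific weights.
            kernel = torch.einsum("l,locij->ocij", self.relaxed_weights[g], rotated)
            outputs.append(F.conv2d(x, kernel, padding="same"))
        # New group axis: (batch, |C4|, out_channels, H, W)
        return torch.stack(outputs, dim=1)


# Usage: after training, unequal weights across the group axis indicate how the
# data breaks the rotational symmetry.
layer = RelaxedC4LiftingConv(in_channels=1, out_channels=8, kernel_size=3)
features = layer(torch.randn(4, 1, 32, 32))   # (4, 4, 8, 32, 32)
asymmetry = layer.relaxed_weights.var(dim=0)  # per-filter spread over C4

After training, comparing the learned relaxed weights across the four rotations (for example, their variance per base filter) indicates how strongly the data breaks rotational symmetry, while equal weights recover an exactly C4-equivariant layer.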
