A Characterization Theorem for Equivariant Networks with Point-wise Activations (2401.09235v1)
Abstract: Equivariant neural networks have shown improved performance, expressiveness, and sample complexity on symmetric domains. However, for certain symmetries, representations, and choices of coordinates, the most common point-wise activations, such as ReLU, are not equivariant and therefore cannot be used in the design of equivariant neural networks. The theorem we present in this paper describes all possible combinations of finite-dimensional representations, choices of coordinates, and point-wise activations that yield an exactly equivariant layer, generalizing and strengthening existing characterizations. Notable cases of practical relevance are discussed as corollaries. In particular, we prove that rotation-equivariant networks can only be invariant, as is the case for any network that is equivariant with respect to a connected compact group. We then discuss the implications of our findings for important instances of exactly equivariant networks. First, we completely characterize permutation-equivariant networks with point-wise nonlinearities, such as Invariant Graph Networks, and their geometric counterparts, highlighting a plethora of models whose expressive power and performance are still unknown. Second, we show that the feature spaces of disentangled steerable convolutional neural networks are trivial representations.
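To make the phenomenon behind the characterization concrete, the following minimal NumPy sketch (an illustration written for this summary, not code from the paper; the `relu` helper and the variable names are ours) checks whether point-wise ReLU commutes with a group action on feature coordinates. It commutes with a permutation representation but fails to commute with a generic 2D rotation representation, an informal view of the obstruction that the theorem characterizes.

```python
import numpy as np

def relu(x):
    # Point-wise (coordinate-wise) activation.
    return np.maximum(x, 0.0)

x = np.array([0.7, -1.2, 2.5])

# Permutation action: point-wise ReLU is equivariant.
P = np.eye(3)[[2, 0, 1]]  # cyclic permutation matrix
print(np.allclose(relu(P @ x), P @ relu(x)))  # True

# Rotation action on R^2: point-wise ReLU is, in general, not equivariant.
theta = np.pi / 3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
v = np.array([1.0, -0.5])
print(np.allclose(relu(R @ v), R @ relu(v)))  # False
```

The permutation check prints True for any input, while the rotation check prints False for generic inputs and angles (it can coincidentally hold when all coordinates remain non-negative before and after rotation).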
- Marco Pacini
- Xiaowen Dong
- Bruno Lepri
- Gabriele Santin