On the Role of Neural Collapse in Meta Learning Models for Few-shot Learning (2310.00451v2)
Abstract: Meta-learning frameworks for few-shot learning aim to learn models that can acquire new skills or adapt to new environments rapidly from a few training examples, so that the resulting model generalizes to new classes with just a few labelled samples. However, these networks are treated as black-box models, and understanding the representations learnt under different learning scenarios is crucial. Neural collapse ($\mathcal{NC}$) is a recently discovered phenomenon that exhibits distinctive properties as the network proceeds towards zero loss: the input features collapse to their respective class means, the class means form a simplex equiangular tight frame (ETF) in which they are maximally distant and linearly separable, and the classifier acts as a simple nearest neighbor classifier. While these phenomena have been observed in simple classification networks, this study is the first to explore and understand the properties of neural collapse in meta-learning frameworks for few-shot learning. We perform studies on the Omniglot dataset in the few-shot setting and examine the neural collapse phenomenon. We observe that the learnt features do trend towards neural collapse, especially as model size grows, but do not necessarily exhibit complete collapse as measured by the $\mathcal{NC}$ properties.
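The abstract lists three measurable $\mathcal{NC}$ properties. The sketch below is not taken from the paper; it is a minimal illustration, assuming penultimate-layer features have already been extracted into numpy arrays (`features`, `labels` are hypothetical names), of how such indicators are commonly quantified: a within-/between-class variability ratio as a proxy for feature collapse, and the spread of pairwise cosines between centred class means as a simplex-ETF check.

```python
import numpy as np

def neural_collapse_metrics(features, labels):
    """Rough neural-collapse indicators on penultimate-layer features.

    features: (N, d) array of embeddings; labels: (N,) integer class labels.
    Returns (variability_ratio, cosine_std, cosine_mean):
      - variability_ratio -> 0 as features collapse to their class means,
      - cosine_std -> 0 and cosine_mean -> -1/(C-1) as class means approach a simplex ETF.
    """
    classes = np.unique(labels)
    global_mean = features.mean(axis=0)

    class_means = np.stack([features[labels == c].mean(axis=0) for c in classes])
    centered_means = class_means - global_mean  # (C, d)

    # Within-class scatter: mean squared distance of samples to their own class mean.
    within = np.mean([
        np.sum((features[labels == c] - class_means[i]) ** 2, axis=1).mean()
        for i, c in enumerate(classes)
    ])
    # Between-class scatter: mean squared distance of class means to the global mean.
    between = np.mean(np.sum(centered_means ** 2, axis=1))
    variability_ratio = within / between

    # ETF check: compare all pairwise cosines between centred, normalized class means.
    normed = centered_means / np.linalg.norm(centered_means, axis=1, keepdims=True)
    cosines = normed @ normed.T
    off_diag = cosines[~np.eye(len(classes), dtype=bool)]
    return variability_ratio, off_diag.std(), off_diag.mean()
```

Under full collapse the variability ratio approaches zero and every pairwise cosine approaches $-1/(C-1)$ for $C$ classes; the paper's observation of a partial trend corresponds to these quantities shrinking with model size without reaching their limiting values. Note this ratio is a simplified stand-in for the trace-based $\mathcal{NC}$1 statistic used in the neural collapse literature.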