Episodic-free Task Selection for Few-shot Learning (2402.00092v1)
Abstract: Episodic training is a mainstream training strategy for few-shot learning. In few-shot scenarios, however, this strategy is often inferior to some non-episodic training strategies, e.g., Neighbourhood Component Analysis (NCA), which challenges the principle that training conditions must match testing conditions. This naturally raises a question: how can we search for episodic-free tasks that yield better few-shot learning? In this work, we propose a novel meta-training framework that goes beyond episodic training. In this framework, episodic tasks are not used directly for training; instead, they evaluate the effectiveness of episodic-free tasks selected from a task set, and the selected tasks are the ones used to train the meta-learners. The selection criterion is based on affinity, which measures the degree to which the loss on the target tasks decreases after training with the selected tasks. In experiments, the training task set contains several promising task types, e.g., contrastive learning and classification, and the target few-shot tasks are solved with nearest-centroid classifiers on the miniImageNet, tieredImageNet and CIFAR-FS datasets. The experimental results demonstrate the effectiveness of our approach.
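The affinity criterion described above can be sketched in a few lines: a candidate training task's affinity is the decrease in the target (episodic) task's loss after taking a training step on that candidate. The sketch below is an illustrative assumption, not the paper's implementation; the toy parameter representation, the `affinity` and `select_tasks` names, and the single-step gradient update are all hypothetical simplifications.

```python
# Hedged sketch of affinity-based task selection. Affinity of a candidate
# task = (target loss before) - (target loss after) one gradient step on
# the candidate task. Positive affinity means the candidate helps the
# target task. Parameters are plain Python lists for self-containedness;
# a real implementation would use a deep-learning framework.

def affinity(params, candidate_grad, target_loss_fn, lr=0.1):
    """Decrease in target-task loss after one step on the candidate task."""
    loss_before = target_loss_fn(params)
    updated = [p - lr * g for p, g in zip(params, candidate_grad)]
    loss_after = target_loss_fn(updated)
    return loss_before - loss_after

def select_tasks(params, candidate_grads, target_loss_fn, k=1):
    """Pick the k candidate tasks with the highest affinity to the target."""
    scores = [(affinity(params, g, target_loss_fn), i)
              for i, g in enumerate(candidate_grads)]
    scores.sort(reverse=True)
    return [i for _, i in scores[:k]]
```

For example, with a quadratic target loss and two candidate tasks whose gradients point toward and away from the target minimum, the aligned candidate receives positive affinity and is selected, while the opposed one receives negative affinity.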