Unsupervised Meta-Learning through Latent-Space Interpolation in Generative Models (2006.10236v1)

Published 18 Jun 2020 in cs.LG and stat.ML

Abstract: Unsupervised meta-learning approaches rely on synthetic meta-tasks that are created using techniques such as random selection, clustering and/or augmentation. Unfortunately, clustering and augmentation are domain-dependent, and thus they require either manual tweaking or expensive learning. In this work, we describe an approach that generates meta-tasks using generative models. A critical component is a novel approach of sampling from the latent space that generates objects grouped into synthetic classes forming the training and validation data of a meta-task. We find that the proposed approach, LAtent Space Interpolation Unsupervised Meta-learning (LASIUM), outperforms or is competitive with current unsupervised learning baselines on few-shot classification tasks on the most widely used benchmark datasets. In addition, the approach promises to be applicable without manual tweaking over a wider range of domains than previous approaches.
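The core mechanism the abstract describes — sampling anchor latents and grouping nearby interpolated latents into synthetic classes — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `generator` stand-in, the interpolation coefficient `eps`, and all task-size hyperparameters are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(z):
    # Stand-in for a pretrained GAN/VAE decoder mapping latent z -> object.
    # A toy linear map is used here purely for illustration.
    W = np.ones((z.shape[-1], 8))
    return z @ W

def make_meta_task(n_way=5, k_shot=1, q_queries=4, latent_dim=64, eps=0.2):
    """Build one synthetic N-way meta-task: sample one anchor latent per
    class, then interpolate fresh latents toward that anchor so the decoded
    objects form a synthetic class (a sketch of the LASIUM idea)."""
    support, query = [], []
    anchors = rng.standard_normal((n_way, latent_dim))
    for z_anchor in anchors:
        # Members of one synthetic class are latents pulled toward the
        # same anchor; small eps keeps them close in latent space.
        z_new = rng.standard_normal((k_shot + q_queries, latent_dim))
        z_class = (1 - eps) * z_anchor + eps * z_new
        x = generator(z_class)
        support.append(x[:k_shot])   # training split of the meta-task
        query.append(x[k_shot:])     # validation split of the meta-task
    return np.stack(support), np.stack(query)

s, q = make_meta_task()
print(s.shape, q.shape)  # (5, 1, 8) (5, 4, 8)
```

The resulting support/query arrays can then be fed to any gradient-based meta-learner (e.g. MAML-style inner/outer loops) without any real class labels.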

Authors (6)
  1. Siavash Khodadadeh (7 papers)
  2. Sharare Zehtabian (3 papers)
  3. Saeed Vahidian (21 papers)
  4. Weijia Wang (36 papers)
  5. Bill Lin (23 papers)
  6. Ladislau Bölöni (27 papers)
Citations (32)