
Improving Compositional Generalization Using Iterated Learning and Simplicial Embeddings (2310.18777v1)

Published 28 Oct 2023 in cs.LG and cs.AI

Abstract: Compositional generalization, the ability of an agent to generalize to unseen combinations of latent factors, is easy for humans but hard for deep neural networks. A line of research in cognitive science has hypothesized a process, "iterated learning," to help explain how human language developed this ability; the theory rests on simultaneous pressures towards compressibility (when an ignorant agent learns from an informed one) and expressivity (when it uses the representation for downstream tasks). Inspired by this process, we propose to improve the compositional generalization of deep networks by using iterated learning on models with simplicial embeddings, which can approximately discretize representations. This approach is further motivated by an analysis of compositionality based on Kolmogorov complexity. We show that this combination of changes improves compositional generalization over other approaches, demonstrating these improvements both on vision tasks with well-understood latent factors and on real molecular graph prediction tasks where the latent structure is unknown.
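
The abstract describes two mechanisms: simplicial embeddings, which softly discretize a representation by projecting blocks of it onto probability simplices via a softmax, and iterated learning, which alternates an imitation phase (a fresh "ignorant" student distills the teacher's codes, the compressibility pressure) with an interaction phase (training on the downstream task, the expressivity pressure). Below is a minimal PyTorch sketch of both ideas; the helper names (make_model, task_loader), the MSE distillation loss, shapes, temperature, and loop structure are illustrative assumptions, not the paper's exact configuration.

```python
import copy
import torch
import torch.nn.functional as F


def simplicial_embedding(z: torch.Tensor, n_groups: int, tau: float = 1.0) -> torch.Tensor:
    """Split z into n_groups blocks and softmax each block onto a simplex.

    z has shape [batch, n_groups * block_dim] (dim must divide evenly).
    Lower tau pushes each block toward a near-one-hot, i.e. approximately
    discrete, code.
    """
    batch, dim = z.shape
    block = dim // n_groups
    z = z.view(batch, n_groups, block)
    return F.softmax(z / tau, dim=-1).view(batch, dim)


def iterated_learning(make_model, task_loader, n_generations: int, n_groups: int, tau: float = 0.5):
    """Hypothetical iterated-learning loop: imitate, interact, then swap roles.

    make_model() returns a fresh (backbone, head) pair; only the backbone's
    simplicial codes are distilled across generations. The first teacher is
    untrained here; in practice one might pretrain it before the loop.
    """
    teacher, _ = make_model()
    for _ in range(n_generations):
        student, head = make_model()  # fresh, "ignorant" student
        opt = torch.optim.Adam(list(student.parameters()) + list(head.parameters()))

        # Imitation phase: the student matches the teacher's near-discrete
        # codes, exerting the compressibility pressure from the abstract.
        for x, _ in task_loader:
            with torch.no_grad():
                target = simplicial_embedding(teacher(x), n_groups, tau)
            pred = simplicial_embedding(student(x), n_groups, tau)
            loss = F.mse_loss(pred, target)
            opt.zero_grad()
            loss.backward()
            opt.step()

        # Interaction phase: supervised training on the downstream task,
        # exerting the expressivity pressure.
        for x, y in task_loader:
            logits = head(simplicial_embedding(student(x), n_groups, tau))
            loss = F.cross_entropy(logits, y)
            opt.zero_grad()
            loss.backward()
            opt.step()

        teacher = copy.deepcopy(student)  # the student becomes the next teacher
    return teacher
```

The design point this sketch tries to capture is that the near-one-hot simplicial codes give the imitation phase a compressible target, which is what the iterated-learning hypothesis relies on to favor compositional representations.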

Authors (5)
  1. Yi Ren (215 papers)
  2. Samuel Lavoie (9 papers)
  3. Mikhail Galkin (39 papers)
  4. Danica J. Sutherland (49 papers)
  5. Aaron Courville (201 papers)
Citations (13)
