Capacity, Bandwidth, and Compositionality in Emergent Language Learning (1910.11424v3)

Published 24 Oct 2019 in cs.CL, cs.AI, cs.LG, cs.MA, and stat.ML

Abstract: Many recent works have discussed the propensity, or lack thereof, of emergent languages to exhibit properties of natural languages. A favorite topic in the literature is the learning of compositionality. We note that most of these works have treated communicative bandwidth as the factor of primary importance. While bandwidth is important, it is not the only contributing factor. In this paper, we investigate the learning biases that affect the efficacy and compositionality of emergent languages. Our foremost contribution is to explore how the capacity of a neural network impacts its ability to learn a compositional language. We additionally introduce a set of evaluation metrics with which we analyze the learned languages. Our hypothesis is that there is a specific range of model capacity and channel bandwidth that induces compositional structure in the resulting language and consequently encourages systematic generalization. While we empirically find evidence for the bottom of this range, we curiously find no evidence for its top, and we believe this remains an open question for the community.
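The paper introduces its own evaluation metrics for the learned languages. As a point of reference, the sketch below implements topographic similarity, a standard compositionality measure from the emergent-language literature (not necessarily one of the paper's metrics): the correlation between pairwise distances in meaning space and in message space. It assumes meanings are attribute tuples and messages are token sequences; these names and the toy data are illustrative only.

```python
# Hedged sketch of topographic similarity (a standard compositionality
# measure, not the paper's own metric). Similar meanings mapping to
# similar messages yields a correlation near 1.0.
from itertools import combinations

from scipy.stats import spearmanr


def hamming(a, b):
    """Fraction of positions at which two equal-length tuples differ."""
    return sum(x != y for x, y in zip(a, b)) / len(a)


def edit_distance(m1, m2):
    """Levenshtein distance between two token sequences (two-row DP)."""
    prev = list(range(len(m2) + 1))
    for i, t1 in enumerate(m1, 1):
        curr = [i]
        for j, t2 in enumerate(m2, 1):
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + (t1 != t2)))  # substitution
        prev = curr
    return prev[-1]


def topographic_similarity(meanings, messages):
    """Spearman correlation between meaning-space and message-space distances."""
    pairs = list(combinations(range(len(meanings)), 2))
    d_meaning = [hamming(meanings[i], meanings[j]) for i, j in pairs]
    d_message = [edit_distance(messages[i], messages[j]) for i, j in pairs]
    rho, _ = spearmanr(d_meaning, d_message)
    return rho


# Toy example: a perfectly compositional code, one token per attribute value.
meanings = [(0, 0), (0, 1), (1, 0), (1, 1)]
messages = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
print(topographic_similarity(meanings, messages))  # 1.0
```

Under the paper's hypothesis, one would expect such a score to rise once model capacity and channel bandwidth enter the range that induces compositional structure.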

Authors (5)
  1. Cinjon Resnick (11 papers)
  2. Abhinav Gupta (178 papers)
  3. Jakob Foerster (100 papers)
  4. Andrew M. Dai (40 papers)
  5. Kyunghyun Cho (292 papers)
Citations (48)