Emergence of Compositional Language with Deep Generational Transmission (1904.09067v2)

Published 19 Apr 2019 in cs.LG, cs.AI, cs.CL, and stat.ML

Abstract: Recent work has studied the emergence of language among deep reinforcement learning agents that must collaborate to solve a task. Of particular interest are the factors that cause language to be compositional -- i.e., express meaning by combining words which themselves have meaning. Evolutionary linguists have found that in addition to structural priors like those already studied in deep learning, the dynamics of transmitting language from generation to generation contribute significantly to the emergence of compositionality. In this paper, we introduce these cultural evolutionary dynamics into language emergence by periodically replacing agents in a population to create a knowledge gap, implicitly inducing cultural transmission of language. We show that this implicit cultural transmission encourages the resulting languages to exhibit better compositional generalization.
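The core mechanism described in the abstract — periodically replacing agents in a trained population so that newcomers must acquire the language from experienced partners — can be sketched in a few lines. This is a toy illustration, not the paper's implementation: the `Agent` class and its scalar `knowledge` field are hypothetical stand-ins for the deep recurrent speaker/listener policies the paper actually trains, and `replace_every` is an assumed name for the replacement period.

```python
import random

class Agent:
    """Toy stand-in for a speaker/listener agent (hypothetical; the paper
    trains deep recurrent policies with reinforcement learning)."""
    def __init__(self, agent_id):
        self.agent_id = agent_id
        self.knowledge = 0.0  # scalar proxy for accumulated language skill

    def train_step(self):
        # Stand-in for one round of the referential game; experienced
        # agents accumulate skill while newcomers start from zero.
        self.knowledge += 1.0

def cultural_transmission_loop(num_agents=4, total_steps=100,
                               replace_every=25, seed=0):
    """Train a population, periodically re-initializing one agent to create
    the knowledge gap that implicitly induces cultural transmission."""
    rng = random.Random(seed)
    population = [Agent(i) for i in range(num_agents)]
    next_id = num_agents
    for step in range(1, total_steps + 1):
        for agent in population:
            agent.train_step()
        if step % replace_every == 0:
            # Replace a uniformly random agent with a fresh initialization;
            # the newcomer must learn the language from the survivors.
            victim = rng.randrange(num_agents)
            population[victim] = Agent(next_id)
            next_id += 1
    return population
```

Because replacements happen mid-training, the population always mixes fluent "teachers" with naive "learners", which is the implicit pressure toward compositional, easily-transmitted languages that the paper studies.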

Authors (5)
  1. Michael Cogswell (19 papers)
  2. Jiasen Lu (32 papers)
  3. Stefan Lee (62 papers)
  4. Devi Parikh (129 papers)
  5. Dhruv Batra (160 papers)
Citations (49)