Compositional Languages Emerge in a Neural Iterated Learning Model (2002.01365v2)

Published 4 Feb 2020 in cs.CL, cs.AI, and cs.LG

Abstract: The principle of compositionality, which enables natural language to represent complex concepts via a structured combination of simpler ones, allows us to convey an open-ended set of messages using a limited vocabulary. If compositionality is indeed a natural property of language, we may expect it to appear in communication protocols that are created by neural agents in language games. In this paper, we propose an effective neural iterated learning (NIL) algorithm that, when applied to interacting neural agents, facilitates the emergence of a more structured type of language. Indeed, these languages provide learning-speed advantages to neural agents during training, which can be incrementally amplified via NIL. We provide a probabilistic model of NIL and an explanation of why the advantage of compositional language exists. Our experiments confirm our analysis and also demonstrate that the emergent languages greatly improve the generalization ability of neural agent communication.
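The NIL algorithm described in the abstract follows the classic iterated-learning template: each generation of agents first learns from the (meaning, message) data produced by the previous generation, then interacts in a language game, then transmits a subset of its language to the next generation. Below is a minimal sketch of that outer generational loop, assuming toy table-based agents rather than the paper's neural speaker/listener pair; the `Speaker` class, the phase functions, and all parameters are illustrative assumptions, not the paper's code.

```python
# Hypothetical sketch of an iterated-learning loop. The toy Speaker memorizes
# (meaning, message) pairs rather than generalizing, so this illustrates only
# the loop structure, not the emergence of compositionality itself.
import random

ATTRIBUTES = [("red", "blue"), ("circle", "square")]  # toy 2-attribute meaning space
VOCAB = ["a", "b", "c", "d"]

def all_meanings():
    colors, shapes = ATTRIBUTES
    return [(c, s) for c in colors for s in shapes]

class Speaker:
    """Maps meanings to two-symbol messages; starts from a random lexicon."""
    def __init__(self):
        self.lexicon = {m: (random.choice(VOCAB), random.choice(VOCAB))
                        for m in all_meanings()}

    def learn(self, data):
        # Learning phase: imitate the previous generation's observed pairs.
        for meaning, message in data:
            self.lexicon[meaning] = message

    def speak(self, meaning):
        return self.lexicon[meaning]

def transmit(speaker, bottleneck=3):
    # Transmitting phase: only a subset of pairs reaches the next generation
    # (the transmission bottleneck of iterated learning).
    meanings = random.sample(all_meanings(), bottleneck)
    return [(m, speaker.speak(m)) for m in meanings]

def iterated_learning(generations=10):
    data = []
    for gen in range(generations):
        speaker = Speaker()
        speaker.learn(data)       # learning phase
        # (An interacting phase, e.g. a referential game with a listener
        #  and game rewards, would run here in the full algorithm.)
        data = transmit(speaker)  # transmitting phase
        print(f"gen {gen}: {speaker.lexicon}")

if __name__ == "__main__":
    iterated_learning()
```

In the paper the agents are neural networks and the interacting phase plays a language game between a speaker and a listener; this sketch mirrors only the generational learn/interact/transmit cycle that NIL builds on.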

Authors (5)
  1. Yi Ren (215 papers)
  2. Shangmin Guo (18 papers)
  3. Matthieu Labeau (15 papers)
  4. Shay B. Cohen (78 papers)
  5. Simon Kirby (7 papers)
Citations (93)