Emergent Discrete Communication in Semantic Spaces (2108.01828v3)

Published 4 Aug 2021 in cs.LG, cs.CL, cs.MA, and cs.RO

Abstract: Neural agents trained in reinforcement learning settings can learn to communicate among themselves via discrete tokens, accomplishing as a team what agents would be unable to do alone. However, the current standard of using one-hot vectors as discrete communication tokens prevents agents from acquiring more desirable aspects of communication such as zero-shot understanding. Inspired by word embedding techniques from natural language processing, we propose neural agent architectures that enable agents to communicate via discrete tokens derived from a learned, continuous space. We show in a decision-theoretic framework that our technique optimizes communication over a wide range of scenarios, whereas one-hot tokens are only optimal under restrictive assumptions. In self-play experiments, we validate that our trained agents learn to cluster tokens in semantically meaningful ways, allowing them to communicate in noisy environments where other techniques fail. Lastly, we demonstrate both that agents using our method can effectively respond to novel human communication and that humans can understand unlabeled emergent agent communication, outperforming the use of one-hot communication.
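
The abstract's core mechanism is to derive discrete communication tokens from a learned, continuous embedding space rather than from a fixed one-hot alphabet. Below is a minimal PyTorch sketch of one plausible realization: the speaker proposes a continuous vector, which is snapped to the nearest entry of a learned codebook, and a straight-through estimator lets gradients flow through the discretization. The class name, dimensions, and the straight-through trick are assumptions of this sketch, not details taken from the paper.

```python
# Minimal sketch (assumed, not the authors' published architecture) of
# communication via discrete tokens drawn from a learned embedding space.
import torch
import torch.nn as nn


class EmbeddingMessenger(nn.Module):
    def __init__(self, hidden_dim: int, num_tokens: int, token_dim: int):
        super().__init__()
        self.proposal = nn.Linear(hidden_dim, token_dim)      # speaker head
        self.codebook = nn.Embedding(num_tokens, token_dim)   # learned token space

    def forward(self, hidden: torch.Tensor):
        # Continuous message proposed by the speaker from its hidden state.
        proposal = self.proposal(hidden)                      # (B, token_dim)
        # Distance from the proposal to every token embedding.
        dists = torch.cdist(proposal, self.codebook.weight)   # (B, num_tokens)
        token_ids = dists.argmin(dim=-1)                      # discrete token choice
        token_vecs = self.codebook(token_ids)                 # (B, token_dim)
        # Straight-through estimator: the forward pass sends the discrete
        # token embedding; the backward pass treats quantization as identity.
        message = proposal + (token_vecs - proposal).detach()
        return token_ids, message
```

Because neighboring tokens have neighboring embeddings in such a space, a listener can generalize to tokens it has rarely or never received, which is the property the abstract credits for zero-shot understanding and for robustness in noisy environments.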

Authors (7)
  1. Mycal Tucker (11 papers)
  2. Huao Li (11 papers)
  3. Siddharth Agrawal (7 papers)
  4. Dana Hughes (11 papers)
  5. Katia Sycara (93 papers)
  6. Michael Lewis (11 papers)
  7. Julie Shah (38 papers)
Citations (26)