Interaction Modeling with Multiplex Attention (2208.10660v2)

Published 23 Aug 2022 in cs.LG and cs.AI

Abstract: Modeling multi-agent systems requires understanding how agents interact. Such systems are often difficult to model because they can involve a variety of types of interactions that layer together to drive rich social behavioral dynamics. Here we introduce a method for accurately modeling multi-agent systems. We present Interaction Modeling with Multiplex Attention (IMMA), a forward prediction model that uses a multiplex latent graph to represent multiple independent types of interactions and attention to account for relations of different strengths. We also introduce Progressive Layer Training, a training strategy for this architecture. We show that our approach outperforms state-of-the-art models in trajectory forecasting and relation inference, spanning three multi-agent scenarios: social navigation, cooperative task achievement, and team sports. We further demonstrate that our approach can improve zero-shot generalization and allows us to probe how different interactions impact agent behavior.
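The core idea in the abstract, a multiplex latent graph where each layer captures one interaction type via attention, can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, weight shapes, and aggregation (averaging layer outputs) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multiplex_attention(states, W_q, W_k, W_v):
    """Sketch of multiplex attention over N agents (hypothetical helper).

    states: (N, d) agent state embeddings.
    W_q, W_k, W_v: (K, d, d) per-layer projection weights, one set
    for each of the K latent interaction types.

    Each layer k infers its own soft interaction graph A_k via scaled
    dot-product attention, then passes messages along that graph.
    Layer outputs are averaged; the layers stay independent, which is
    what lets one probe how each interaction type shapes behavior.
    """
    K, d = W_q.shape[0], states.shape[1]
    out = np.zeros_like(states)
    graphs = []
    for k in range(K):
        Q = states @ W_q[k]
        Kmat = states @ W_k[k]
        V = states @ W_v[k]
        A = softmax(Q @ Kmat.T / np.sqrt(d))  # (N, N) soft graph, layer k
        graphs.append(A)
        out += A @ V
    return out / K, graphs
```

A forward prediction model would feed `out` into a decoder that rolls agent states forward one step, while the per-layer graphs serve as the inferred relations.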

Authors (7)
  1. Fan-Yun Sun (18 papers)
  2. Isaac Kauvar (3 papers)
  3. Ruohan Zhang (34 papers)
  4. Jiachen Li (144 papers)
  5. Mykel Kochenderfer (43 papers)
  6. Jiajun Wu (249 papers)
  7. Nick Haber (48 papers)
Citations (14)
