
Toward Compositional Generalization in Object-Oriented World Modeling (2204.13661v2)

Published 28 Apr 2022 in cs.LG, cs.AI, and cs.RO

Abstract: Compositional generalization is a critical ability in learning and decision-making. We focus on the setting of reinforcement learning in object-oriented environments to study compositional generalization in world modeling. We (1) formalize the compositional generalization problem with an algebraic approach and (2) study how a world model can achieve that. We introduce a conceptual environment, Object Library, and two instances, and deploy a principled pipeline to measure the generalization ability. Motivated by the formulation, we analyze several methods with exact or no compositional generalization ability using our framework, and design a differentiable approach, Homomorphic Object-oriented World Model (HOWM), that achieves soft but more efficient compositional generalization.
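To make the setting concrete, the sketch below shows a generic object-factored world model: one shared transition function applied independently to each object slot, which is the basic structure that makes compositional generalization to new combinations of known objects plausible. This is an illustrative assumption for exposition, not the paper's HOWM implementation; all names, shapes, and the linear transition are invented for the example.

```python
# Illustrative sketch (not the paper's HOWM): an object-factored world model
# where a single shared transition function is applied to every object slot.
# Because the same parameters handle each object, the model can in principle
# predict dynamics for novel combinations of known objects. All dimensions
# and names below are assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)

NUM_OBJECTS = 3   # object slots in a scene
STATE_DIM = 4     # per-object latent state size
ACTION_DIM = 2    # action vector size

# A shared (here random, untrained) linear transition used for every slot.
W_state = rng.normal(scale=0.1, size=(STATE_DIM, STATE_DIM))
W_action = rng.normal(scale=0.1, size=(ACTION_DIM, STATE_DIM))

def object_factored_step(states: np.ndarray, action: np.ndarray) -> np.ndarray:
    """Predict next per-object states with a transition shared across slots.

    states: (NUM_OBJECTS, STATE_DIM) per-object latents.
    action: (ACTION_DIM,) action broadcast to every object.
    """
    return states + states @ W_state + action @ W_action

# A scene is a set of object states; permuting the objects permutes the
# prediction the same way -- a simple equivariance factored models enjoy.
scene = rng.normal(size=(NUM_OBJECTS, STATE_DIM))
action = rng.normal(size=(ACTION_DIM,))

next_scene = object_factored_step(scene, action)
perm = rng.permutation(NUM_OBJECTS)
assert np.allclose(object_factored_step(scene[perm], action), next_scene[perm])
print(next_scene.round(3))
```

The permutation check illustrates the algebraic flavor of the paper's framing: a world model that treats objects symmetrically commutes with relabeling the objects, which is the structural property a compositional world model should exploit.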

Authors (4)
  1. Linfeng Zhao (17 papers)
  2. Lingzhi Kong (2 papers)
  3. Robin Walters (73 papers)
  4. Lawson L. S. Wong (30 papers)
Citations (17)
