
Compositional Generalization by Learning Analytical Expressions (2006.10627v2)

Published 18 Jun 2020 in cs.AI and cs.CL

Abstract: Compositional generalization is a basic and essential intellectual capability of human beings, which allows us to readily recombine known parts. However, existing neural network based models have been shown to be severely deficient in this capability. Inspired by work in cognition arguing that compositionality can be captured by variable slots with symbolic functions, we present a refreshing view that connects a memory-augmented neural model with analytical expressions to achieve compositional generalization. Our model consists of two cooperative neural modules, Composer and Solver, which fit well with the cognitive argument while remaining trainable end-to-end via a hierarchical reinforcement learning algorithm. Experiments on the well-known benchmark SCAN demonstrate that our model attains strong compositional generalization, solving all challenges addressed by previous works with 100% accuracy.
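The SCAN benchmark referenced in the abstract maps compositional natural-language commands to action sequences, and generalization is tested by holding out certain recombinations (e.g., a primitive like "jump" combined with modifiers seen only with other primitives). The following is a minimal, illustrative interpreter for a small SCAN-like subset; it is not the authors' Composer/Solver model, and all names are ours, but it makes the kind of symbolic recombination at stake concrete:

```python
# Toy interpreter for a small subset of SCAN-style commands.
# Illustrative only: shows how primitives compose with modifiers
# ("twice"/"thrice") and connectives ("and"), which is the structure
# a compositionally generalizing model must recombine systematically.

PRIMITIVES = {
    "walk": ["WALK"],
    "look": ["LOOK"],
    "run": ["RUN"],
    "jump": ["JUMP"],
}

def interpret(command: str) -> list[str]:
    """Recursively map a command to its action sequence."""
    words = command.split()
    if "and" in words:
        # "X and Y" -> actions for X followed by actions for Y
        i = words.index("and")
        left = " ".join(words[:i])
        right = " ".join(words[i + 1:])
        return interpret(left) + interpret(right)
    if words[-1] == "twice":
        return interpret(" ".join(words[:-1])) * 2
    if words[-1] == "thrice":
        return interpret(" ".join(words[:-1])) * 3
    return PRIMITIVES[words[0]]

print(interpret("jump twice and walk"))  # ['JUMP', 'JUMP', 'WALK']
```

Under a compositional split, a model might see "walk twice" and "jump" in training but never "jump twice"; a system that has captured the analytical rule (as this hand-written interpreter trivially has) handles the unseen combination, whereas standard sequence-to-sequence models typically fail.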

Authors (9)
  1. Qian Liu (252 papers)
  2. Shengnan An (12 papers)
  3. Jian-Guang Lou (69 papers)
  4. Bei Chen (56 papers)
  5. Zeqi Lin (25 papers)
  6. Yan Gao (157 papers)
  7. Bin Zhou (161 papers)
  8. Nanning Zheng (146 papers)
  9. Dongmei Zhang (193 papers)
Citations (71)
