Learning to Induce Causal Structure (2204.04875v2)

Published 11 Apr 2022 in stat.ML and cs.LG

Abstract: The fundamental challenge in causal induction is to infer the underlying graph structure given observational and/or interventional data. Most existing causal induction algorithms operate by generating candidate graphs and evaluating them using either score-based methods (including continuous optimization) or independence tests. In our work, we instead treat the inference process as a black box and design a neural network architecture that learns the mapping from both observational and interventional data to graph structures via supervised training on synthetic graphs. The learned model generalizes to new synthetic graphs, is robust to train-test distribution shifts, and achieves state-of-the-art performance on naturalistic graphs for low sample complexity.
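The supervised "black box" framing in the abstract — generate many synthetic graphs, sample data from each, and train a model to map data summaries directly to adjacency matrices — can be illustrated with a deliberately minimal sketch. This is not the paper's architecture (the authors use a neural network trained on observational and interventional data); here a per-edge logistic regression over covariance features stands in for it, graphs are restricted to upper-triangular (hence acyclic) adjacencies, and all sizes and mechanisms are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 3                      # nodes per graph (small, for illustration)
N_GRAPHS, N_SAMPLES = 400, 200

def random_dag():
    # A strictly upper-triangular adjacency matrix is always acyclic.
    # (Simplification: the paper samples more general synthetic DAGs.)
    return np.triu(rng.random((D, D)) < 0.5, k=1).astype(float)

def sample_data(A):
    # Linear-Gaussian SEM respecting the topological order 0..D-1:
    # X_j = sum_i W[i, j] * X_i + noise, with i < j.
    W = A * rng.uniform(0.5, 1.5, size=A.shape)
    X = np.zeros((N_SAMPLES, D))
    for j in range(D):
        X[:, j] = X @ W[:, j] + rng.normal(size=N_SAMPLES)
    return X

def features(X):
    # Dataset summary fed to the learned model: flattened empirical covariance.
    return np.cov(X, rowvar=False).ravel()

# Supervised training set: (data summary) -> (flattened adjacency matrix).
graphs = [random_dag() for _ in range(N_GRAPHS)]
F = np.stack([features(sample_data(A)) for A in graphs])
Y = np.stack([A.ravel() for A in graphs])

# Per-edge logistic regression trained by gradient descent — a stand-in
# for the paper's neural network, not its actual architecture.
Wm = np.zeros((F.shape[1], D * D))
b = np.zeros(D * D)

def loss_and_grads(Wm, b):
    logits = F @ Wm + b
    p = 1.0 / (1.0 + np.exp(-logits))
    eps = 1e-9
    loss = -np.mean(Y * np.log(p + eps) + (1 - Y) * np.log(1 - p + eps))
    g = (p - Y) / len(F)
    return loss, F.T @ g, g.sum(axis=0)

loss0, _, _ = loss_and_grads(Wm, b)
for _ in range(500):
    _, gW, gb = loss_and_grads(Wm, b)
    Wm -= 0.5 * gW
    b -= 0.5 * gb
loss1, _, _ = loss_and_grads(Wm, b)
print(f"per-edge cross-entropy: {loss0:.3f} -> {loss1:.3f}")
```

The key property the sketch shares with the paper is amortization: after training, inferring a graph for a new dataset is a single forward pass over its summary statistics, with no per-dataset search over candidate graphs.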

Authors (10)
  1. Nan Rosemary Ke (40 papers)
  2. Silvia Chiappa (26 papers)
  3. Jane Wang (19 papers)
  4. Anirudh Goyal (93 papers)
  5. Jorg Bornschein (22 papers)
  6. Melanie Rey (7 papers)
  7. Matthew Botvinick (1 paper)
  8. Michael Mozer (17 papers)
  9. Danilo Jimenez Rezende (27 papers)
  10. Theophane Weber (23 papers)
Citations (38)
