Generating by Understanding: Neural Visual Generation with Logical Symbol Groundings (2310.17451v2)

Published 26 Oct 2023 in cs.AI, cs.CV, and cs.GR

Abstract: Despite the great success of neural visual generative models in recent years, integrating them with strong symbolic reasoning systems remains a challenging task. Among the core challenges are two levels of symbol grounding problems: the first is symbol assignment, i.e., mapping latent factors of neural visual generators to semantically meaningful symbolic factors from the reasoning systems by learning from limited labeled data. The second is rule learning, i.e., learning new rules that govern the generative process to enhance the symbolic reasoning systems. To deal with these two problems, we propose a neurosymbolic learning approach, Abductive visual Generation (AbdGen), for integrating logic programming systems with neural visual generative models based on the abductive learning framework. To achieve reliable and efficient symbol grounding, the quantized abduction method is introduced for generating abduction proposals by nearest-neighbor lookup within semantic codebooks. To achieve precise rule learning, the contrastive meta-abduction method is proposed to eliminate wrong rules with positive cases and avoid less informative rules with negative cases simultaneously. Experimental results show that, compared to the baseline approaches, AbdGen requires significantly less labeled data for symbol assignment. Furthermore, AbdGen can effectively learn underlying logical generative rules from data, which is beyond the capability of existing approaches. The code is released at https://github.com/candytalking/AbdGen.
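
The abstract describes quantized abduction as grounding continuous latent factors in discrete symbols via nearest-neighbor lookup within semantic codebooks. The sketch below illustrates only that lookup step, assuming a PyTorch-style tensor layout; the function and variable names are illustrative and are not taken from the released AbdGen code.

```python
# Minimal sketch (not the authors' implementation): map each continuous latent
# vector to the index of its nearest entry in a semantic codebook, yielding
# candidate symbol assignments that can serve as abduction proposals.
import torch

def nearest_codebook_symbols(latents: torch.Tensor, codebook: torch.Tensor) -> torch.Tensor:
    """latents:  (batch, dim) continuous latent factors from the visual generator
    codebook: (num_symbols, dim) one embedding per candidate symbolic value
    returns:  (batch,) index of the closest codebook entry for each latent"""
    # Pairwise Euclidean distances between latents and codebook entries.
    dists = torch.cdist(latents, codebook, p=2)  # shape: (batch, num_symbols)
    return dists.argmin(dim=1)

# Example usage with toy shapes: 4 latents of dimension 8, 5 candidate symbols.
latents = torch.randn(4, 8)
codebook = torch.randn(5, 8)
symbol_ids = nearest_codebook_symbols(latents, codebook)  # e.g. tensor([2, 0, 4, 1])
```

In the framework described by the abstract, such indices would act as symbol-grounding proposals to be checked and revised by the logic programming side, rather than as final assignments.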

Authors (7)
  1. Yifei Peng (2 papers)
  2. Yu Jin (45 papers)
  3. Zhexu Luo (2 papers)
  4. Yao-Xiang Ding (11 papers)
  5. Wang-Zhou Dai (9 papers)
  6. Zhong Ren (7 papers)
  7. Kun Zhou (217 papers)
