Neural Logic Networks (1910.08629v1)

Published 17 Oct 2019 in cs.AI, cs.LG, cs.NE, and stat.ML

Abstract: Recent years have witnessed the great success of deep neural networks in many research areas. The fundamental idea behind the design of most neural networks is to learn similarity patterns from data for prediction and inference, an approach that lacks the ability to perform logical reasoning. However, concrete logical reasoning ability is critical to many theoretical and practical problems. In this paper, we propose Neural Logic Network (NLN), a dynamic neural architecture that builds its computational graph according to the input logical expression. It learns basic logical operations as neural modules and conducts propositional logical reasoning through the network for inference. Experiments on simulated data show that NLN achieves strong performance in solving logical equations. Further experiments on real-world data show that NLN significantly outperforms state-of-the-art models on collaborative filtering and personalized recommendation tasks.
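
The abstract describes logical operators learned as neural modules that are composed according to the structure of each input expression. The following is a minimal sketch of that idea in PyTorch, not the authors' implementation: the class and method names (NeuralLogicNetwork, truth_score), the MLP module shapes, and the cosine-similarity scoring against a learned "true" anchor vector are all assumptions made for illustration only.

# Minimal sketch (assumptions, not the paper's code): each logical operator is a
# learned neural module, and the computational graph is assembled dynamically by
# recursing over the structure of the input expression.
import torch
import torch.nn as nn


class NeuralLogicNetwork(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        # Learned operator modules over vector "truth" embeddings.
        self.not_op = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.and_op = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.or_op = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        # Learned anchor vector used to score how "true" an expression embedding is.
        self.true_vec = nn.Parameter(torch.randn(dim))

    def forward(self, expr, variables):
        """Recursively evaluate a nested expression, e.g.
        ('or', ('not', 'a'), ('and', 'a', 'b')), over a dict of variable embeddings."""
        if isinstance(expr, str):                       # leaf: a propositional variable
            return variables[expr]
        op, *args = expr
        vecs = [self.forward(a, variables) for a in args]
        if op == 'not':
            return self.not_op(vecs[0])
        if op == 'and':
            return self.and_op(torch.cat(vecs, dim=-1))
        if op == 'or':
            return self.or_op(torch.cat(vecs, dim=-1))
        raise ValueError(f"unknown operator: {op}")

    def truth_score(self, expr, variables):
        # Cosine similarity to the anchor gives a scalar truth score per example.
        vec = self.forward(expr, variables)
        return torch.cosine_similarity(vec, self.true_vec.expand_as(vec), dim=-1)


if __name__ == "__main__":
    dim = 64
    nln = NeuralLogicNetwork(dim)
    variables = {'a': torch.randn(1, dim), 'b': torch.randn(1, dim)}
    expr = ('or', ('not', 'a'), ('and', 'a', 'b'))      # not a OR (a AND b)
    print(nln.truth_score(expr, variables))

In a training setup along these lines, the truth score of observed (true) expressions would be pushed up and that of negative expressions pushed down; the paper itself should be consulted for the actual architecture, regularizers, and training objective.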

Authors (4)
  1. Shaoyun Shi (12 papers)
  2. Hanxiong Chen (10 papers)
  3. Min Zhang (630 papers)
  4. Yongfeng Zhang (163 papers)
Citations (14)