
Towards Unifying Logical Entailment and Statistical Estimation (2202.13406v1)

Published 27 Feb 2022 in cs.AI

Abstract: This paper gives a generative model of the interpretation of formal logic for data-driven logical reasoning. The key idea is to represent the interpretation as the likelihood of a formula being true given a model of formal logic. Using the likelihood, Bayes' theorem gives the posterior of the model being the case given the formula. The posterior represents an inverse interpretation of formal logic that seeks models making the formula true. Together, the likelihood and posterior yield a Bayesian learning process that gives the probability of the conclusion being true in the models where all the premises are true. This paper examines the statistical and logical properties of this Bayesian learning. It is shown that the generative model is a unified theory of several different types of reasoning in logic and statistics.
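
To make the abstract's pipeline concrete, here is a minimal sketch, not the paper's construction: it assumes propositional models (truth assignments over a fixed atom set), a crisp 0/1 likelihood for the interpretation of a formula in a model, and a uniform prior over models. Under these assumptions, conditioning on the premises and averaging the conclusion's likelihood over the posterior recovers classical entailment as probability 1.

```python
import itertools
import math

# Hypothetical atom set for illustration; the paper works with general formal logic.
ATOMS = ["p", "q"]

def models():
    """Enumerate all truth assignments (models) over ATOMS."""
    for values in itertools.product([True, False], repeat=len(ATOMS)):
        yield dict(zip(ATOMS, values))

def likelihood(formula, model):
    """P(formula is true | model): a crisp 0/1 interpretation in this sketch."""
    return 1.0 if formula(model) else 0.0

def posterior(premises):
    """P(model | premises) via Bayes' theorem with a uniform prior over models."""
    ms = list(models())
    prior = 1.0 / len(ms)
    unnorm = [prior * math.prod(likelihood(f, m) for f in premises) for m in ms]
    z = sum(unnorm)
    return [(m, w / z) for m, w in zip(ms, unnorm)] if z > 0 else []

def prob_conclusion(premises, conclusion):
    """Probability of the conclusion being true in models where all premises are true."""
    return sum(w * likelihood(conclusion, m) for m, w in posterior(premises))

# Example: premises {p, p -> q} classically entail q, so the probability is 1.0.
p = lambda m: m["p"]
p_implies_q = lambda m: (not m["p"]) or m["q"]
q = lambda m: m["q"]
print(prob_conclusion([p, p_implies_q], q))  # 1.0
```

With a non-degenerate (e.g. noisy or learned) likelihood in place of the 0/1 indicator, the same computation would produce graded, data-driven conclusions, which is the sense in which the generative model is claimed to unify logical entailment and statistical estimation.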
