A General Non-Probabilistic Theory of Inductive Reasoning (1304.2375v1)

Published 27 Mar 2013 in cs.AI

Abstract: Probability theory, epistemically interpreted, provides an excellent, if not the best available account of inductive reasoning. This is so because there are general and definite rules for the change of subjective probabilities through information or experience; induction and belief change are one and the same topic, after all. The most basic of these rules is simply to conditionalize with respect to the information received; and there are similar and more general rules. Hence, a fundamental reason for the epistemological success of probability theory is that there at all exists a well-behaved concept of conditional probability. Still, people have, and have reasons for, various concerns over probability theory. One of these is my starting point: Intuitively, we have the notion of plain belief; we believe propositions to be true (or to be false or neither). Probability theory, however, offers no formal counterpart to this notion. Believing A is not the same as having probability 1 for A, because probability 1 is incorrigible; but plain belief is clearly corrigible. And believing A is not the same as giving A a probability larger than some 1 - c, because believing A and believing B is usually taken to be equivalent to believing A & B. Thus, it seems that the formal representation of plain belief has to take a non-probabilistic route. Indeed, representing plain belief seems easy enough: simply represent an epistemic state by the set of all propositions believed true in it or, since I make the common assumption that plain belief is deductively closed, by the conjunction of all propositions believed true in it. But this does not yet provide a theory of induction, i.e. an answer to the question how epistemic states so represented are changed through information or experience. There is a convincing partial answer: if the new information is compatible with the old epistemic state, then the new epistemic state is simply represented by the conjunction of the new information and the old beliefs. This answer is partial because it does not cover the quite common case where the new information is incompatible with the old beliefs. It is, however, important to complete the answer and to cover this case, too; otherwise, we would not represent plain belief as corrigible. The crucial problem is that there is no good completion. When epistemic states are represented simply by the conjunction of all propositions believed true in them, the answer cannot be completed; and though there is a lot of fruitful work, no other representation of epistemic states has been proposed, as far as I know, which provides a complete solution to this problem. In this paper, I want to suggest such a solution. In [4], I have more fully argued that this is the only solution, if certain plausible desiderata are to be satisfied. Here, in section 2, I will be content with formally defining and intuitively explaining my proposal. I will compare my proposal with probability theory in section 3. It will turn out that the theory I am proposing is structurally homomorphic to probability theory in important respects and that it is thus equally easily implementable, but moreover computationally simpler. Section 4 contains a very brief comparison with various kinds of logics, in particular conditional logic, with Shackle's functions of potential surprise and related theories, and with the Dempster-Shafer theory of belief functions.

Citations (191)

Summary

  • The paper presents a novel framework using Natural Conditional Functions (NCFs) to rigorously formalize plain belief beyond probabilistic models.
  • It translates probabilistic operations into NCF analogues, enabling efficient computational modeling of belief revision and conditional independence.
  • It discusses both the theoretical strengths and practical limitations of NCFs, highlighting challenges when integrating statistical data.

A General Non-Probabilistic Theory of Inductive Reasoning

Wolfgang Spohn's paper, "A General Non-Probabilistic Theory of Inductive Reasoning," introduces a novel framework for modeling inductive reasoning through non-probabilistic means, addressing perceived inadequacies in probabilistic models, particularly in their handling of plain belief. The paper posits that plain belief, intuitively understood as the act of holding propositions as true or false, lacks a formal representation within traditional probability theory. Spohn's approach seeks to rectify this by employing Natural Conditional Functions (NCFs), a construct that enables a rigorous formalization of belief states and their transformations through incoming information.

Key Concepts and Framework

The paper begins by delineating the algebraic underpinning necessary for Spohn's proposed theory. It introduces a set W of possible worlds or states, with propositions defined as subsets of W. The central construct of the paper, an A-measurable natural conditional function (A-NCF), is defined over W, with the natural numbers serving as grades of disbelief.

Under this framework, NCFs grade propositions not by their likelihood but by the degree of disbelief assigned to them: a proposition A carries minimal disbelief when κ(A) = 0, and A is plainly believed exactly when its complement carries positive disbelief. Changes of epistemic state are conceptualized as conditionalization on incoming information, with special attention to whether that information is compatible with the existing beliefs.
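To make the grading concrete, here is a minimal Python sketch of an NCF over a hypothetical four-world space; the worlds and rank values are illustrative, not taken from the paper. The rank of a proposition is the least rank of any world in it, and a proposition is plainly believed exactly when its complement carries positive disbelief.

```python
# Minimal sketch of a natural conditional function (NCF).
# The worlds and ranks below are hypothetical; the one law an NCF must
# satisfy is that at least one world carries rank 0.
WORLDS = {"w1", "w2", "w3", "w4"}
RANK = {"w1": 0, "w2": 1, "w3": 2, "w4": 3}

def kappa(A: set) -> float:
    """Degree of disbelief in a proposition: the least rank among its worlds."""
    return min((RANK[w] for w in A), default=float("inf"))

def believed(A: set) -> bool:
    """A is plainly believed iff its complement carries positive disbelief."""
    return kappa(WORLDS - A) > 0

A = {"w1", "w2"}
print(kappa(A), believed(A))  # 0 True: not-A is disbelieved to degree 2
```

Note that belief, disbelief, and suspension of judgment are all expressible: A is disbelieved when κ(A) > 0, and judgment on A is suspended when both κ(A) = 0 and κ(not-A) = 0.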

Structural and Comparative Analysis

Structurally, the theory shares analogous mechanisms with probability theory, such as the translation of probabilistic operations to NCF terms (minimum for sum, addition for multiplication, and subtraction for division), thus allowing similar applications in computation and inference. Spohn draws significant parallels, highlighting the theory's potential utility in addressing challenges typically managed with probabilistic models, such as causation and conditional independence.
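A short sketch of this translation, continuing the toy rank assignment from above (still hypothetical), shows ranks of unions combining by minimum where probabilities of disjoint unions would sum, and conditional ranks formed by subtraction where conditional probabilities divide:

```python
# Sketch of the probabilistic-to-NCF translation: minimum for sum,
# subtraction for division. Toy ranks, as in the previous sketch.
RANK = {"w1": 0, "w2": 1, "w3": 2, "w4": 3}

def kappa(A: set) -> float:
    return min((RANK[w] for w in A), default=float("inf"))

A, B = {"w1", "w3"}, {"w3", "w4"}

# Union: kappa(A or B) = min(kappa(A), kappa(B)),
# where probability uses P(A) + P(B) for disjoint A, B.
assert kappa(A | B) == min(kappa(A), kappa(B))

# Conditioning: kappa(B given A) = kappa(A and B) - kappa(A),
# the analogue of P(B | A) = P(A and B) / P(A).
print(kappa(A & B) - kappa(A))  # 2: given A, B is disbelieved to degree 2
```

Because these operations are integer minima and subtractions rather than floating-point sums and divisions, inference over NCFs needs no normalization, which is consonant with the computational simplicity claimed in the abstract.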

Computational and Epistemological Implications

Spohn emphasizes the computational benefits of using NCFs over traditional probability measures, noting reduced complexity and potentially easier elicitation of expert judgment. From an epistemological perspective, NCFs provide a more flexible framework for plain belief representation, addressing issues arising from the rigidity and incorrigibility of probability 1 in standard models.
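As a rough illustration of how revision stays simple, the following sketch implements conditionalization in the spirit of Spohn's rule, under the same toy assumptions as the earlier sketches: accepting information A with firmness n shifts the ranks inside A down so that its least disbelieved worlds reach 0, and shifts the complement up so that it is disbelieved to degree n. The rule applies uniformly whether or not A is compatible with the prior beliefs, which is exactly the case a theory of induction must cover.

```python
# Sketch of rank revision by conditionalization on A with firmness n,
# in the spirit of Spohn's rule; worlds and prior ranks are hypothetical.
WORLDS = {"w1", "w2", "w3", "w4"}
prior = {"w1": 0, "w2": 1, "w3": 2, "w4": 3}  # prior belief: {w1, w2}

def conditionalize(rank: dict, A: set, n: int) -> dict:
    """Revise ranks by information A accepted with firmness n."""
    k_A = min(rank[w] for w in A)               # prior disbelief in A
    k_not_A = min(rank[w] for w in WORLDS - A)  # prior disbelief in not-A
    return {w: rank[w] - k_A if w in A else rank[w] - k_not_A + n
            for w in WORLDS}

# Revising by {w3}, which contradicts the prior belief in {w1, w2}:
posterior = conditionalize(prior, {"w3"}, 1)
# posterior: w3 -> 0, w1 -> 1, w2 -> 2, w4 -> 4; {w3} is now believed,
# and the prior ordering within {w1, w2, w4} is preserved.
```

Since the posterior is again an NCF, the revision can be iterated without loss; this is the sense in which the framework handles repeated belief change where a bare belief-set representation cannot.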

The paper implies that while NCFs offer numerous advantages in flexibility and simplicity, challenges persist in integrating statistical data sources, which are inherently probabilistic. Spohn acknowledges this gap as a critical limitation, suggesting that while NCFs excel in theoretical modeling, their practical application in areas reliant on statistical data may currently lack robustness.

Comparison with Alternative Models

Spohn offers a critical examination of his theory relative to other epistemic frameworks, including various logics (conditional logic in particular), Shackle's functions of potential surprise, the Dempster-Shafer theory of belief functions, and fuzzy logic. Key distinctions concern the treatment and representation of belief change, particularly the conditionalization process and the maintenance of epistemic consistency. Unlike these models, the NCF framework offers a comprehensive handling of iterated belief revision, incorporating a robust notion of conditional independence. Notably, the paper argues that NCFs and fuzzy logic are not commensurable, given their differing applications and conceptual foundations.

Future Directions

Given the outlined advantages and intrinsic limitations, Spohn's theory invites further exploration in both theoretical and applied contexts. Potential avenues could involve the development of hybrid frameworks that leverage NCFs for belief representation while integrating probabilistic methodologies for empirical data handling. In doing so, the framework may yield deeper insights into deterministic causation and parallel processing of belief updates.

In conclusion, Spohn's contribution to the domain of epistemic modeling through non-probabilistic theory introduces a structured and potentially transformative paradigm for inductive reasoning that is consistent, computationally efficient, and adaptable to dynamic belief systems. Continued investigation into its applicability, particularly concerning statistical integration, remains a promising trajectory for advancing the field of epistemology and inductive logic.