Syntax-based Default Reasoning as Probabilistic Model-based Diagnosis (1302.6827v1)
Abstract: We view the syntax-based approaches to default reasoning as a model-based diagnosis problem, where each source giving a piece of information is considered as a component. It is formalized in the ATMS framework (each source corresponds to an assumption). We then assume that all sources are independent and "fail" with a very small probability. This leads to a probability assignment on the set of candidates, or equivalently on the set of consistent environments. This probability assignment induces a Dempster-Shafer belief function which measures the probability that a proposition can be deduced from the evidence. This belief function can be used in several different ways to define a non-monotonic consequence relation. We study and compare these consequence relations. The case of prioritized knowledge bases is briefly considered.
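The construction summarized in the abstract (independent sources that each fail with a small probability, probability mass distributed over consistent environments, belief in a query taken as the probability that it can be deduced) can be pictured with a small toy sketch. This is an illustrative assumption-laden toy, not the paper's formalization: the names `belief`, `entails`, `is_consistent`, the parameter `eps`, and the three-source knowledge base are all introduced here for illustration, and entailment is brute-forced over truth assignments.

```python
from itertools import combinations, product

# Sketch (illustrative assumption, not the paper's implementation):
# each source asserts one propositional formula, encoded as a Python
# predicate over a truth assignment, and fails independently with
# probability eps.  A candidate is a set of failed sources; the formulas
# of the surviving sources form an environment.  Mass is assigned to
# consistent environments, and the belief in a query is the normalized
# mass of the consistent environments that entail it.

def entails(environment, query, atoms):
    """True iff `environment` is consistent and every model of it
    satisfies `query` (brute force over all truth assignments)."""
    has_model, holds_in_all = False, True
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(f(v) for f in environment):
            has_model = True
            holds_in_all = holds_in_all and query(v)
    return has_model and holds_in_all

def is_consistent(environment, atoms):
    """True iff the environment has at least one model."""
    return any(all(f(dict(zip(atoms, values))) for f in environment)
               for values in product([False, True], repeat=len(atoms)))

def belief(query, sources, atoms, eps=1e-3):
    """Belief that `query` can be deduced, given independent source failures."""
    names = list(sources)
    mass_entailing, mass_consistent = 0.0, 0.0
    for k in range(len(names) + 1):
        for failed in combinations(names, k):
            # Probability that exactly the sources in `failed` have failed.
            p = 1.0
            for s in names:
                p *= eps if s in failed else (1.0 - eps)
            env = [sources[s] for s in names if s not in failed]
            if is_consistent(env, atoms):
                mass_consistent += p
                if entails(env, query, atoms):
                    mass_entailing += p
    return mass_entailing / mass_consistent if mass_consistent else 0.0

# Toy knowledge base: one source asserts p, two sources contradict each other on q.
atoms = ["p", "q"]
sources = {
    "s1": lambda v: v["p"],        # source 1 says p
    "s2": lambda v: v["q"],        # source 2 says q
    "s3": lambda v: not v["q"],    # source 3 says not q
}
print(belief(lambda v: v["p"], sources, atoms))  # ~0.999: p is uncontested
print(belief(lambda v: v["q"], sources, atoms))  # ~0.5:   q is disputed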