
Model Transformations for Ranking Functions and Total Preorders (2203.14018v1)

Published 26 Mar 2022 in cs.AI

Abstract: In the field of knowledge representation, the considered epistemic states are often based on propositional interpretations, also called worlds. For example, epistemic states of agents can be modelled by ranking functions or total preorders on worlds. However, there are usually different ways to describe a real-world situation in a propositional language; these can be seen as different points of view on the same situation. In this paper we introduce the concept of model transformations to convert an epistemic state from one point of view to another, yielding a novel notion of equivalence of epistemic states. We show how the well-known advantages of syntax splitting, originally developed for belief sets and later extended to the representation of epistemic states and to nonmonotonic reasoning, can be exploited for belief revision via model transformation by uncovering splittings not present before. Furthermore, we characterize situations where belief change operators commute with model transformations.
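To make the core idea concrete, the following is a minimal sketch of transporting a ranking function along a bijection between worlds. The names `worlds`, `transform_ranking`, `kappa`, and `sigma`, as well as the encoding of worlds as frozensets of true atoms, are illustrative assumptions, not the paper's formal definitions; the paper's notion of model transformation is more general.

```python
from itertools import product

def worlds(atoms):
    """Enumerate all propositional interpretations over the given atoms,
    encoding each world as the frozenset of atoms it makes true."""
    return [frozenset(a for a, v in zip(atoms, bits) if v)
            for bits in product([False, True], repeat=len(atoms))]

def transform_ranking(kappa, sigma):
    """Transport a ranking function kappa along a world bijection sigma.

    The transformed ranking assigns to each target world sigma(w) the rank
    that kappa assigned to w, so sigma is rank-preserving by construction.
    """
    return {sigma[w]: r for w, r in kappa.items()}

# Example: two vocabularies describing the same situation.
# The source language uses atom 'a' ("alarm on"); the target language
# uses atom 'q' ("quiet"), intended as the negation of 'a'.
kappa = {frozenset(): 1, frozenset({'a'}): 0}   # the agent believes 'a'

# sigma maps each source world to the corresponding target world: a <-> not q
sigma = {frozenset(): frozenset({'q'}), frozenset({'a'}): frozenset()}

kappa2 = transform_ranking(kappa, sigma)
# kappa2 encodes belief in 'not q': the world where q is false gets rank 0
```

Since `sigma` is a bijection, the transformed state `kappa2` represents the same epistemic content from the other point of view, which is the sense in which the two ranking functions count as equivalent.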
