An epistemic interpretation of quantum probability via contextuality (1806.09125v1)

Published 24 Jun 2018 in quant-ph

Abstract: According to a standard view, quantum mechanics (QM) is a contextual theory and quantum probability does not satisfy Kolmogorov's axioms. We show, by considering the macroscopic contexts associated with measurement procedures and the microscopic contexts (mu-contexts) underlying them, that one can interpret quantum probability as epistemic, despite its non-Kolmogorovian structure. To attain this result we introduce a predicate language L(x), a classical probability measure on it and a family of classical probability measures on sets of mu-contexts, each element of the family corresponding to a (macroscopic) measurement procedure. By using only Kolmogorovian probability measures we can thus define mean conditional probabilities on the set of properties of any quantum system that admit an epistemic interpretation but are not bound to satisfy Kolmogorov's axioms. The generalized probability measures associated with states in QM can then be seen as special cases of these mean probabilities, which explains how they can be non-classical and provides them with an epistemic interpretation. Moreover, the distinction between compatible and incompatible properties is explained in a natural way, and purely theoretical classical conditional probabilities coexist with empirically testable quantum conditional probabilities.
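
The general mechanism described in the abstract can be illustrated with a small numerical toy model. The sketch below is not the paper's formalism: the two mu-contexts, their joint distributions, the measurement-dependent weights, and the helper names (mu_contexts, weights, mean_prob, mean_cond_prob) are all hypothetical placeholders chosen for illustration. It only shows that when each mu-context carries a perfectly classical (Kolmogorovian) distribution, averaging conditional probabilities with weights that depend on the macroscopic measurement procedure can yield mean probabilities that violate the classical law of total probability, one standard marker of non-Kolmogorovian (quantum-like) probability.

```python
# Minimal sketch (hypothetical numbers, not the paper's construction):
# classical probabilities per mu-context, averaged with weights that depend
# on the measurement procedure, can produce non-Kolmogorovian statistics.

# Each mu-context assigns a classical joint distribution to the property
# values (e, f), with e, f in {0, 1}.
mu_contexts = {
    "c1": {(0, 0): 0.1, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.4},
    "c2": {(0, 0): 0.4, (0, 1): 0.4, (1, 0): 0.1, (1, 1): 0.1},
}

# Hypothetical weights over mu-contexts, one family per measurement procedure:
# measuring E alone samples the mu-contexts differently from measuring F first.
weights = {
    "measure_E": {"c1": 0.5, "c2": 0.5},
    "measure_F_then_E": {"c1": 0.9, "c2": 0.1},
}

def mean_prob(event, procedure):
    """Mean probability of `event` (a predicate on (e, f)) under `procedure`."""
    return sum(
        w * sum(p for ef, p in mu_contexts[c].items() if event(ef))
        for c, w in weights[procedure].items()
    )

def mean_cond_prob(event, given, procedure):
    """Mean conditional probability: average of classical conditionals per mu-context."""
    total = 0.0
    for c, w in weights[procedure].items():
        p_given = sum(p for ef, p in mu_contexts[c].items() if given(ef))
        p_joint = sum(p for ef, p in mu_contexts[c].items() if given(ef) and event(ef))
        total += w * (p_joint / p_given)
    return total

E1 = lambda ef: ef[0] == 1  # property E has value 1
F0 = lambda ef: ef[1] == 0  # property F has value 0
F1 = lambda ef: ef[1] == 1  # property F has value 1

# p(E=1) when only E is measured
p_E = mean_prob(E1, "measure_E")

# Reconstruction via the law of total probability, using the procedure
# that measures F first: sum_f p(E=1 | F=f) * p(F=f)
p_E_via_F = (
    mean_cond_prob(E1, F0, "measure_F_then_E") * mean_prob(F0, "measure_F_then_E")
    + mean_cond_prob(E1, F1, "measure_F_then_E") * mean_prob(F1, "measure_F_then_E")
)

print(f"p(E=1) measured directly:     {p_E:.3f}")       # 0.500
print(f"p(E=1) via total probability: {p_E_via_F:.3f}")  # 0.740
# The two values differ, so the mean probabilities taken together are not
# Kolmogorovian, even though every per-context ingredient is classical.
```

The discrepancy arises entirely from the measurement-dependent weights over mu-contexts, which is the kind of mechanism the abstract invokes to reconcile an epistemic reading of quantum probability with its non-classical structure.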
