
Entropic Efficiency of Bayesian Inference Protocols

Published 24 Jan 2026 in cond-mat.stat-mech and quant-ph | (2601.17282v1)

Abstract: Inference is a versatile tool that underlies scientific discovery, machine learning, and everyday decision-making: it describes how an agent updates a probability distribution as partial information is acquired from multiple measurements, reducing ignorance about a system's latent state. We define an inferential efficiency as the ratio of information gain to cumulative memory erasure cost, with inefficiency arising from unexploited correlations between the measured system and the memories, and/or between the memories and the environment (noise). Using this efficiency, we benchmark two limiting measurement paradigms: sequential, in which the same memory is exploited iteratively, and parallel, in which many memories are exploited simultaneously. In both cases, the minimal erasure cost reflects correlations across memories: temporal in the sequential paradigm, spatial in the parallel one. Remarkably, when all system-memory correlations are exploited for inference, both paradigms attain the same minimal erasure cost, even in the presence of noise. Conversely, the parallel paradigm performs better in the presence of unexploited correlations, stemming from hidden degrees of freedom of the memories. This approach provides a quantitative, physically grounded criterion to compare inference strategies, determine their efficiency, and link target information gains to their minimal entropic cost.
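The ratio defined above (information gain over cumulative erasure cost) can be illustrated with a toy sequential protocol. The sketch below is an assumption-laden illustration, not the paper's model: it uses a binary latent state, a hypothetical noise rate `eps`, a made-up measurement record, and a simplified accounting that charges each reset of the single reused memory its Shannon entropy in bits (a Landauer-style bound, ignoring the system-memory correlations whose exploitation the paper analyzes).

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) distribution."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def bayes_update(prior, outcome, eps):
    """Posterior P(state = 1) after a binary measurement with error rate eps."""
    like1 = (1 - eps) if outcome == 1 else eps        # P(outcome | state = 1)
    like0 = eps if outcome == 1 else (1 - eps)        # P(outcome | state = 0)
    num = like1 * prior
    return num / (num + like0 * (1 - prior))

# Sequential paradigm (toy version): one memory is reused, so before each new
# measurement it must be reset; we charge each reset the entropy of the stored
# outcome distribution, a simplistic erasure-cost model chosen for illustration.
prior = 0.5
eps = 0.1                      # assumed measurement noise rate
outcomes = [1, 1, 0, 1, 1]     # hypothetical measurement record
info_gain = 0.0
erasure_cost = 0.0
p = prior
for y in outcomes:
    p_y1 = (1 - eps) * p + eps * (1 - p)       # predictive P(outcome = 1)
    erasure_cost += entropy(p_y1)              # bits erased on memory reset
    p_new = bayes_update(p, y, eps)
    info_gain += entropy(p) - entropy(p_new)   # reduction in ignorance (bits)
    p = p_new

efficiency = info_gain / erasure_cost          # the "inferential efficiency" ratio
print(f"info gain = {info_gain:.3f} bits, "
      f"erasure cost = {erasure_cost:.3f} bits, "
      f"efficiency = {efficiency:.3f}")
```

Because this toy protocol discards all correlations at each reset, its efficiency falls strictly below 1; the paper's point is that exploiting system-memory correlations tightens the erasure cost toward the information actually gained.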
