Entropic Efficiency of Bayesian Inference Protocols
Abstract: Inference is a versatile tool that underlies scientific discovery, machine learning, and everyday decision-making: it describes how an agent updates a probability distribution as partial information is acquired from multiple measurements, reducing ignorance about a system's latent state. We define an inferential efficiency as the ratio of information gain to cumulative memory erasure cost, with inefficiency arising from unexploited correlations between the measured system and the memories, and/or between the memories and the environment (noise). Using this efficiency, we benchmark two limiting measurement paradigms: sequential, in which the same memory is exploited iteratively, and parallel, in which many memories are exploited simultaneously. In both cases, the minimal erasure cost reflects correlations across memories: temporal in the sequential case, spatial in the parallel one. Remarkably, when all system-memory correlations are exploited for inference, both paradigms attain the same minimal erasure cost, even in the presence of noise. Conversely, the parallel paradigm performs better in the presence of unexploited correlations stemming from hidden degrees of freedom of the memories. This approach provides a quantitative, physically grounded criterion to compare inference strategies, determine their efficiency, and link target information gains to their minimal entropic cost.
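As a minimal formal sketch of the efficiency described above (the notation here is illustrative and not taken from the paper): if a protocol yields an information gain ΔI about the latent state, measured in nats, at a cumulative erasure work cost W_er at temperature T, the ratio can be written as

\[
  \eta \;\equiv\; \frac{\Delta I}{\beta W_{\mathrm{er}}},
  \qquad \beta = (k_{\mathrm{B}} T)^{-1},
\]

where a generalized Landauer bound, β W_er ≥ ΔI, implies η ≤ 1; the gap β W_er − ΔI is then set by the unexploited system-memory and memory-environment correlations mentioned in the abstract.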