A universal theorem of sensory information
(2511.11463v1)
Published 14 Nov 2025 in q-bio.NC
Abstract: A universal theorem of sensory information, analogous to the second law of thermodynamics, is derived. Beginning from a minimal description of a sensory neuron, a state-space representation of firing rate emerges naturally from Shannon's measure of information. A special case of this formulation predicts a previously unknown inequality governing sensory adaptation, which was confirmed across different modalities, species, and experimental conditions. Further analysis shows that the firing rate behaves like a state function in thermodynamics, leading to an entropy production equation from which a general law follows: any closed cycle of stimulation yields a non-negative net gain of sensory information.
The paper establishes an axiomatic model linking firing rate to sensory entropy, demonstrating a universal law of non-negative information gain.
It employs nonlinear state-space dynamics and validates its predictions across over 400 datasets from diverse sensory modalities.
The framework draws a thermodynamic analogy, unifying sensory adaptation and efficient coding under a common, mathematically principled rule.
Universal Theorem of Sensory Information: Formal Summary and Implications
Foundational Framework
The paper "A universal theorem of sensory information" (2511.11463) establishes a rigorous axiomatic model for sensory information, analogous in structure to the second law of thermodynamics. The concept is centered on the "ideal sensory unit," conceived as a modality-agnostic abstraction of peripheral sensory transduction. In this system, sensory input μ(t) modulates an internal state m(t), which governs the measurable firing rate output F(t). The model is defined via nonlinear, causal state-space dynamics:
$$\dot{m}(t) = g\big(m(t), \mu(t)\big), \qquad F(t) = f\big(m(t), \mu(t)\big)$$
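To make this state-space structure concrete, the sketch below integrates ṁ = g(m, μ) for a step stimulus, using an assumed relaxation form for g (the state drifting toward a stimulus-dependent equilibrium) and a placeholder readout f; neither functional form is taken from the paper.

```python
import numpy as np

# Illustrative sketch of the ideal sensory unit's state-space structure.
# The relaxation form of g and the placeholder readout f are assumptions,
# not the functional forms used in the paper.

def g(m, mu, tau=0.5, a=10.0):
    """dm/dt: internal state relaxes toward an assumed stimulus-dependent equilibrium a*sqrt(mu)."""
    return (a * np.sqrt(mu) - m) / tau

def f(m, mu):
    """Placeholder output nonlinearity mapping (m, mu) to a firing rate."""
    return mu / (mu + 0.1 * m)

dt = 1e-3
t = np.arange(0.0, 3.0, dt)
mu = np.where(t < 1.0, 0.1, 1.0)      # stimulus steps up at t = 1 s
m = np.empty_like(t)
m[0] = 10.0 * np.sqrt(mu[0])          # start at the pre-step equilibrium
for i in range(1, len(t)):
    m[i] = m[i - 1] + dt * g(m[i - 1], mu[i - 1])   # forward-Euler integration of dm/dt = g
F = f(m, mu)                          # transient overshoot followed by adaptation toward a steady state
```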
The principal innovation lies in the direct linkage of firing rate F—the observable physiological correlate—to Shannon entropy H, serving as a measure of uncertainty reduction. Employing Faddeev’s characterization of entropy, the model posits F=kH, where k is a constant.
At its core, estimation of sensory stimuli is formally treated as a noisy measurement process, composed of repeated sampling epochs. The reduction in uncertainty about the mean stimulus is captured by:
$$H = \tfrac{1}{2}\,\log\!\left(\sigma_R^2 + \frac{\sigma^2(\mu)}{m}\right) + \mathrm{const}$$
Here σ²(μ) is the input-dependent stimulus variance, m is the effective sample size, and σ_R² is the representational noise variance.
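A short numerical check of this expression (with made-up parameter values, σ²(μ) ∝ μ as an assumed illustrative fluctuation law, and the additive constant dropped) shows the behaviour the formulation relies on: uncertainty falls as effective samples accumulate and, at fixed m, rises with stimulus variance.

```python
import numpy as np

def H(m, mu, sigma_R2=0.01, c=1.0):
    """Residual uncertainty about the mean stimulus after m effective samples.
    sigma^2(mu) = c * mu is an assumed, illustrative fluctuation law; the additive
    constant in the expression above is omitted."""
    return 0.5 * np.log(sigma_R2 + c * mu / m)

k = 1.0                                    # F = k * H ties the firing rate to entropy
for m in (1, 4, 16, 64):
    print(m, k * H(m, mu=1.0))             # H (and hence F) falls as sampling accumulates
print(k * H(4, mu=0.1), k * H(4, mu=1.0))  # at fixed m, a stronger (more variable) stimulus raises H
```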
Derivation of the Sensory Information Inequality
A crucial aspect of the model is the monotonic relationship between input intensity, stimulus variability, and equilibrium sampling effort—each scaling according to empirically validated fluctuation laws. The firing rate behaves as a nonlinear relaxation toward a stimulus-dependent steady state, with adaptation trajectories naturally admitting a general logarithmic form.
A key mathematical discovery is the following inequality governing the steady-state response (SS) of an adapting neuron, relating SS to its peak (PR) and spontaneous rates (SR):
$$\sqrt{PR \cdot SR} \;\le\; SS \;\le\; \frac{PR + SR}{2}$$
This relationship—emerging directly from first principles—was shown to hold robustly across >400 individual datasets from multiple sensory modalities, species, and experimental conditions, with minimal empirical violation; the model thus achieves a degree of universality rarely encountered in biological systems.
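Because the bound involves no free parameters, it can be checked directly on any (PR, SR, SS) triple; the snippet below uses hypothetical firing rates chosen for illustration, not values from the paper's datasets.

```python
import math

def satisfies_bound(pr, sr, ss, tol=1e-9):
    """Check the adaptation inequality sqrt(PR*SR) <= SS <= (PR + SR)/2."""
    return math.sqrt(pr * sr) - tol <= ss <= (pr + sr) / 2 + tol

# Hypothetical example: spontaneous 10 Hz, peak 80 Hz, adapted steady state 35 Hz.
print(satisfies_bound(pr=80.0, sr=10.0, ss=35.0))   # True: sqrt(800) ~ 28.3 <= 35 <= 45
print(satisfies_bound(pr=80.0, sr=10.0, ss=60.0))   # False: 60 exceeds the arithmetic-mean bound of 45
```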
Building upon this, the paper formalizes and proves a universal theorem for the accumulation of sensory information over cyclic stimulation:
$$\oint_C dI \;\ge\; 0$$
where I denotes sensory information, defined as the net reduction in Shannon uncertainty due to adaptation. Explicit geometric and algebraic proofs demonstrate that for any closed stimulus-response cycle, the cumulative information gain is non-negative, provided standard monotonicity conditions on the variance and equilibrium sampling are satisfied.
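To see how the monotonicity conditions enter, one can work through the simplest special case by hand: a rectangular on-off cycle between a weak stimulus μ₀ and a strong stimulus μ₁ ≥ μ₀, with the sampling state moving between the corresponding equilibria m₀ ≤ m₁. Reading dI = −(∂H/∂m) dm as the uncertainty removed by the adaptive change of m at fixed stimulus (a sketch under the entropy expression above, not the paper's general proof), the stimulus jumps contribute nothing (dm = 0 there) and the two relaxation legs give

$$
\oint_C dI \;=\; -\oint_C \frac{\partial H}{\partial m}\,dm
\;=\; \int_{m_0}^{m_1}\!\left(\left|\frac{\partial H}{\partial m}\right|_{\mu_1} - \left|\frac{\partial H}{\partial m}\right|_{\mu_0}\right)dm \;\ge\; 0,
$$

since ∂H/∂m = −σ²(μ) / [2m(mσ_R² + σ²(μ))] is negative with a magnitude that grows with σ²(μ), and σ²(μ₁) ≥ σ²(μ₀) by the assumed monotonicity. The weak-stimulus return leg gives back some uncertainty, but less than the strong-stimulus leg removed, which is the directionality the general theorem formalizes.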
Thermodynamic Analogy and State Function Properties
The core structure of the model is shown to be formally analogous to statistical physics, particularly the entropy balance equations of thermodynamics. Firing rate F(t) at equilibrium is a state function: its value is path-independent and solely determined by current stimulus parameters, not by transition trajectories—a property confirmed by classical and contemporary single-unit recordings.
An entropy balance equation is constructed:
$$dH \;=\; \frac{\partial H}{\partial \mu}\,d\mu \;+\; \frac{\partial H}{\partial m}\,dm \;=\; \delta H_{\mathrm{flux}} + \delta H_{\mathrm{relax}}$$
In this formulation, the net entropy change over a closed cycle vanishes (∮ dH = 0), yet the relaxation term consistently generates non-negative net sensory information (∮ −δH_relax ≥ 0). The analogy, while conceptual, reveals an underlying directional principle governing both physical and sensory information processing: adaptation and sampling drive systems toward states of greater information, that is, reduced uncertainty.
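The same illustrative model used above (assumed σ²(μ) = μ and equilibrium sampling ∝ √μ; no parameter values come from the paper) makes both statements easy to verify numerically: summing the flux and relaxation pieces of dH around one on-off cycle returns approximately zero, while the accumulated −δH_relax is strictly positive.

```python
import numpy as np

# Numerical sketch of the entropy balance over one on-off stimulus cycle.
# H and the relaxation dynamics use the same illustrative assumptions as above.
sigma_R2, tau, a = 0.01, 0.5, 10.0
H = lambda m, mu: 0.5 * np.log(sigma_R2 + mu / m)     # assumed sigma^2(mu) = mu

dt = 1e-3
t = np.arange(0.0, 20.0, dt)
mu = np.where((t >= 5.0) & (t < 10.0), 1.0, 0.1)      # weak -> strong -> weak (closed cycle)
m = np.empty_like(t)
m[0] = a * np.sqrt(mu[0])                             # start at the weak-stimulus equilibrium
for i in range(1, len(t)):
    m[i] = m[i - 1] + dt * (a * np.sqrt(mu[i - 1]) - m[i - 1]) / tau

# Split each small change of H into a flux part (stimulus change at fixed state)
# and a relaxation part (state change at fixed stimulus).
flux  = H(m[:-1], mu[1:]) - H(m[:-1], mu[:-1])
relax = H(m[1:],  mu[1:]) - H(m[:-1], mu[1:])

print("cyclic integral of dH        :", (flux + relax).sum())   # ~0: H behaves as a state function
print("cyclic integral of -dH_relax :", (-relax).sum())         # > 0: net sensory information gain
```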
Experimental Validation and Predictive Scope
The theoretical predictions are substantiated by an unparalleled breadth of experimental evidence. The adaptation inequality manifests in auditory, tactile, proprioceptive, visual, olfactory, gustatory, thermoceptive, and electroreceptive modalities, across vertebrate and invertebrate taxa. Notably, the predicted bounds on firing rates do not require parameter fitting, and empirical responses adhere tightly to model predictions—even data from Adrian and Zotterman’s century-old recordings comply.
This universality extends to temporal adaptation profiles and offers a basis for experimental verification. The paper demonstrates that stimulus cycles (on-off paradigms) yield strictly non-negative total "informational entropy production," with measurable changes directly linking firing rate phases (PR, SS, TR, SR) across adaptation.
Implications and Theoretical Considerations
The proven sensory information theorem imposes an intrinsic constraint on all sensory systems: net uncertainty must decrease over cyclic stimulus trajectories. This principle implies that adaptation, habituation, gain control, and perceptual transitions are governed by a directionality in information processing, tightly coupled to the statistical structure of natural signals. The framework is distinct from—yet compatible with—efficient coding, predictive coding, and the free-energy principle, as it eschews specific coding hypotheses and focuses exclusively on observable physiological variables and generic sampling properties.
Practically, the theorem implies that perceptual decision thresholds and reaction times can be interpreted in terms of minimal requisite information gain, unifying classical psychophysical laws (Weber, Piéron, Bloch) under a mathematically principled umbrella. The generality of the approach suggests it may extend to affective and behavioral phenomena, supporting broader theoretical integration.
Future Directions
Future research may exploit the explicit computability of entropy production in sensory processing to probe non-equilibrium neural phenomena, including rapid adaptation, path dependence, and large-scale behavioral correlates. The formal equivalence with thermodynamic state functions opens pathways to rigorous cross-disciplinary analogies, potentially informing synthetic sensory systems, optimizing information acquisition in artificial agents, and elucidating limits of biological perception. Extensions to multi-unit and central sensory systems, as well as integration with non-classical signal statistics and power-law adaptation regimes, are natural next steps.
Conclusion
This work establishes a mathematically principled, experimentally validated theorem governing the acquisition of sensory information in biological systems. Employing a generalized state-space model, the paper demonstrates that all cyclic sensory processes yield non-negative information gain, formally mirroring the second law of thermodynamics. The approach advances the field from descriptive phenomenology to universally applicable laws, providing a quantitative foundation for both the analysis and design of sensory systems.