Diagrammatic expansion for the mutual-information rate in the realm of limited statistics (2504.06255v1)
Abstract: Neurons in sensory systems encode stimulus information into their stochastic spiking response. Mutual information has been broadly applied to these systems to quantify the neurons' capacity to transmit such information. Yet, while for discrete stimuli, like flashed images or single tones, its computation is straightforward, for dynamical stimuli it is necessary to compute a (mutual) information rate (MIR), thereby integrating over the multiple temporal correlations that characterize sensory systems. Previous methods are based on extensive sampling of the neuronal response, require large amounts of data, and are therefore prone to biases and inaccuracy. Here, we develop Moba-MIRA (moment-based mutual-information-rate approximation), a computational method to estimate the mutual information rate. To derive Moba-MIRA, we use Feynman diagrams to expand the mutual information to arbitrary order in the correlations around the corresponding value for the empirical spike-count distributions of single bins. As a result, only the empirical estimates of the pairwise correlations between time bins and the single-bin entropies are required, without the need for the whole joint probability distributions. We tested Moba-MIRA on synthetic data generated with generalized linear models, and showed that it requires only a few tens of stimulus repetitions to provide an accurate estimate of the information rate. Finally, we applied it to ex-vivo electrophysiological recordings of rat retina, obtaining rates ranging from 5 to 20 bits per second, consistent with earlier estimates.
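The core idea, estimating a joint entropy from single-bin entropies plus a pairwise-correlation correction rather than from the full joint distribution, can be illustrated with a minimal sketch. This is not the paper's diagrammatic expansion: the correction term below is a Gaussian-style second-order approximation, `0.5 * log2 det(R)` with `R` the empirical correlation matrix, chosen here only as a hypothetical stand-in for the pairwise term, and all function names are illustrative.

```python
import numpy as np

def single_bin_entropy(counts):
    """Empirical entropy (bits) of the spike counts observed in one time bin."""
    _, freq = np.unique(counts, return_counts=True)
    p = freq / freq.sum()
    return -np.sum(p * np.log2(p))

def joint_entropy_estimate(X):
    """Approximate joint entropy of binned responses X (trials x bins).

    Sum of single-bin entropies plus a Gaussian-style pairwise correction,
    0.5 * log2 det(R), where R is the inter-bin correlation matrix.
    Correlated bins make det(R) < 1, lowering the estimate below the
    independent-bin sum, as expected. (Illustrative approximation, not
    the paper's Feynman-diagram expansion.)
    """
    n_bins = X.shape[1]
    H_ind = sum(single_bin_entropy(X[:, j]) for j in range(n_bins))
    R = np.corrcoef(X, rowvar=False)
    sign, logdet = np.linalg.slogdet(R)
    return H_ind + 0.5 * logdet / np.log(2)
```

Only the marginals and the pairwise correlation matrix enter the estimate, which is why such moment-based approaches need far fewer trials than a direct sampling of the joint distribution over all time bins.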