Implicit embedding of prior probabilities in optimally efficient neural populations (1209.5006v1)

Published 22 Sep 2012 in q-bio.NC and physics.bio-ph

Abstract: We examine how the prior probability distribution of a sensory variable in the environment influences the optimal allocation of neurons and spikes in a population that represents that variable. We start with a conventional response model, in which the spikes of each neuron are drawn from a Poisson distribution with a mean rate governed by an associated tuning curve. For this model, we approximate the Fisher information in terms of the density and amplitude of the tuning curves, under the assumption that tuning width varies inversely with cell density. We consider a family of objective functions based on the expected value, over the sensory prior, of a functional of the Fisher information. This family includes lower bounds on mutual information and perceptual discriminability as special cases. For all cases, we obtain a closed form expression for the optimum, in which the density and gain of the cells in the population are power law functions of the stimulus prior. Thus, the allocation of these resources is uniquely specified by the prior. Since perceptual discriminability may be expressed directly in terms of the Fisher information, it too will be a power law function of the prior. We show that these results hold for tuning curves of arbitrary shape and correlated neuronal variability. This framework thus provides direct and experimentally testable predictions regarding the relationship between sensory priors, tuning properties of neural representations, and perceptual discriminability.
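For the conventional response model in the abstract (independent Poisson neurons with tuning curves h_n), the Fisher information is J(s) = Σ_n h_n'(s)² / h_n(s), and discrimination thresholds scale as 1/√J(s). The sketch below is a minimal illustration, not the authors' code: it places Gaussian tuning curves with density proportional to an assumed Gaussian prior (widths inversely proportional to that density, constant gain), computes J(s), and checks that the resulting threshold tracks an inverse power of the prior. The prior, the gain, and all parameter values are illustrative assumptions.

```python
import numpy as np

N = 50                                   # number of neurons (assumed)
s = np.linspace(-4, 4, 2001)             # stimulus axis
prior = np.exp(-0.5 * s**2) / np.sqrt(2 * np.pi)   # assumed Gaussian prior p(s)

# Warp the stimulus axis by the prior's CDF so equal numbers of neurons
# cover equal prior mass, i.e. cell density proportional to p(s).
cdf = np.cumsum(prior)
cdf /= cdf[-1]
centers_u = (np.arange(N) + 0.5) / N     # uniformly spaced centers in CDF space
centers = np.interp(centers_u, cdf, s)   # mapped back to stimulus space

# Tuning width inversely proportional to local cell density; constant gain.
gain = 30.0                              # spikes/s, illustrative
widths = 1.0 / (N * np.interp(centers, s, prior))

def rates(stim):
    """Mean firing rates h_n(stim) for Gaussian tuning curves."""
    return gain * np.exp(-0.5 * ((stim[:, None] - centers[None, :]) / widths[None, :])**2)

# Fisher information for independent Poisson neurons: J(s) = sum_n h_n'(s)^2 / h_n(s)
h = rates(s)
dh = np.gradient(h, s, axis=0)
J = np.sum(dh**2 / np.maximum(h, 1e-12), axis=1)

# Discrimination threshold ~ 1/sqrt(J); with density ~ p(s) it should vary
# (approximately) as a power of 1/p(s), as the closed-form result predicts.
delta = 1.0 / np.sqrt(J)
interior = slice(100, -100)              # avoid edge effects of the finite population
print("correlation of log-threshold with -log prior:",
      np.corrcoef(np.log(delta[interior]), -np.log(prior[interior]))[0, 1])
```

Plotting log δ(s) against log p(s) over the interior of the stimulus range should show an approximately linear relationship, which is the experimentally testable power-law prediction the abstract refers to.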
