
Summary of Information Theoretic Quantities

Published 8 Jan 2015 in q-bio.NC (arXiv:1501.01854v1)

Abstract: Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and its capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. As a framework it has a number of useful properties: it provides a general measure sensitive to any relationship, not only linear effects; its quantities have meaningful units, which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single experimental trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are in common use in neuroscience, including the Shannon entropy, the Kullback-Leibler divergence, and the mutual information. In this entry, we introduce and define these quantities. Further details on how these quantities can be estimated in practice are provided in the entry "Estimation of Information-Theoretic Quantities", and examples of the application of these techniques in neuroscience can be found in the entry "Applications of Information-Theoretic Quantities in Neuroscience".
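The three quantities named in the abstract have simple closed forms for discrete distributions. As a minimal sketch (not taken from the paper, which defers practical estimation to the companion entry), the following Python snippet computes the Shannon entropy, the Kullback-Leibler divergence, and the mutual information in bits; the example joint distribution at the end is hypothetical, standing in for a stimulus-response table.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_x p(x) log2(p(x)/q(x)).
    Assumes q(x) > 0 wherever p(x) > 0 (absolute continuity)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(joint):
    """Mutual information I(X;Y), computed as the KL divergence between
    the joint p(x, y) and the product of its marginals p(x) p(y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)  # marginal over rows (X)
    py = joint.sum(axis=0, keepdims=True)  # marginal over columns (Y)
    return kl_divergence(joint.ravel(), (px * py).ravel())

# Hypothetical noisy binary channel between stimulus X and response Y
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(entropy(joint.sum(axis=1)))  # H(X) = 1.0 bit
print(mutual_information(joint))   # I(X;Y) ~= 0.278 bits
```

Note that I(X;Y) equals H(X) only for a noiseless channel; here the 0.1 off-diagonal mass reduces the information conveyed from 1 bit to about 0.278 bits per trial.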
