Mutual Information and Conditional Mean Prediction Error (1407.7165v1)

Published 26 Jul 2014 in cs.IT, math.IT, math.PR, math.ST, physics.bio-ph, physics.data-an, and stat.TH

Abstract: Mutual information is fundamentally important for measuring statistical dependence between variables and for quantifying information transfer by signaling and communication mechanisms. It can, however, be challenging to evaluate for physical models of such mechanisms and to estimate reliably from data. Furthermore, its relationship to better known statistical procedures is still poorly understood. Here we explore new connections between mutual information and regression-based dependence measures, $\nu^{-1}$, that utilise the determinant of the second-moment matrix of the conditional mean prediction error. We examine convergence properties as $\nu\rightarrow0$ and establish sharp lower bounds on mutual information and capacity of the form $\log(\nu^{-1/2})$. The bounds are tighter than lower bounds based on the Pearson correlation and ones derived using average mean-square-error rate-distortion arguments. Furthermore, their estimation is feasible using techniques from nonparametric regression. As an illustration we provide bootstrap confidence intervals for the lower bounds which, through use of a composite estimator, substantially improve upon inference about mutual information based on $k$-nearest neighbour estimators alone.
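The lower bound $\log(\nu^{-1/2})$ can be sketched numerically in the scalar case, where $\nu$ reduces to the conditional mean prediction-error variance normalised by the variance of $Y$. The sketch below is an illustrative assumption, not the paper's estimator: it uses a simple $k$-nearest-neighbour moving average in place of the paper's nonparametric regression, and jointly Gaussian data so the bound can be compared with the known mutual information $-\tfrac{1}{2}\log(1-\rho^2)$, for which the bound is tight.

```python
import numpy as np

rng = np.random.default_rng(0)

# Jointly Gaussian (X, Y) with correlation rho; true MI = -0.5*log(1 - rho^2).
rho = 0.8
n = 4000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Crude k-NN regression estimate of the conditional mean E[Y | X]:
# sort by x and average y over a sliding window of k neighbours.
k = 50
order = np.argsort(x)
xs, ys = x[order], y[order]
cond_mean = np.empty(n)
for i in range(n):
    lo = max(0, min(i - k // 2, n - k))
    cond_mean[i] = ys[lo:lo + k].mean()

# nu: second moment of the conditional mean prediction error,
# normalised by Var(Y) (scalar analogue of the determinant ratio).
mmse = np.mean((ys - cond_mean) ** 2)
nu = mmse / np.var(ys)

bound = 0.5 * np.log(1.0 / nu)        # log(nu^{-1/2})
true_mi = -0.5 * np.log(1 - rho**2)
print(f"lower bound: {bound:.3f}, true MI: {true_mi:.3f}")
```

For Gaussian data the bound recovers the mutual information up to regression error; for non-Gaussian channels it remains a valid lower bound, which is what makes it useful alongside $k$-nearest-neighbour MI estimators.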
