Sub-Gaussian Error Bounds for Hypothesis Testing

Published 1 Jan 2021 in cs.IT, math.IT, math.ST, and stat.TH | (2101.00136v1)

Abstract: We interpret likelihood-based test functions from a geometric perspective, where the Kullback-Leibler (KL) divergence is adopted to quantify the distance from one distribution to another. Such a test function can be seen as a sub-Gaussian random variable, and we propose a principled way to calculate its corresponding sub-Gaussian norm. Then an error bound for binary hypothesis testing can be obtained in terms of the sub-Gaussian norm and the KL divergence, which is more informative than Pinsker's bound when the significance level is prescribed. For $M$-ary hypothesis testing, we also derive an error bound which is complementary to Fano's inequality by being more informative when the number of hypotheses or the sample size is not large.
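As background for the abstract's claim, the classical baseline it improves on is Pinsker's inequality, which bounds the total variation distance by the KL divergence: $\mathrm{TV}(P, Q) \le \sqrt{D(P\|Q)/2}$. The sketch below (not the paper's new sub-Gaussian bound, which requires the proposed sub-Gaussian norm) simply checks this baseline numerically for two discrete distributions:

```python
import math

def kl_divergence(p, q):
    """KL divergence D(p || q), in nats, for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance between two discrete distributions."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def pinsker_bound(p, q):
    """Right-hand side of Pinsker's inequality: sqrt(D(p || q) / 2)."""
    return math.sqrt(kl_divergence(p, q) / 2)

# Two nearby Bernoulli distributions (illustrative values, not from the paper)
p = [0.6, 0.4]
q = [0.5, 0.5]

tv = total_variation(p, q)       # exactly 0.1 here
bound = pinsker_bound(p, q)
assert tv <= bound               # Pinsker's inequality holds
```

Pinsker's bound is distribution-free in the sense that it depends only on the KL divergence; the paper's contribution is a sharper bound that additionally exploits the sub-Gaussian norm of the log-likelihood ratio when the significance level is prescribed.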
