Formal limitations of sample-wise information-theoretic generalization bounds (2205.06915v2)
Published 13 May 2022 in cs.LG and stat.ML
Abstract: Some of the tightest information-theoretic generalization bounds depend on the average mutual information between the learned hypothesis and a single training example. However, these sample-wise bounds were derived only for the expected generalization gap. We show that even for the expected squared generalization gap, no such sample-wise information-theoretic bounds exist. The same holds for PAC-Bayes and single-draw bounds. Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.
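
For context, a representative sample-wise bound of the kind the abstract refers to (in the spirit of Bu, Zou, and Veeravalli, 2020; the notation below is assumed rather than taken from this paper) controls the expected generalization gap of a hypothesis $W$ trained on $S = (Z_1, \dots, Z_n)$, for a $\sigma$-sub-Gaussian loss, through the mutual information with each individual example:

\[
\mathbb{E}\big[L_\mu(W) - L_S(W)\big] \;\le\; \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^2\, I(W; Z_i)}.
\]

The paper's negative result is that no bound built from such single-example terms $I(W; Z_i)$ can control the expected squared gap, or hold in the PAC-Bayes or single-draw senses, whereas bounds depending on pairwise quantities such as $I(W; Z_i, Z_j)$ do.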