Jensen-Shannon Information Based Characterization of the Generalization Error of Learning Algorithms (2010.12664v2)
Published 23 Oct 2020 in cs.IT, math.IT, math.ST, stat.ML, and stat.TH
Abstract: Generalization error bounds are critical to understanding the performance of machine learning models. In this work, we propose a new information-theoretic upper bound on the generalization error that applies to supervised learning scenarios. We show that our general bound specializes to recover several previously known bounds, and that under certain conditions it yields a new bound involving the Jensen-Shannon information between a random variable modelling the set of training samples and another random variable modelling the hypothesis. We further prove that, under some conditions, our bound can be tighter than mutual information-based bounds.
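For context, a minimal sketch of the quantities the abstract refers to; the definitions below are standard in the information-theoretic generalization literature and are given as background, not as formulas quoted from the paper. Here S denotes the training set, W the learned hypothesis, D the Kullback-Leibler divergence, and n the sample size:

% Jensen-Shannon information: the JS divergence between the joint
% distribution of (S, W) and the product of its marginals.
I_{\mathrm{JS}}(S;W) = D_{\mathrm{JS}}\big(P_{S,W} \,\|\, P_S \otimes P_W\big),
\qquad
D_{\mathrm{JS}}(P \,\|\, Q) = \tfrac{1}{2} D\Big(P \,\Big\|\, \tfrac{P+Q}{2}\Big)
                            + \tfrac{1}{2} D\Big(Q \,\Big\|\, \tfrac{P+Q}{2}\Big).

% For comparison, the classical mutual-information bound of Xu and
% Raginsky (2017) for a \sigma-sub-Gaussian loss takes the form
\big|\mathbb{E}[\mathrm{gen}(S,W)]\big| \le \sqrt{\tfrac{2\sigma^2}{n}\, I(S;W)}.

One relevant known fact: the Jensen-Shannon divergence is always at most log 2, so I_JS(S;W) remains finite even when the mutual information I(S;W) diverges (e.g., for a deterministic algorithm on continuous data). This is consistent with, though not a proof of, the abstract's claim that a JS-based bound can be tighter than mutual information-based bounds in some regimes.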