Information-Theoretic Bounds on the Moments of the Generalization Error of Learning Algorithms (2102.02016v2)
Published 3 Feb 2021 in cs.IT, cs.LG, math.IT, and stat.ML
Abstract: Generalization error bounds are critical to understanding the performance of machine learning models. In this work, building upon a new bound on the expected value of an arbitrary function of the population and empirical risk of a learning algorithm, we offer a more refined analysis of the generalization behaviour of machine learning models based on a characterization of (bounds on) their generalization error moments. We discuss how the proposed bounds -- which also encompass new bounds on the expected generalization error -- relate to existing bounds in the literature. We also discuss how the proposed generalization error moment bounds can be used to construct new high-probability generalization error bounds.
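To make the last step concrete, here is a minimal sketch (in notation introduced for illustration, not necessarily that of the paper) of the standard route from a moment bound to a high-probability bound: if the t-th absolute moment of the generalization error gen(W, S) = L_mu(W) - L_S(W) is controlled by some quantity m_t, Markov's inequality converts that control into a tail bound.

% Illustrative derivation (assumed notation: gen(W,S), L_mu, L_S, m_t, epsilon):
\[
\mathrm{gen}(W, S) \;=\; L_{\mu}(W) - L_{S}(W),
\qquad
\mathbb{E}\!\left[\,|\mathrm{gen}(W, S)|^{t}\,\right] \;\le\; m_{t}
\quad \text{for some } t \ge 1.
\]
\[
\mathbb{P}\!\left(\,|\mathrm{gen}(W, S)| \ge \epsilon\,\right)
\;=\;
\mathbb{P}\!\left(\,|\mathrm{gen}(W, S)|^{t} \ge \epsilon^{t}\,\right)
\;\le\;
\frac{m_{t}}{\epsilon^{t}}
\qquad \text{for any } \epsilon > 0.
\]

Optimizing over the available moment orders t then yields the tightest high-probability statement supported by the moment bounds; the paper's contribution lies in supplying information-theoretic bounds on the moments m_t themselves.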