Fundamental Limits of Prediction, Generalization, and Recursion: An Entropic-Innovations Perspective (2001.03813v4)
Abstract: In this paper, we examine the fundamental performance limits of prediction, with or without side information. More specifically, we derive generic lower bounds on the $\mathcal{L}_p$ norms of the prediction errors that are valid for any prediction algorithm and any data distribution. We then combine the entropic analysis from information theory with the innovations approach from prediction/estimation theory to characterize the conditions (in terms of, e.g., directed information or mutual information) under which the bounds are achieved. We also investigate the implications of these results for the fundamental limits of generalization in fitting (learning) problems, viewed as prediction with side information, as well as the fundamental limits of recursive algorithms, viewed as generalized prediction problems.
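As a rough numerical illustration of the flavor of such bounds, the sketch below checks the classical maximum-entropy inequality $(\mathbb{E}|e|^p)^{1/p} \ge e^{h(e)} / \big(2\,\Gamma(1+1/p)\,(pe)^{1/p}\big)$, where $h(e)$ is the differential entropy of the prediction error $e$ in nats, against Monte Carlo estimates. The inequality follows from the fact that the generalized Gaussian maximizes entropy for a fixed $p$-th absolute moment; the Gaussian error model and the specific setup here are illustrative assumptions, not details taken from the paper.

```python
import math
import numpy as np

def entropic_lp_lower_bound(h, p):
    """Maximum-entropy lower bound on the L_p norm of a random variable
    with differential entropy h (in nats):
        (E|e|^p)^(1/p) >= exp(h) / (2 * Gamma(1 + 1/p) * (p*e)^(1/p)).
    Equality holds iff e is generalized Gaussian with shape parameter p.
    """
    return math.exp(h) / (2.0 * math.gamma(1.0 + 1.0 / p) * (p * math.e) ** (1.0 / p))

rng = np.random.default_rng(0)
sigma = 1.5
e = rng.normal(0.0, sigma, size=1_000_000)  # stand-in for a prediction error
h_gauss = 0.5 * math.log(2.0 * math.pi * math.e * sigma**2)  # Gaussian entropy in nats

for p in (1.0, 2.0, 4.0):
    lp_norm = np.mean(np.abs(e) ** p) ** (1.0 / p)  # Monte Carlo (E|e|^p)^(1/p)
    bound = entropic_lp_lower_bound(h_gauss, p)
    print(f"p={p}: L_p norm ~= {lp_norm:.4f}, entropic lower bound = {bound:.4f}")
    # The bound is tight at p=2, since the Gaussian is the max-entropy
    # distribution for a fixed second moment; for other p it is strict.
```

Running this shows the Monte Carlo $\mathcal{L}_p$ norm sitting above the entropic bound for every $p$, with equality (up to sampling error) at $p=2$, consistent with the paper's theme that tightness hinges on the distribution of the innovation.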