Hierarchical Models: Intrinsic Separability in High Dimensions (2003.07770v1)
Abstract: It has long been noticed that high-dimensional data exhibit strange patterns. These patterns have been variously interpreted as either a "blessing" or a "curse", causing uncomfortable inconsistencies in the literature. We propose that they arise from an intrinsically hierarchical generative process. Modeling this process creates a web of constraints that reconciles many different theories and results. The model also implies that high-dimensional data possess an innate separability that can be exploited for machine learning. We demonstrate how this permits the open-set learning problem to be defined mathematically, leading to qualitative and quantitative improvements in performance.
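To give a rough feel for the separability claim, the following minimal sketch (not the paper's model) uses an assumed two-class Gaussian toy process with a fixed per-coordinate mean gap and checks how a nearest-centroid classifier behaves as the dimension grows; the function name `separability_demo` and all parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def separability_demo(d, n=500, gap=0.1, sigma=1.0):
    """Toy two-class data: class means differ by `gap` in every coordinate,
    samples are isotropic Gaussian around their class mean.
    Returns held-out accuracy of a nearest-centroid classifier."""
    mu0 = np.zeros(d)
    mu1 = np.full(d, gap)
    X0 = rng.normal(mu0, sigma, size=(n, d))
    X1 = rng.normal(mu1, sigma, size=(n, d))
    # Estimate centroids on half the data, evaluate on the other half.
    c0 = X0[: n // 2].mean(axis=0)
    c1 = X1[: n // 2].mean(axis=0)
    test = np.vstack([X0[n // 2:], X1[n // 2:]])
    labels = np.repeat([0, 1], n - n // 2)
    d0 = np.linalg.norm(test - c0, axis=1)
    d1 = np.linalg.norm(test - c1, axis=1)
    pred = (d1 < d0).astype(int)
    return (pred == labels).mean()

for d in [2, 10, 100, 1000, 10000]:
    print(f"dim={d:6d}  accuracy={separability_demo(d):.3f}")
```

With the per-coordinate gap held fixed, the between-class distance grows like the square root of the dimension while distances concentrate, so accuracy climbs toward 1 as `d` increases; this is only a standard concentration-of-measure illustration of why separability can improve with dimension, not the hierarchical construction developed in the paper.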