Inverted Inference and Recursive Bootstrapping: A Primal-Dual Theory of Structured Cognition (2404.01183v3)
Abstract: This paper introduces a unifying framework that links the Context-Content Uncertainty Principle (CCUP) with optimal transport (OT) via primal-dual inference. We propose that cognitive representations are not static encodings but active dual constraints that shape feasible manifolds for learning and inference. Cognition is formalized as the dynamic alignment of high-entropy contexts with low-entropy content, implemented through cycle-consistent inference that minimizes conditional entropy. Central to this framework is the concept of inverted inference: a goal-driven mechanism that reverses the direction of conditioning to simulate latent trajectories consistent with internal goals. This asymmetric inference cycle closes the duality gap in constrained optimization, aligning context (primal variables) with content (dual constraints), and reframing inference as structure-constrained entropy minimization. Temporally, we introduce recursive bootstrapping, where each inference cycle sharpens the structural manifold for the next, forming memory chains that support path-dependent optimization and hierarchical goal decomposition. Spatially, we extend the model via hierarchical spatial bootstrapping, connecting to Hierarchical Navigable Small World (HNSW) graphs to enable sublinear retrieval of goal-consistent latent states. Altogether, this framework provides a computational theory of cognition in which dynamic alignment across time and space supports efficient generalization, abstraction, and adaptive planning. CCUP emerges as a scalable principle for both slow, recursive reasoning and fast, structure-aware recognition through layered primal-dual cycles.
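The abstract packs several mechanisms into a few sentences: forward inference conditions low-entropy content on high-entropy context, inverted inference runs the conditioning in reverse to simulate goal-consistent contexts, and recursive bootstrapping feeds each sharpened posterior back in as the next prior. The toy sketch below is one minimal reading of that cycle, not the paper's algorithm: the discrete Bayes model, the posterior-squaring "sharpening" rule, and all function and variable names are illustrative assumptions introduced here.

```python
# Toy sketch of the recursive bootstrapping cycle sketched in the abstract.
# NOTE: the distributions, the sharpening rule, and all names below are
# illustrative assumptions for this sketch, not the paper's formulation.
import numpy as np

rng = np.random.default_rng(0)

n_context, n_content = 8, 4  # high-entropy context space, low-entropy content space
likelihood = rng.dirichlet(np.ones(n_context), size=n_content)  # p(context | content)
prior = np.full(n_content, 1.0 / n_content)                     # p(content)

def forward_inference(context_obs, prior):
    """Context -> content: Bayes update giving p(content | observed context)."""
    post = prior * likelihood[:, context_obs]
    return post / post.sum()

def inverted_inference(goal_belief):
    """Content -> context: simulate the context distribution implied by a goal."""
    return goal_belief @ likelihood  # marginal p(context) under the goal belief

def conditional_entropy(posterior):
    """Entropy (nats) of the content belief given the current context."""
    p = posterior[posterior > 0]
    return -(p * np.log(p)).sum()

goal = np.array([0.0, 0.0, 0.0, 1.0])  # desired low-entropy content state
belief = prior.copy()
for cycle in range(5):
    simulated_context = inverted_inference(goal)       # goal-driven simulation
    context_obs = int(np.argmax(simulated_context))    # most goal-consistent context
    belief = forward_inference(context_obs, belief)    # align context with content
    # Stand-in for "sharpening the structural manifold": square and renormalize,
    # then reuse the sharpened posterior as the next cycle's prior.
    belief = belief ** 2
    belief /= belief.sum()
    print(f"cycle {cycle}: H(content | context) = {conditional_entropy(belief):.3f}")
```

Running the loop prints a conditional entropy that shrinks across cycles, which is the behavior the abstract attributes to cycle-consistent inference; the paper's actual primal-dual update and its HNSW-based retrieval of latent states are not reproduced here.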