Causal relationship between KAG and neural collapse
Determine whether Kolmogorov-Arnold geometry (KAG) in the early layers of a neural network enables neural collapse in later layers or, conversely, whether the formation of collapsed final-layer representations induces KAG in earlier layers, by analyzing the joint evolution of Jacobian-based geometric structure across all layers during training.
It's tempting to speculate that these phenomena are connected: KAG in early layers may provide the structured substrate that enables neural collapse in later layers, or, conversely, the pull toward collapsed final-layer representations may induce geometric organization earlier in the network. These remain open questions that require careful analysis of the joint evolution of geometry across all layers during training.
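One way to begin such an analysis is to log, at several points during training, a Jacobian-based geometry proxy for an early layer alongside a standard neural-collapse indicator for the penultimate layer, and then compare their relative timing. The sketch below is a minimal illustration under stated assumptions, not a prescribed protocol: the toy Gaussian-mixture data, the small MLP, the use of Jacobian effective rank as the geometry proxy, and the simple within-to-between-class scatter ratio (NC1) are all illustrative choices.

```python
# Minimal sketch: track a Jacobian-based geometry proxy (early layer) and an
# NC1-style collapse metric (penultimate layer) over training. All sizes and
# choices here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Synthetic Gaussian-mixture classification problem (hypothetical sizes).
num_classes, dim_in, n_per_class = 4, 20, 64
means = 3.0 * torch.randn(num_classes, dim_in)
x = torch.cat([means[c] + torch.randn(n_per_class, dim_in) for c in range(num_classes)])
y = torch.arange(num_classes).repeat_interleave(n_per_class)

widths = [dim_in, 64, 64, 64, num_classes]
layers = nn.ModuleList(nn.Linear(widths[i], widths[i + 1]) for i in range(len(widths) - 1))

def forward_with_features(inp):
    """Forward pass that also returns post-activation features of each hidden layer."""
    feats, h = [], inp
    for i, layer in enumerate(layers):
        h = layer(h)
        if i < len(layers) - 1:
            h = torch.relu(h)
            feats.append(h)
    return h, feats

def jacobian_effective_rank(inputs, layer_idx, n_samples=8):
    """Geometry proxy: effective rank (exp of spectral entropy) of the input-to-layer
    Jacobian, averaged over a few inputs. Lower values indicate that the layer locally
    compresses inputs onto a lower-dimensional structure."""
    ranks = []
    for x0 in inputs[:n_samples]:
        J = torch.autograd.functional.jacobian(
            lambda v: forward_with_features(v.unsqueeze(0))[1][layer_idx].squeeze(0), x0
        )
        s = torch.linalg.svdvals(J)
        p = s / s.sum()
        ranks.append(torch.exp(-(p * torch.log(p + 1e-12)).sum()).item())
    return sum(ranks) / len(ranks)

def nc1(feats, labels):
    """Simple neural-collapse indicator: trace of within-class scatter divided by
    trace of between-class scatter (smaller means stronger collapse)."""
    mu_g = feats.mean(0)
    within = between = 0.0
    for c in labels.unique():
        fc = feats[labels == c]
        mu_c = fc.mean(0)
        within = within + ((fc - mu_c) ** 2).sum()
        between = between + len(fc) * ((mu_c - mu_g) ** 2).sum()
    return (within / between).item()

opt = torch.optim.Adam(layers.parameters(), lr=1e-2)
for step in range(501):
    logits, feats = forward_with_features(x)
    loss = F.cross_entropy(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 100 == 0:
        with torch.no_grad():
            collapse = nc1(feats[-1], y)  # penultimate-layer features
        early_rank = jacobian_effective_rank(x, layer_idx=0)
        print(f"step {step:4d}  loss {loss.item():.3f}  "
              f"early-layer Jacobian eff. rank {early_rank:.2f}  NC1 {collapse:.3f}")
```

Comparing when the early-layer geometry proxy stabilizes against when NC1 begins to fall would be one crude first signal about which phenomenon precedes the other; a proper study would need per-layer measurements, multiple seeds and architectures, and interventions rather than correlations alone.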