Generality of phase-evolution results across architectures, datasets, and tasks
Determine the extent to which the reported phase behavior of neural-network–derived multi-layer Ising spin systems persists for networks beyond the studied feed-forward architectures trained on MNIST, i.e., across different architectures, datasets, and tasks. The behavior in question comprises (i) the monotonic, often power-law growth of the melting transition temperature T_c with training, and (ii) the rapid replacement of the spin-glass transition by a hidden-order phase characterized by a single Z2 symmetry-broken TAP solution.
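One concrete step toward testing this generality is to fit the claimed power-law form T_c(t) ≈ a·t^b to melting temperatures measured at successive training checkpoints of a new architecture or dataset, and compare exponents. Below is a minimal sketch of such a fit via least squares in log-log space; the checkpoint epochs and T_c values are hypothetical placeholders, not data from the paper.

```python
import numpy as np

def fit_power_law(epochs, tc_values):
    """Fit T_c(t) ~ a * t^b by linear least squares in log-log space.

    Assumes all epochs and T_c values are positive.
    Returns the estimated (a, b).
    """
    log_t = np.log(np.asarray(epochs, dtype=float))
    log_tc = np.log(np.asarray(tc_values, dtype=float))
    # In log-log space the model is linear: log T_c = b * log t + log a
    b, log_a = np.polyfit(log_t, log_tc, 1)
    return np.exp(log_a), b

# Hypothetical T_c measurements at training checkpoints
# (placeholder numbers consistent with T_c ~ t^0.5).
epochs = [1, 2, 4, 8, 16, 32]
tc = [1.0, 1.41, 2.0, 2.83, 4.0, 5.66]

a, b = fit_power_law(epochs, tc)
print(f"T_c(t) ~ {a:.2f} * t^{b:.2f}")
```

Repeating this fit across architectures and datasets would show whether the exponent b is universal or architecture-dependent, which is one way the open question above could be operationalized.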
References
It also remains to determine how well our results hold for different architectures, datasets, tasks, etc.
                — Neural Networks as Spin Models: From Glass to Hidden Order Through Training
                
                (arXiv:2408.06421, Barney et al., 12 Aug 2024), Section 5 (Conclusion)