Generalize the continuous-depth, large-regularization analysis to fully-connected deep networks

Determine whether the replica analysis developed for continuous graph convolutional networks trained on the contextual stochastic block model, which combines a continuous-depth limit with an expansion around the large-regularization limit to obtain solvable dynamical mean-field-type equations, can be applied to fully-connected large-depth neural networks to yield an analogous asymptotic performance characterization in the high-dimensional limit.

Background

This paper derives a tight asymptotic characterization of the generalization performance of a continuous graph convolutional network (GCN) trained on data from the contextual stochastic block model (CSBM). The analysis hinges on a replica-method formulation, takes a continuous-depth limit (akin to neural ODEs), and employs a large-regularization expansion that leads to solvable dynamical mean-field-type equations.
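For concreteness, the CSBM couples a two-community random graph with correlated Gaussian node features. In a standard parameterization (the notation here is generic and need not match the paper's),

\[
A_{ij} \sim \mathrm{Bernoulli}\!\left(\frac{d + \lambda \sqrt{d}\, u_i u_j}{n}\right),
\qquad
b_i = \sqrt{\frac{\mu}{n}}\, u_i\, v + Z_i,
\]

where $u_i \in \{\pm 1\}$ are the latent community labels, $v \in \mathbb{R}^p$ is a hidden feature direction, $Z_i$ has i.i.d. standard Gaussian entries, and $\lambda$, $\mu$ set the graph and feature signal-to-noise ratios. A continuous-depth GCN then propagates the node features through a depth parameter $t$, schematically $\dot{X}(t) = \tilde{A}\, X(t)\, W(t)$ for a normalized adjacency operator $\tilde{A}$ and depth-indexed weights $W(t)$, in direct analogy with neural ODEs.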

Having demonstrated that this framework can predict the performance of continuous GCNs and approach Bayes optimality in certain regimes, the authors ask whether the same analytical approach could extend to fully-connected large-depth neural networks, where rigorous high-dimensional performance characterizations are notoriously challenging.
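In the fully-connected setting the graph operator drops out, and the natural continuous-depth analogue is a plain neural ODE. As an illustration only (a schematic form, not the paper's setup), the dynamics in question would read

\[
\dot{x}(t) = \varphi\big(W(t)\, x(t)\big), \qquad t \in [0, T],
\]

with input $x(0)$, a readout applied to $x(T)$, depth-indexed weights $W(t)$ trained under a ridge penalty, and the analysis again organized as an expansion around the large-regularization limit. The open question is whether the resulting replica/dynamical mean-field equations still close in the high-dimensional limit when $W(t)$ is dense and unstructured rather than tied to a fixed graph operator such as $\tilde{A}$.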

References

It is an interesting question for future work whether this approach could allow the study of fully-connected large-depth neural networks.