Implicit Regularization with Polynomial Growth in Deep Tensor Factorization (2207.08942v2)
Published 18 Jul 2022 in cs.LG, cs.AI, cs.NE, and stat.ML
Abstract: We study the implicit regularization effects of deep learning in tensor factorization. While implicit regularization in deep matrix and 'shallow' tensor factorization via linear and certain types of non-linear neural networks promotes low-rank solutions with at most quadratic growth, we show that its effect in deep tensor factorization grows polynomially with the depth of the network. This provides a remarkably faithful description of the observed experimental behaviour. Using numerical experiments, we demonstrate the benefits of this implicit regularization in yielding more accurate estimates and better convergence properties.
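To make the setting concrete, below is a minimal sketch of deep tensor factorization for tensor completion: a CP-style factorization in which each factor matrix is itself a product of `depth` matrices, trained by plain gradient descent on the observed entries. The parameterization, rank, depth, initialization scale, and other hyperparameters here are illustrative assumptions, not the paper's exact experimental setup.

```python
# Sketch (not the paper's exact setup): completing a 3-way tensor from partial
# observations with a depth-L overparameterized CP factorization. The "deep"
# part: each CP factor matrix is a product of L matrices. All names and
# hyperparameters are illustrative assumptions.
import torch

torch.manual_seed(0)
n, r_true, r_fit, depth = 20, 2, 10, 3  # tensor size, true rank, fitted rank, depth L

# Ground-truth low-rank tensor (CP rank r_true) and a random observation mask.
A, B, C = (torch.randn(n, r_true) for _ in range(3))
T_true = torch.einsum('ir,jr,kr->ijk', A, B, C)
mask = (torch.rand(n, n, n) < 0.3).float()

def deep_factor(n, r, depth):
    """One CP factor parameterized as a product of `depth` matrices (small init)."""
    layers = [(torch.randn(n, n) * 1e-1).requires_grad_() for _ in range(depth - 1)]
    layers.append((torch.randn(n, r) * 1e-1).requires_grad_())
    return layers

def collapse(layers):
    """Multiply the layer matrices into a single n x r factor."""
    out = layers[0]
    for M in layers[1:]:
        out = out @ M
    return out

factors = [deep_factor(n, r_fit, depth) for _ in range(3)]
params = [M for f in factors for M in f]
opt = torch.optim.SGD(params, lr=1e-2)

for step in range(5000):
    U, V, W = (collapse(f) for f in factors)
    T_hat = torch.einsum('ir,jr,kr->ijk', U, V, W)
    # Fit only the observed entries; no explicit rank penalty anywhere.
    loss = ((mask * (T_hat - T_true)) ** 2).sum() / mask.sum()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Error on *unobserved* entries: with small initialization, gradient descent on
# the deep parameterization tends toward low-rank solutions, i.e. the implicit
# regularization effect studied in the paper.
test_err = (((1 - mask) * (T_hat - T_true)) ** 2).sum() / (1 - mask).sum()
print(f"unobserved-entry MSE: {test_err.item():.4f}")
```

In this sketch, increasing `depth` changes only the parameterization, not the loss, so any improvement in recovery of the unobserved entries comes from the training dynamics rather than an explicit penalty, which is the phenomenon the abstract refers to.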