Gradient flow in the kernel learning problem
Abstract: This is a sequel to our paper `On the kernel learning problem'. We identify a canonical choice of Riemannian gradient flow for finding the stationary points of the kernel learning problem. In the presence of Gaussian noise variables, this flow enjoys the remarkable property of admitting a continuous family of Lyapunov functionals, which can be interpreted as the automatic reduction of noise. PS. We include an extensive discussion in the postscript explaining the comparison with 2-layer neural networks. Readers looking for additional motivation are encouraged to read the postscript immediately following the introduction.