What are Neural Networks made of? (1909.09588v1)
Published 25 Aug 2019 in cs.NE, cs.AI, and cs.LG
Abstract: The success of Deep Learning methods is not well understood, though various attempts at explaining it have been made, typically centered on properties of stochastic gradient descent. Even less clear is why certain neural network architectures perform better than others. We provide a potential opening with the hypothesis that neural network training is a form of Genetic Programming.
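The abstract contrasts the usual gradient-descent view of training with a Genetic Programming view. As a loose illustration of what a genetic-programming-style search loop looks like (a minimal sketch, not the paper's construction; the toy target y = 2x + 1, the population size, and the mutation scheme are all assumptions made for this example), here is a small Python program that evolves arithmetic expressions by selection and mutation rather than by gradient updates:

```python
# Minimal genetic-programming-style search (illustrative only; not the paper's method).
# Evolves small arithmetic expression trees to fit the toy target y = 2x + 1,
# using selection and mutation instead of gradient-descent weight updates.
import random

random.seed(0)

OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b, "*": lambda a, b: a * b}
TERMINALS = ["x", 1.0, 2.0, 3.0]
DATA = [(x, 2 * x + 1) for x in range(-5, 6)]  # toy dataset (assumed target)


def random_tree(depth=2):
    """Build a random expression: nested (op, left, right) tuples or a terminal."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))


def evaluate(tree, x):
    """Evaluate an expression tree at input x."""
    if tree == "x":
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))


def fitness(tree):
    """Mean squared error on the toy data (lower is better)."""
    return sum((evaluate(tree, x) - y) ** 2 for x, y in DATA) / len(DATA)


def mutate(tree):
    """Replace a randomly chosen subtree with a fresh random tree."""
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return random_tree(depth=2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))


# Evolution loop: keep the fittest expressions, refill the population with mutants.
population = [random_tree() for _ in range(50)]
for generation in range(30):
    population.sort(key=fitness)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

best = min(population, key=fitness)
print("best expression:", best, "MSE:", fitness(best))
```

The point of the sketch is only the contrast in mechanism: fitness-based selection over a population of programs, versus repeated gradient steps on a fixed architecture's weights.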