Self-Replicating Neural Programs
Published 27 Sep 2021 in cs.NE and cs.LG | arXiv:2109.12786v2
Abstract: In this work, a neural network is trained to replicate the code that trains it, using only its own output as input. A paradigm for evolutionary self-replication in neural programs is introduced: program parameters are mutated, and a program's ability to train itself more efficiently leads to greater reproductive success. This evolutionary paradigm is demonstrated to produce more efficient learning in organisms without any explicit guidance, driven solely by natural selection favoring organisms that reach reproductive maturity faster.
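The selection dynamic described in the abstract can be sketched as a toy evolutionary loop. Everything below is an illustrative assumption, not the paper's actual neural self-replication setup: organisms are plain parameter vectors, and "self-training speed" is modeled by a hypothetical `steps_to_maturity` function rather than a network that reproduces its own training code.

```python
import random

MATURITY = 1.0  # arbitrary maturity threshold for this toy model

def steps_to_maturity(params, max_steps=100):
    """Toy stand-in for self-training: a larger mean parameter
    means faster progress toward maturity (fewer steps = fitter)."""
    rate = max(sum(params) / len(params), 1e-3)
    progress, steps = 0.0, 0
    while progress < MATURITY and steps < max_steps:
        progress += rate * 0.1
        steps += 1
    return steps

def mutate(params, sigma=0.05):
    """Perturb each parameter with Gaussian noise."""
    return [p + random.gauss(0, sigma) for p in params]

def evolve(pop_size=20, generations=30, dim=8, seed=0):
    """Selection alone: organisms that mature sooner replicate;
    slower ones are replaced by mutated copies of the survivors."""
    random.seed(seed)
    population = [[random.uniform(0, 0.5) for _ in range(dim)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=steps_to_maturity)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(p) for p in survivors]
    return min(steps_to_maturity(p) for p in population)
```

Running `evolve()` shows the core claim in miniature: with no explicit objective beyond "faster reproductive maturity replicates more", the best time-to-maturity in the population drops over generations.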