Learning and Unlearning: Bridging classification, memory and generative modeling in Recurrent Neural Networks (2410.05886v1)

Published 8 Oct 2024 in cond-mat.dis-nn and physics.bio-ph

Abstract: The human brain is a complex system that has fascinated scientists for a long time. Its remarkable capabilities include the categorization of concepts, the retrieval of memories and the creative generation of new examples. Modern artificial neural networks are trained on large amounts of data to accomplish these same tasks with a considerable degree of precision; in contrast with biological systems, however, machines appear to be significantly slower and more energetically expensive to train, suggesting the need for a paradigmatic change in the way they learn. Here we review a general learning prescription that enables classification, memorization and generation of new examples in bio-inspired artificial neural networks. The training procedure can be split into an initial Hebbian learning phase and a subsequent anti-Hebbian one (usually referred to as Unlearning). Separating training into these two epochs allows the algorithm to be fully unsupervised while partially aligning with some modern biological theories of learning.
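The two-phase procedure described in the abstract is close in spirit to the classic Hopfield-style "unlearning" prescription. The sketch below is a minimal, hypothetical illustration rather than the paper's exact algorithm: Hebbian couplings are first built from stored patterns, and an anti-Hebbian correction is then applied using configurations reached by the network's own dynamics. The network size, pattern count, learning rate `eps`, and number of unlearning steps are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hebbian phase: store P random binary patterns in a symmetric coupling matrix.
N, P = 100, 10                              # network size and number of patterns (illustrative)
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N               # standard Hebbian (Hopfield-like) couplings
np.fill_diagonal(J, 0)

def relax(state, J, sweeps=20):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            h = J[i] @ s
            s[i] = 1 if h >= 0 else -1
    return s

# Anti-Hebbian (unlearning) phase: relax from random initial states and subtract
# a small Hebbian term built from the configuration the network settles into.
eps, unlearning_steps = 0.01, 200           # hypothetical hyperparameters
for _ in range(unlearning_steps):
    s = relax(rng.choice([-1, 1], size=N), J)
    J -= eps * np.outer(s, s) / N
    np.fill_diagonal(J, 0)

# Check retrieval: overlap between a stored pattern and the state reached from it.
print(np.mean(relax(patterns[0], J) == patterns[0]))
```

In this classic setting, the unlearning step tends to suppress spurious attractors and enlarge the basins of the stored patterns; the paper reviews how a prescription of this kind can be extended to classification and generative tasks.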
